The fourth MEDIATE workshop will be held on June 5 as part of the International AAAI Conference on Web and Social Media (ICWSM). The main goal of the workshop is to bring together media practitioners and technologists to discuss the new opportunities and obstacles that arise in the modern era of information diffusion. This year's theme is Misinformation: automated journalism, explainable and multi-modal verification, and content moderation. You can also check the programs and recorded talks of the 2020, 2021, and 2022 editions of MEDIATE.
Online discourse and discussion about the COVID-19 vaccines contained a great deal of both fact-based information and misinformation. Using a sample of nearly four million geotagged English tweets and data from the CDC COVID Data Tracker, we conducted Fama-MacBeth regressions with the Newey-West adjustment to understand the influence of both misinformation and fact-based news on Twitter on COVID-19 vaccine uptake in the U.S. from April 2021, when all U.S. adults became vaccine-eligible, to June 2021, after controlling for state-level factors such as demographics, education, and pandemic severity. The negative association between the percentage of fact-related users and the vaccination rate might be due to a combination of a larger user-level influence and the negative impact of online social endorsement on vaccination intent.
With the increase in the scale and diffusion of online misinformation, efforts to develop scalable technological systems for fact-checking online information have also increased. However, such systems are limited in practice because their design often does not take into account how fact-checking is done in the real world, and they ignore the insights and needs of the stakeholder groups core to the fact-checking process. In this talk, I will unpack the fact-checking process by revealing the human and technological infrastructures that support and shape fact-checking work: the primary stakeholders involved, the collaborative effort among them, the associated technological and informational infrastructures, and the key social and technical needs and challenges faced by each stakeholder group. Finally, I will close by previewing a system (YouCred) that was designed and built through 1.5 years of collaboration with key stakeholders of Africa's largest indigenous fact-checking organization (Pesacheck) to assist fact-checkers with misinformation discovery and credibility assessment on one of the largest video search platforms, YouTube.
Misinformation, hate speech, and polarization are not limited to large online platforms. The rise of alternative websites and community apps means that a growing number of users encounter the same issues found on the so-called Big Tech platforms. Fortunately, a range of tools is available, including AI flagging systems, shared resources, and free APIs, that can be used effectively to protect online platforms. In this presentation, we will shed light on the challenges and opportunities presented by the growing number of online platforms. We will also delve into best practices, identify gaps, and explore ongoing initiatives aimed at enabling small and medium-sized organizations to keep their users safe, focusing specifically on misinformation, hate speech, and polarization.
Topics of interest include, but are not limited to:
We invite submissions of technical papers and talk proposals:
Important Dates (all deadlines are 23:59, AoE):