7 datasets found
  1. Can professional fact-checkers' techniques advance users' understanding of...

    • osf.io
    Updated Apr 3, 2024
    Cite
    Folco Panizza; Piero Ronzani; Carlo Martini; Simone Mattavelli; Tiffany Morisseau (2024). Can professional fact-checkers' techniques advance users' understanding of scientific content on social media? [Dataset]. http://doi.org/10.17605/OSF.IO/GSU9J
    Explore at:
    Available download formats
    Dataset updated
    Apr 3, 2024
    Dataset provided by
    Center for Open Science (https://cos.io/)
    Authors
    Folco Panizza; Piero Ronzani; Carlo Martini; Simone Mattavelli; Tiffany Morisseau
    License

    Attribution 4.0 International (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The fight against misinformation and disinformation has developed substantially over the last decade, from the evolution of debiasing techniques (Lewandowsky, Ecker, Seifert, Schwarz & Cook, 2012) to the implementation of preventive interventions such as inoculation (McGuire, 1964; Roozenbeek, van der Linden, & Nygren, 2020) and news literacy (Vraga & Bode, 2020; Guess et al., 2020; Lutzke et al., 2019).

    At the same time, attention has shifted markedly towards the dissemination of false information on social media, resulting in the adaptation of these interventions to these rapidly changing environments. Examples of social media interventions include the use of warnings or flags for unverified or debunked information (Clayton et al., 2019), the wisdom of crowds (Allen, Arechar, Pennycook & Rand, 2020) and attention priming (Pennycook, Epstein, Mosleh, Arechar, Eckes & Rand, 2019).

    Recently, a new type of media literacy intervention, civic online reasoning (Wineburg & McGrew, 2017), has proven very effective in countering disinformation among high school and college students (McGrew, Breakstone, Ortega, Smith & Wineburg, 2018; McGrew, Smith, Breakstone, Ortega & Wineburg, 2019), as well as elderly citizens (Moore & Hancock, 2021).

    This intervention, based on learning professional fact-checking techniques, has the advantage that it can be used when misinformation is deceptively sophisticated or difficult to detect. Despite extensive study of civic online reasoning offline, there has been little application of these techniques on social media.

    The present study aims to adapt to social media some of the strategies of civic online reasoning, specifically lateral reading (leaving a website and opening new tabs along a horizontal axis in order to use the resources of the Internet to learn more about a site and its claims) and click restraint (skipping the first results of a browser search). We test these strategies against scientific misinformation, which is a perfect testing ground for online interventions, as scientific content is often difficult for laypeople to access.

    As a complementary hypothesis, we test whether evaluation accuracy can be increased by giving participants monetary rewards for correct answers. This incentivized condition will serve as a benchmark for the fact-checking techniques. In addition, the incentives will allow us to test whether motivation and attention, triggered by the presence of an incentive, are sufficient to increase accurate responses.

    Participants are shown an interactive Facebook post, which links to an article presenting information related to science, and are asked to rate how scientifically valid the content of the post is.

    The scientific validity of the posts is determined through a predefined procedure based on a series of quality checks, such as whether the scientific claims have been peer-reviewed, whether the authors making the claim are competent on the specific topic, and whether the media reporting is faithful to the original article.

    The post will present a title in the form of a scientific claim, and a caption from the article that elaborates on that claim. Participants are randomly assigned to one of three experimental conditions. In the control condition, participants observe and interact with the Facebook post, and are asked to rate the scientific validity of the post on a scale from 1 to 6. The monetary incentive condition is identical to the control condition, with the difference that participants are paid a fixed sum if their answer is correct. Finally, the pop-up condition is identical to the control condition, with the difference that before reading the post participants also observe a pop-up suggesting civic online reasoning strategies such as lateral reading and click restraint. At the end of the experiment, after the participants have completed a series of control questionnaires, we debrief them on the scientific validity of the post.

    In summary, the study tests whether fact-checking techniques, as opposed to monetary incentives, are effective tools for understanding the scientific validity of social media content.

  2. Data from: The triangle of polarization, political trust and political...

    • produccioncientifica.usal.es
    Updated 2022
    Cite
    Bonsfills, Josep Maria Comellas; Bosch, Oriol J.; Loriente, Mariano Torcal; Carty, Emily Bickle (2022). The triangle of polarization, political trust and political communication: understanding its dynamics in contemporary democracies (TRI-POL) [Dataset]. https://produccioncientifica.usal.es/documentos/67321c70aea56d4af0483452
    Explore at:
    Dataset updated
    2022
    Authors
    Bonsfills, Josep Maria Comellas; Bosch, Oriol J.; Loriente, Mariano Torcal; Carty, Emily Bickle
    Description

    This project is centred on the triangle of interactive relationships between citizens' affective political and social polarization, citizens' political distrust in the main institutions and actors of political representation, and the politics of party/elite competition, which together produce problematic dynamics and consequences for the quality, functioning, and even potential survival of contemporary liberal democracies. This study contains an innovative and comprehensive investigation of concepts such as political trust, affective polarization and the politics of party competition, and of the dynamics and interactions among them in Spain, Portugal, Italy and two Latin American countries (Chile and Argentina). For this goal we compiled new theoretical arguments building on recent methodological innovations on this topic from leading research in the US, and put together a team of country experts from disciplines such as political science, public opinion, political psychology, survey methodology and political communication, as well as data engineers and big data experts. An additional value of this project is the use of a multi-method approach that combines the design and implementation of an original three-wave online panel survey, featuring innovative survey questions together with embedded experiments. Furthermore, we want to match the preceding individual-level data with information collected with a passive tracking application (a passive meter), which captures real individual behaviors and exposure to information received via mass electronic and social media. Finally, we use techniques of computer-aided text analysis (CATA) to analyse the sources of information to which respondents have been exposed.

  3. Methods for verifying information U.S. 2022, by platform

    • statista.com
    Updated Feb 1, 2024
    Cite
    Statista (2024). Methods for verifying information U.S. 2022, by platform [Dataset]. https://www.statista.com/statistics/1406264/ways-of-fact-checking-news-us/
    Explore at:
    Dataset updated
    Feb 1, 2024
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Jun 2022
    Area covered
    United States
    Description

    The results of a survey held in summer 2022 found that the most common way to verify news was using Google or another search engine. Fact-checking by reading further on the topic or finding information from experts were also popular strategies, especially for news found on social media.

  4. Positive social media discoveries made by online recruiters 2018

    • statista.com
    Updated Feb 29, 2024
    Cite
    Positive social media discoveries made by online recruiters 2018 [Dataset]. https://www.statista.com/topics/2727/online-recruiting/
    Explore at:
    Dataset updated
    Feb 29, 2024
    Dataset provided by
    Statista (http://statista.com/)
    Authors
    Raphael Bohne
    Description

    This statistic gives information on social media discoveries that led to hiring professionals extending an offer to a job candidate. During a survey in May 2018, it was found that 33 percent of respondents who researched candidates on social media extended an offer after finding the candidate's social media profile conveyed a professional image.

  5. Trust in information provided by media channels in France 2015, by social...

    • statista.com
    Updated Feb 23, 2015
    Cite
    Trust in information provided by media channels in France 2015, by social category [Dataset]. https://www.statista.com/statistics/421103/public-trust-in-information-channels-france-socioprofessional-category/
    Explore at:
    Dataset updated
    Feb 23, 2015
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Feb 18, 2015 - Feb 19, 2015
    Area covered
    France
    Description

    This statistic illustrates differences of opinion concerning the credibility of information provided by various sources in France in 2015, depending on respondents' socio-professional category. It reveals that people from upper socio-professional categories trusted information provided by television less than people from lower social categories (a difference of eleven points).

  6. Main sources for information about electric vehicles Malaysia 2021

    • statista.com
    Updated Oct 5, 2022
    Cite
    Statista (2022). Main sources for information about electric vehicles Malaysia 2021 [Dataset]. https://www.statista.com/statistics/1311713/malaysia-sources-for-information-about-electric-vehicles/
    Explore at:
    Dataset updated
    Oct 5, 2022
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Jul 26, 2021 - Aug 18, 2021
    Area covered
    Malaysia
    Description

    According to a survey by Vodus on electric vehicle ownership in Malaysia, social media was named the most important source of information about electric vehicles by 39 percent of respondents. Car magazines and expert opinion were the fourth and fifth most important sources among those looking for information about electric vehicles.

  7. Number of professional and amateur content creators worldwide in 2020, by...

    • statista.com
    Updated Feb 21, 2022
    Cite
    Statista (2022). Number of professional and amateur content creators worldwide in 2020, by platform [Dataset]. https://www.statista.com/statistics/1268168/number-of-content-creators-worldwide-by-platform/
    Explore at:
    Dataset updated
    Feb 21, 2022
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    2020
    Area covered
    Worldwide
    Description

    Publishing and monetizing original content on social platforms has become easier than ever. In 2020, the vast majority of content creators worldwide were amateurs who earned money from making content part-time. Meanwhile, approximately 500 thousand professionals earned their living entirely from publishing content on Instagram. Instagram was by far the most popular outlet among influencers, with 30 million amateur creators monetizing content on the Facebook-owned platform that year.

