Reading Between the Lies: A Classification Scheme of Types of Reply to Misinformation in Public Discussion Threads

Published: 14 March 2022

Abstract

Online misinformation is a fiendish problem. Demonstrably false information propagates faster and more widely than truth, and this has heralded a technological arms race. One possible mechanism for addressing misinformation is social: there is evidence that seeing misinformation challenged can ‘inoculate’ a reader against it. To date, no research has examined how discussions sparked by misinformation play out: What are the different ways in which people reply to posts containing misinformation? How does the discussion flow in each case? Are there differences between platforms? We address these questions through an inductive qualitative analysis of discussion threads on three types of public discussion platform (Twitter, YouTube, and two news sites) and on three topics (COVID, Brexit, and climate change). We present a classification scheme of types of reply to misinformation, and show that reply patterns differ between platforms. Knowing how people reply to posts that contain misinformation enriches our knowledge of ‘human misinformation interaction,’ and suggests how socio-technical factors in platform design can reduce the risk of misinformation spreading.
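The abstract describes coding replies to misinformation and comparing how reply types are distributed across platforms. As a purely illustrative sketch of that kind of cross-platform comparison (the reply-type labels, platform names, and data below are hypothetical, not the paper's actual classification scheme), tallying coded replies into per-platform proportions could look like this:

```python
from collections import Counter

# Hypothetical coded replies as (platform, reply_type) pairs.
# Labels are illustrative only; the paper's scheme may differ.
coded_replies = [
    ("twitter", "refutation"), ("twitter", "agreement"),
    ("twitter", "refutation"), ("youtube", "insult"),
    ("youtube", "agreement"), ("news_site", "refutation"),
]

def reply_type_distribution(replies):
    """Tally reply types per platform, then normalise to proportions."""
    counts = {}
    for platform, reply_type in replies:
        counts.setdefault(platform, Counter())[reply_type] += 1
    dists = {}
    for platform, counter in counts.items():
        total = sum(counter.values())
        dists[platform] = {t: n / total for t, n in counter.items()}
    return dists

dist = reply_type_distribution(coded_replies)
print(dist["twitter"]["refutation"])  # 2 of the 3 Twitter replies are refutations
```

Comparing these per-platform distributions is one simple way to make the paper's observation "replies show different patterns between platforms" concrete and quantifiable.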


Cited By

  • (2025) “Current Trends in Information Behavior Research: Expanding Beyond Search, Seeking, Finding and Behavior.” Encyclopedia of Libraries, Librarianship, and Information Science, 493-500. DOI: 10.1016/B978-0-323-95689-5.00172-3. Online publication date: 2025.
  • (2024) “Identifying textual disinformation using Large Language Models.” Proceedings of the 2024 Conference on Human Information Interaction and Retrieval, 453-456. DOI: 10.1145/3627508.3638315. Online publication date: 10 March 2024.

        Published In

        CHIIR '22: Proceedings of the 2022 Conference on Human Information Interaction and Retrieval
        March 2022, 399 pages
        ISBN: 9781450391863
        DOI: 10.1145/3498366
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Author Tags

        1. Misinformation
        2. echo chambers
        3. information sharing

        Qualifiers

        • Research-article
        • Research
        • Refereed limited

        Conference

        CHIIR '22

        Acceptance Rates

        Overall Acceptance Rate 55 of 163 submissions, 34%


        Article Metrics

        • Downloads (last 12 months): 43
        • Downloads (last 6 weeks): 4
        Reflects downloads up to 23 October 2024.

