
Disinformation: Fall 2022 Event

Reports, academic articles, and case studies

Disinformation Event

Panel

  • Adam Berinsky, Mitsui Professor of Political Science and Director of the MIT Political Experiments Research Lab
  • David Karger, Professor of Computer Science and member of the MIT Computer Science and Artificial Intelligence Laboratory
  • David Rand, Erwin H. Schell Professor and Professor of Management Science and Brain and Cognitive Sciences at MIT
  • Moderator: Alexia Hudson-Ward, Associate Director of Research and Learning, MIT Libraries

Video

Selected work by panelists

Adam Berinsky

Adam Berinsky Google Scholar profile

Selected work: 

Brashier, N. M., Pennycook, G., Berinsky, A. J., & Rand, D. G. (2021). Timing matters when correcting fake news. Proceedings of the National Academy of Sciences, 118(5), e2020043118. https://www.pnas.org/doi/pdf/10.1073/pnas.2020043118

DeVerna, M. R., Guess, A. M., Berinsky, A. J., Tucker, J. A., & Jost, J. T. (2022). Rumors in Retweets: Ideological Asymmetry in the Failure to Correct Misinformation. Personality and Social Psychology Bulletin. https://journals.sagepub.com/doi/full/10.1177/01461672221114222

Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S. A., Sunstein, C. R., Thorson, E. A., Watts, D. J., & Zittrain, J. L. (2018). The science of fake news. Science, 359(6380), 1094–1096. https://scholar.harvard.edu/files/mbaum/files/science_of_fake_news.pdf

Swire, B., Berinsky, A. J., Lewandowsky, S., & Ecker, U. K. H. (2017). Processing political misinformation: Comprehending the Trump phenomenon. Royal Society Open Science, 4(3), 160802. https://royalsocietypublishing.org/doi/full/10.1098/rsos.160802

Swire-Thompson, B., Ecker, U. K., Lewandowsky, S., & Berinsky, A. J. (2020). They might be a liar but they’re my liar: Source evaluation and the prevalence of misinformation. Political Psychology, 41(1), 21–34. https://onlinelibrary.wiley.com/doi/pdf/10.1111/pops.12586

Wittenberg, C., & Berinsky, A. J. (2020). Misinformation and Its Correction. In J. A. Tucker & N. Persily (Eds.), Social Media and Democracy: The State of the Field, Prospects for Reform (pp. 163–198). Cambridge University Press. https://www.cambridge.org/core/books/social-media-and-democracy/misinformation-and-its-correction/61FA7FD743784A723BA234533012E810

VIDEO: 

Stanford Cyber Policy Center. (2020, July 29). Exploring Potential “Solutions” to Online Disinformation. https://www.youtube.com/watch?v=Rptg9wuOGkQ 

Adam Berinsky: Political Misinformation in the Modern Day. (2019, February 15). https://shorensteincenter.org/adam-berinsky-political-misinformation-modern-day/

David Karger

David Karger Google Scholar profile

Selected work: 

Jahanbakhsh, F. (2022). Our Browser Extension Lets Readers Change the Headlines on News Articles, and You Won’t Believe What They Did! Proceedings of the ACM on Human-Computer Interaction, 6, Article 33. https://people.csail.mit.edu/farnazj/pdfs/Study_of_News_Headlines__CSCW_22.pdf

Karger, D. (2022, July 27). Special Talk: Empowering Participants with Richer, more Expressive Online Discussion Platforms. https://www.cs.cmu.edu/calendar/161490056  (Note: No video or transcript, just short description of talk)

Rainie, L., Anderson, J., & Albright, J. (2017, March 29). The Future of Free Speech, Trolls, Anonymity and Fake News Online. Pew Research Center: Internet, Science & Tech. https://www.pewresearch.org/internet/2017/03/29/the-future-of-free-speech-trolls-anonymity-and-fake-news-online/

Wang, J. Z., Zhang, A. X., & Karger, D. R. (2022). Designing for Engaging with News using Moral Framing towards Bridging Ideological Divides. Proceedings of the ACM on Human-Computer Interaction, 6(GROUP), 1–23. https://dl.acm.org/doi/pdf/10.1145/3492861 

Zhang, A. X., Ranganathan, A., Metz, S. E., Appling, S., Sehat, C. M., Gilmore, N., Adams, N. B., Vincent, E., Lee, J., & Robbins, M. (2018). A structured response to misinformation: Defining and annotating credibility indicators in news articles. Companion Proceedings of The Web Conference 2018, 603–612. https://assets.ctfassets.net/tlowcqj4pb76/4lmUdUz36gQuOKO0UOwEU2/819497f46f25a9cfcbaa7d4b5db8e354/CredCoWebConf2018.pdf

VIDEO:

DIMACS CCICADA (Director). (2022, May 27). David Karger: Empowering End Users to Make Choices on Harassment, Misinformation, Free Expression... https://www.youtube.com/watch?v=Oy_URr-EaZE

David Rand

David Rand Google Scholar profile

Selected work: 

Arechar, A. A., Allen, J. N. L., Berinsky, A., Cole, R., Epstein, Z., Garimella, K., Gully, A., Lu, J. G., Ross, R. M., Stagnaro, M., Zhang, J., Pennycook, G., & Rand, D. (2022). Understanding and Reducing Online Misinformation Across 16 Countries on Six Continents. PsyArXiv. https://doi.org/10.31234/osf.io/a9frz 

Bago, B., Rand, D. G., & Pennycook, G. (2020). Fake news, fast and slow: Deliberation reduces belief in false (but not true) news headlines. Journal of Experimental Psychology: General, 149(8), 1608–1613. https://doi.org/10.1037/xge0000729

Bronstein, M. V., Pennycook, G., Bear, A., Rand, D. G., & Cannon, T. D. (2019). Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism, and reduced analytic thinking. Journal of Applied Research in Memory and Cognition, 8(1), 108–117. https://dspace.mit.edu/bitstream/handle/1721.1/135512/SSRN-id3172140.pdf?sequence=2&isAllowed=y 

Epstein, Z., Pennycook, G., & Rand, D. (2020). Will the Crowd Game the Algorithm? Using Layperson Judgments to Combat Misinformation on Social Media by Downranking Distrusted Sources. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–11. https://doi.org/10.1145/3313831.3376232

Lin, H., Pennycook, G., & Rand, D. (2022). Thinking more or thinking differently? Using drift-diffusion modeling to illuminate why accuracy prompts decrease misinformation sharing. PsyArXiv. https://psyarxiv.com/kf8md/ 

Mosleh, M., Martel, C., Eckles, D., & Rand, D. (2021). Perverse Downstream Consequences of Debunking: Being Corrected by Another User for Posting False Political News Increases Subsequent Sharing of Low Quality, Partisan, and Toxic Content in a Twitter Field Experiment. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–13. https://doi.org/10.1145/3411764.3445642

Mosleh, M., & Rand, D. (2021). Falsehood in, falsehood out: A tool for measuring exposure to elite misinformation on Twitter. https://psyarxiv.com/ye3pf/download?format=pdf

Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2021). Shifting attention to accuracy can reduce misinformation online. Nature, 592(7855), 590–595. https://doi.org/10.1038/s41586-021-03344-2

Sirlin, N., Epstein, Z., Arechar, A. A., & Rand, D. G. (2021). Digital literacy is associated with more discerning accuracy judgments but not sharing intentions. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-83

Stewart, A. J., Arechar, A. A., Rand, D. G., & Plotkin, J. B. (2021). The coercive logic of fake news. arXiv. http://arxiv.org/abs/2108.13687

PODCAST:

The Human Error Behind Fake News with David Rand. (2021). Retrieved September 23, 2022, from https://thedecisionlab.com/podcasts/the-human-error-behind-fake-news-with-david-rand