The Social Dilemma: Making Sense of the World

Center for Humane Technology

The readings below are from the Center for Humane Technology.

These selected readings have been identified to further our understanding of the "Ledger of Harms" that can result from human interaction with technology and social media, and the impact on: How We Make Sense of the World, our Social Relationships, Physical and Mental Health, Politics and Government, Systemic Oppression, Attention and Cognition, Future Generations, and How We Treat One Another.

Making Sense of the World

Misinformation, conspiracy theories, and fake news

Why It Matters

A broken information ecology undermines our ability to understand and act on complex global challenges, from climate change to COVID-19.

Evidence

64%

of all extremist group joins are due to our recommendation tools... our recommendation systems grow the problem,” noted an internal Facebook presentation in 2016. Yet repeated attempts to counteract this have been ignored, diluted, or deliberately shut down by senior Facebook officers, according to a 2020 Wall Street Journal investigation. In 2018, Facebook managers told employees the company’s priorities were shifting “away from societal good to individual value.”

Horwitz, J., & Seetharaman, D., 2020. Wall Street Journal

6X faster

Fake news spreads six times faster than true news. According to researchers, this is because fake news grabs our attention more than authentic information: fake news items usually have higher emotional content and contain unexpected information, which means they are shared and reposted more often.

PEER-REVIEWED · Vosoughi, S., Roy, D., & Aral, S., 2018. Science

Reading a fake news item even once increases the chances of a reader judging it to be true when they next encounter it, even when the item has been labeled as suspect by fact-checkers or runs counter to the reader’s own political standpoint. The damage done by fake news items in the past continues to reverberate today. Psychological mechanisms such as these, twinned with the speed at which fake news travels, highlight our vulnerability, demonstrating how easily we can be manipulated by anyone planting fake news or using bots to spread their own viewpoints.

PEER-REVIEWED · Pennycook, G., Cannon, T., & Rand, D. G., 2018. Journal of Experimental Psychology

45%

of tweets about the coronavirus are from bots spreading false information, according to research from Carnegie Mellon University. An analysis of more than 200 million tweets posted since January 2020 identified more than 100 false narratives, including conspiracy theories that hospitals are full of mannequins. Researchers note that these posts appear to be aimed at sowing division within America, commenting, “We do know that it looks like a propaganda machine.”

JOURNALISM · Allyn, B., 2020. National Public Radio

As the pandemic has developed, there has been a significant increase in the posting of fake news and false information, even among human users, due to the algorithms underlying social media platforms. Researchers note that people naturally repost messages on the basis of their popularity rather than their accuracy. Fact-checking has been unable to keep pace. Such false information is particularly dangerous because, as noted above, it tends to be retained for a long time, irrespective of fact correction.

JOURNALISM · Schalit, N., 2020. The Conversation

The primary driving force behind whether someone will share a piece of information is not its accuracy or even its content; the main reason we share a post is that it comes from a friend or a celebrity with whom we want to be associated. As humans, we’re often more concerned with status, popularity, and establishing a trusted circle of “friends” than with maintaining the truth. As a result, social media spaces will inevitably be places where the truth is easily downgraded.

PEER-REVIEWED · Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., ... Quattrociocchi, W., 2016. PNAS

2 minutes

of exposure to a conspiracy theory video reduces people’s pro-social attitudes (such as their willingness to help others), as well as reducing their belief in established scientific facts.

PEER-REVIEWED · van der Linden, S., 2015. Personality and Individual Differences

Anger is the emotion that travels fastest and farthest on social media. As a result, those who post angry messages will inevitably have the greatest influence, and social media platforms will tend to be dominated by anger.

PEER-REVIEWED · Fan, R., 2014. PLoS ONE

17%

Each word of moral outrage added to a tweet increases the rate of retweets by 17%. It takes very little effort to tip the emotional balance within social media spaces, catalyzing and accelerating further polarization.

PEER-REVIEWED · Brady, W. J., Wills, J. A., Jost, J. T., Tucker, J. A., & Van Bavel, J. J., 2017. PNAS

Analysis indicates that bots wield a disproportionate influence, dominating social media platforms such as Twitter. An estimated 66% of tweeted links to popular websites are posted by bots, a figure that climbs to 89% for popular news sites. Bots also out-post human users: in this study, the 500 most active bots were responsible for 22% of tweeted links, compared with only 6% for the 500 most active human users. As a result, those who create bots can manipulate and artificially tilt the balance of shared social spaces.

PRIVATE STUDY · Pew Research Center, 2018

Twitter now plays a key role in how journalists find news. According to a recent survey, many journalists see tweets as just as newsworthy as headlines from the Associated Press. As a result, the neutrality of the press can easily be undermined: on the one hand, professional journalists can be manipulated by bots and bad-faith actors; on the other, the chance of radical content, conspiracies, and other types of disinformation appearing in professional news articles is extremely high.

PEER-REVIEWED · McGregor, S. C., & Molyneux, L., 2018. Journalism

An Oxford research study of 22 million tweets showed that Twitter users shared more “misinformation, polarizing, and conspiratorial content” than actual news stories.

PEER-REVIEWED · Howard, P., et al., 2017. Data Memo

Analysis indicates that foreign governments place and promote misinformation stories on multiple social media channels, creating the illusion of known truth emerging from diverse "independent" sources.

CONFERENCE PROCEEDINGS · Starbird, Arif, Wilson, Van Koevering, Yefimova, and Scarnecchia, 2018. Association for the Advancement of Artificial Intelligence Publications

Using a wide range of deceptive techniques, malicious actors of all types use social media to rapidly advance their agendas. They have developed sophisticated media-manipulation strategies, including hijacking existing memes and seeding false narratives widely. Manuals to help journalists and other media professionals defend against these strategies naturally lag far behind and are only just starting to be developed in civil society.

PRIVATE STUDY · Phillips, 2018. Data & Society

Fake news items contain more anger than real news posts. According to research conducted with more than 1,000 active users of China's Weibo platform, angry posts generate more anxiety, which in turn motivates readers to share them further. Analyzing over 30,000 posts on Weibo, the researchers found that fake news posts contained 17% fewer "joy" words but 6% more "anger" words than real news posts. They found similar trends in an analysis of 40,000 posts on Twitter.

JOURNALISM · Lu, D., 2020. New Scientist

Politics and Elections


Propaganda, distorted dialogue & a disrupted democratic process

Why It Matters

Social media platforms are incentivized to amplify the most engaging content, tilting public attention towards polarizing and often misleading material. By selling micro-targeting to the highest bidder, they enable manipulative practices that undermine democracies around the world.

Evidence

Fake news stories posted before the 2016 US elections were still in the top 10 news stories circulating across Twitter almost 2 years later, indicating the staying power of such stories and their long-term impact on ongoing political dialogue.

PRIVATE STUDY · Hindman, M., & Barash, V., 2018. Knight Foundation

More fake political headlines were shared on Facebook than real ones during the last 3 months of the 2016 US elections.

JOURNALISM · Silverman, C., 2016. Buzzfeed

Exposure to a fake political news story can rewire your memories: in a study in which over 3,000 voters were shown fake stories, many later not only “remembered” the fake stories as if they were real events but also “remembered” additional, rich details of how and when the events took place.

PEER-REVIEWED · Murphy, G., Loftus, E., Grady, R., Levine, L. J., & Greene, C. M., 2019. Psychological Science

The most popular news story of the 2016 US elections was fake. In fact, three times as many Americans read and shared it on their social media accounts as they did the top-performing article from the New York Times. (The fake news story alleged that the Pope endorsed Donald Trump for President).

JOURNALISM · Silverman, C., 2016. Buzzfeed

150 million

Americans were reached by Russian propaganda posts on Facebook during the 2016 US elections, according to Facebook's estimates.

JOURNALISM · Ackerman, S., 2017. The Daily Beast

Game theory analysis has shown how a few bots with extreme political views, carefully placed within a network of real people, can have a disproportionate effect within current social media systems. Studies demonstrate how an extremist minority political group can have undue influence using such bots—for example, reversing a 2:1 voter difference to win a majority of the votes.

PEER-REVIEWED · Stewart, A. J., Mosleh, M., & Diakonova, M., 2019. Nature

Analyzing over 2 million recommendations and 72 million comments on YouTube in 2019, researchers demonstrated that viewers consistently moved from watching moderate videos to extremist ones; simulation experiments run on YouTube revealed that its recommendation system steers viewers towards politically extreme content. The study describes its findings as "a comprehensive picture of user radicalization on YouTube."

CONFERENCE PROCEEDINGS · Ribeiro, M. H., Ottoni, R., West, R., Almeida, V. A. F., & Meira, W., 2020. Association for Computing Machinery

The order in which search engines present results has a powerful impact on users' political opinions. Experimental studies show that when undecided voters search for information about political candidates, more than 20% will change their opinion based on the ordering of their search results. Few people are aware of bias in search engine results, or of how their own choice of political candidate may have changed as a result.

PEER-REVIEWED · Epstein, R., & Robertson, R. E., 2015. Proceedings of the National Academy of Sciences

The outcomes of elections around the world are being manipulated ever more easily via social media: during the 2018 Mexican election, 25% of Facebook and Twitter posts were created by bots and trolls; during Ecuador's 2017 elections, President Lenin Moreno's advisors bought tens of thousands of fake followers; and China's state-run news agency, Xinhua, has paid for hundreds of thousands of fake followers, tweeting propaganda to the Twitter accounts of Western users.

JOURNALISM · Confessore, N., Dance, G., Harris, R., & Hansen, M., 2018. New York Times

The 2017 genocide in Myanmar was exacerbated by unmoderated fake news, with only 4 Burmese speakers at Facebook to monitor its 7.3 million Burmese users.

JOURNALISM · Stecklow, S., 2018. Reuters