The readings below are from the Center for Humane Technology.
These selected readings have been identified to further our understanding of the "Ledger of Harms" that can result from human interaction with technology and social media, and its impact on: How We Make Sense of the World, Our Social Relationships, Physical and Mental Health, Politics and Government, Systemic Oppression, Attention and Cognition, Future Generations, and How We Treat One Another.
“…of all extremist group joins are due to our recommendation tools…our recommendation systems grow the problem,” noted an internal Facebook presentation in 2016. Yet attempts to counteract this have been repeatedly ignored, diluted, or deliberately shut down by senior Facebook officers, according to a 2020 Wall Street Journal investigation. In 2018, Facebook managers told employees the company’s priorities were shifting “away from societal good to individual value.”
Fake news spreads six times faster than true news. According to researchers, this is because fake news grabs our attention more than authentic information: fake news items usually have higher emotional content and contain unexpected information, which means they are shared and reposted more often.
PEER-REVIEWED · Vosoughi, S., Roy, D., & Aral, S., 2018. Science
Reading a fake news item even once increases the chances of a reader judging it to be true when they next encounter it, even when the item has been labeled as suspect by fact-checkers or runs counter to the reader’s own political standpoint. The damage done by fake news items in the past continues to reverberate today. Psychological mechanisms such as these, twinned with the speed at which fake news travels, highlight our vulnerability, demonstrating how easily we can be manipulated by anyone planting fake news or using bots to spread their own viewpoints.
…of tweets about the coronavirus are from bots spreading fake information, according to research from Carnegie Mellon University. An analysis of more than 200 million tweets created since January 2020 identified more than 100 false narratives, including conspiracy theories that hospitals are full of mannequins. Researchers note that these posts appear to be aimed at sowing division within America, commenting, “We do know that it looks like a propaganda machine.”
As the pandemic has developed, there has been a significant increase in the posting of fake news and false information, even by human users, due to the algorithms underlying social media platforms. Researchers note that people naturally repost messages on the basis of their popularity rather than their accuracy. Fact-checking has been unable to keep pace. Such false information is particularly dangerous because, as noted above, it tends to be retained for a long time, irrespective of fact correction.
The primary driving force behind whether someone will share a piece of information is not its accuracy or even its content; the main reason we share a post is because it comes from a friend or a celebrity with whom we want to be associated. As humans, we’re often more concerned with status, popularity, and establishing a trusted circle of “friends” than with maintaining the truth. As a result, social media will inevitably be a space where truth is easily downgraded.
…of exposure to a conspiracy theory video reduces people’s pro-social attitudes (such as their willingness to help others), as well as their belief in established scientific facts.
Anger is the emotion that travels fastest and farthest on social media, compared to all other emotions. As a result, those who post angry messages will inevitably have the greatest influence, and social media platforms will tend to be dominated by anger.
Each word of moral outrage added to a tweet increases the rate of retweets by 17%. It takes very little effort to tip the emotional balance within social media spaces, catalyzing and accelerating further polarization.
Analysis indicates that bots wield a disproportionate influence, dominating social media platforms such as Twitter. An estimated 66% of tweeted links to popular websites are tweeted by bots, with this number climbing to 89% for popular news sites. In addition, bots out-tweet human users: in this study, the 500 most active bots were responsible for 22% of tweets, while the 500 most active human users accounted for only 6%. As a result, those who create bots can manipulate and artificially tilt the balance of shared social spaces.
Twitter now plays a key role in how journalists find news. According to a recent survey, many journalists see tweets as equally newsworthy compared to headlines from the Associated Press. As a result, the neutrality of the press can be easily undermined: on the one hand, professional journalists can be manipulated by bots and bad-faith actors; on the other, the chance of radical content, conspiracies, and other types of disinformation appearing in professional news articles is extremely high.
PEER-REVIEWED · McGregor and Molyneux, 2018. Journal of Journalism
An Oxford research study of 22 million tweets showed that Twitter users shared more “misinformation, polarizing, and conspiratorial content” than actual news stories.
Analysis indicates that foreign governments place and promote misinformation stories on multiple social media channels, creating the illusion of known truth emerging from diverse "independent" sources.
Using a wide range of deceptive techniques, malicious actors of all types use social media to rapidly advance their agendas. They have developed sophisticated media manipulation strategies, including hijacking existing memes and seeding false narratives widely. Manuals helping journalists and other media professionals defend against these strategies naturally lag far behind, and are only just starting to be developed in civil society.
Fake news items contain more anger than real news posts. According to research conducted with more than 1,000 active users of China’s Weibo platform, angry posts generate more anxiety, which in turn motivates readers to share them further. Analyzing over 30,000 posts on Weibo, the researchers found that fake news posts contained 17% fewer "joy" words but 6% more "anger" words compared to real news posts. They found similar trends in an analysis of 40,000 posts on Twitter.
Fake news stories posted before the 2016 US elections were still in the top 10 news stories circulating across Twitter almost 2 years later, indicating the staying power of such stories and their long-term impact on ongoing political dialogue.
More fake political headlines were shared on Facebook than real ones during the last 3 months of the 2016 US elections.
Exposure to a fake political news story can rewire your memories: in a study where over 3,000 voters were shown fake stories, many voters later not only “remembered” the fake stories as if they were real events but also "remembered" rich additional details of how and when the events took place.
The most popular news story of the 2016 US elections was fake. In fact, three times as many Americans read and shared it on their social media accounts as they did the top-performing article from the New York Times. (The fake news story alleged that the Pope endorsed Donald Trump for President).
…Americans were reached by Russian propaganda posts on Facebook during the 2016 US elections, according to Facebook's estimates.
Game theory analysis has shown how a few bots with extreme political views, carefully placed within a network of real people, can have a disproportionate effect within current social media systems. Studies demonstrate how an extremist minority political group can have undue influence using such bots—for example, reversing a 2:1 voter difference to win a majority of the votes.
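The mechanism can be sketched with a toy conformity model (a deliberate simplification, not the cited study's actual game-theoretic analysis; the 2:1 split, bot count, and per-round posting rates below are illustrative assumptions): human voters repeatedly adopt whichever opinion dominates their shared feed, while a small bloc of bots holds a fixed extreme opinion and posts far more often than any human, distorting the perceived majority.

```python
# Toy model: humans conform to the opinion that appears dominant in a
# shared feed; bots never change and always push opinion "B".

def perceived_majority(humans, n_bots, posts_per_bot):
    """Opinion dominating the feed this round: each human posts once,
    each bot posts `posts_per_bot` times, always for opinion "B"."""
    a_posts = sum(1 for o in humans if o == "A")
    b_posts = sum(1 for o in humans if o == "B") + n_bots * posts_per_bot
    return "A" if a_posts > b_posts else "B"

def run_rounds(humans, n_bots, posts_per_bot, rounds=3):
    """Each round, every human adopts the feed's apparent majority."""
    for _ in range(rounds):
        majority = perceived_majority(humans, n_bots, posts_per_bot)
        humans = [majority] * len(humans)
    return humans

# 90 human voters favour opinion A over B two to one.
voters = ["A"] * 60 + ["B"] * 30

# Without bots, the true majority (A) prevails.
print(run_rounds(voters, n_bots=0, posts_per_bot=0).count("A"))   # 90

# Ten bots posting five times per round make B look like the majority
# (60 A-posts vs. 30 + 50 = 80 B-posts), flipping every voter to B.
print(run_rounds(voters, n_bots=10, posts_per_bot=5).count("B"))  # 90
```

The point of the sketch is that the bots never change a single vote directly: they only inflate how popular their opinion *appears*, and conformity dynamics do the rest, reversing a 2:1 real-world majority.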
Analyzing over 2 million recommendations and 72 million comments on YouTube in 2019, researchers demonstrated that viewers consistently moved from moderate to extremist videos; simulation experiments on YouTube revealed that its recommendation system steers viewers towards politically extreme content. The study describes its findings as "a comprehensive picture of user radicalization on YouTube".
The order in which search engines present results has a powerful impact on users' political opinions. Experimental studies show that when undecided voters search for information about political candidates, more than 20% will change their opinion based on the ordering of their search results. Few people are aware of bias in search engine results or how their own choice of political candidate changed as a result.
The outcomes of elections around the world are being more easily manipulated via social media: during the 2018 Mexican election, 25% of Facebook and Twitter posts were created by bots and trolls; during Ecuador's 2017 elections, advisors to President Lenin Moreno bought tens of thousands of fake followers; China's state-run news agency (Xinhua) has paid for hundreds of thousands of fake followers, tweeting propaganda to the Twitter accounts of Western users.
The 2017 genocide in Myanmar was exacerbated by unmoderated fake news, with only 4 Burmese speakers at Facebook to monitor its 7.3 million Burmese users.