
Photos of Twitter CEO Jack Dorsey, Google/Alphabet CEO Sundar Pichai and Facebook CEO Mark Zuckerberg from 2018-2020 (AP Photo/Jose Luis Magana, LM Otero, Jens Meyer)

The Case Against Social Media — Part 2

How tech companies are threatening our democracy

Editor’s Note: This is the second part of a two-part series on social media. Read the first part here.

“As long as social media companies profit from outrage, confusion, addiction and depression, our well-being and democracy will continue to be at risk.” — The Center for Humane Technology

It is clear that social media use has negatively impacted our perception of truth and reality. One of the most notable problems with social media companies is their reputation as information and news platforms, even as massive amounts of fake news overwhelm truthful reporting on them.

A 2018 peer-reviewed study from the Journal of Experimental Psychology defined fake news as “entirely fabricated and often partisan content that is presented as factual.” The phenomenon of fake news was brought to the forefront during the 2016 election cycle and has become a defining phrase of our president’s vocabulary. 

According to a peer-reviewed study, “Fake news spreads six times faster than true news. … this is because fake news grabs our attention more than authentic information: fake news items usually have a higher emotional content and contain unexpected information which inevitably means that they will be shared and reposted more often.” Another peer-reviewed study elaborated on this finding, demonstrating that each additional word of moral outrage in a tweet increases its retweet rate by 17%, supporting the conclusion that “it takes very little effort to tip the emotional balance within social media spaces, catalyzing and accelerating further polarization.”
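To see how quickly that 17% figure compounds, consider a minimal sketch. The baseline retweet count and the purely multiplicative form below are illustrative assumptions, not the study’s actual statistical model.

```python
# Illustrative only: a simple compounding model of the "+17% per
# moral-outrage word" finding. The baseline of 100 retweets is an
# invented figure for demonstration.
BASE_RETWEETS = 100
PER_WORD_MULTIPLIER = 1.17  # +17% per additional moral-outrage word

for n_words in range(6):
    expected = BASE_RETWEETS * PER_WORD_MULTIPLIER ** n_words
    print(f"{n_words} outrage words -> ~{expected:.0f} expected retweets")
```

Under these assumptions, five such words more than double a tweet’s expected reach, which suggests why emotionally charged posts come to dominate feeds.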

Of course, misinformation is impossible to snuff out entirely. However, the extreme levels of misinformation we experience today are largely the result of social media algorithms that amplify the most engaging content to maximize company profits. The result is a tilt of public attention toward polarizing and frequently misleading content.
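A caricature of such engagement-based ranking makes the incentive problem concrete. Everything below, from the Post fields to the score weights, is hypothetical; real ranking systems are vastly more complex, but the key point holds: truthfulness never enters the objective.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float
    predicted_shares: float

def engagement_score(post: Post) -> float:
    # Shares are weighted more heavily than clicks; note that accuracy
    # appears nowhere in the objective.
    return post.predicted_clicks + 3.0 * post.predicted_shares

def rank_feed(posts: list[Post]) -> list[Post]:
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured, accurate policy analysis", predicted_clicks=40, predicted_shares=2),
    Post("Outrageous claim you won't believe", predicted_clicks=60, predicted_shares=30),
])
print([p.text for p in feed])  # the outrageous post ranks first
```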

Nor does this content simply vanish into the digital realm, as a 2018 private study on the longevity of fake news demonstrates: “Fake news stories posted before the 2016 US elections were still in the top 10 news stories circulating across Twitter almost two years later, indicating the staying power of such stories and their long-term impact on ongoing political dialogue.”

Guillaume Chaslot, a former engineer at YouTube, explains how these recommendation algorithms work in the documentary “The Social Dilemma”: “People think the algorithm is designed to give them what they really want, only it’s not. The algorithm is actually trying to find a few rabbit holes that are very powerful, trying to find which rabbit hole is the closest to your interest. And then if you start watching one of those videos, then it will recommend it over and over again.”

“Reading a fake news item even once increases the chances of a reader judging that it is true when they next encounter it, even when the news item has been labeled as suspect by fact-checkers or is counter to the reader’s own political standpoint,” a study from the Journal of Experimental Psychology stated. “Psychological mechanisms such as these, twinned with the speed at which fake news travels, highlight our vulnerability, demonstrating how we can easily be manipulated by anyone planting fake news or using bots to spread their own viewpoints.”

Chaslot continued, “The flat-Earth conspiracy theory was recommended hundreds of millions of times by the algorithm. It’s easy to think that it’s just a few stupid people who get convinced, but the algorithm is getting smarter and smarter every day. So today, they’re convincing people that the Earth is flat, but tomorrow, they will be convincing you of something that’s false.”
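A toy model captures the greedy dynamic Chaslot describes. The policy below, recommending whichever topic has accumulated the most watch time, is a simplification for illustration, not YouTube’s actual recommender.

```python
from collections import defaultdict
import random

watch_time = defaultdict(float)  # topic -> cumulative seconds watched

def record_watch(topic: str, seconds: float) -> None:
    watch_time[topic] += seconds

def recommend() -> str:
    # Greedy policy: keep serving whichever topic has held the user
    # longest, so one long conspiracy video can dominate every
    # suggestion that follows.
    if not watch_time:
        return random.choice(["news", "sports", "music"])
    return max(watch_time, key=watch_time.get)

record_watch("flat-earth", 1200.0)      # one curious click
print([recommend() for _ in range(3)])  # recommended over and over
```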


Kyrie Irving, basketball player for the Boston Celtics and formerly for the Cleveland Cavaliers, believed the Earth was flat. He later apologized in 2018 at Forbes’ Under 30 Summit for spreading the conspiracy theory, blaming his belief on the “YouTube rabbit hole.” Tomorrow, like Irving, you may come to believe false information without even realizing it.

Chaslot’s warning applies directly to the present. Misinformation spreads not only about COVID-19 and its effects but also about its supposed cures, most notably the false assertion that the virus could be treated with disinfectants like bleach.

In “The Social Dilemma,” Justin Rosenstein, a former engineer at Google and Facebook, explained how misinformation spreads even to Google’s search engine: “When you go to Google and type in ‘Climate change is,’ you’re going to see different results depending on where you live. In certain cities, you’re going to see it autocomplete with ‘climate change is a hoax.’ In other cases, you’re gonna see ‘climate change is causing the destruction of nature.’ And that’s a function not of what the truth is about climate change, but about where you happen to be Googling from and the particular things Google knows about your interests.”
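Rosenstein’s point is that completions are personalized, not verified. The sketch below illustrates the idea with invented data; it does not reflect Google’s actual systems or ranking logic.

```python
# Hypothetical illustration: the same prefix completes differently for
# different user profiles. The suggestion data is invented.
SUGGESTIONS = {
    "region_a": ["climate change is a hoax"],
    "region_b": ["climate change is causing the destruction of nature"],
}

def autocomplete(prefix: str, user_region: str) -> list[str]:
    # Completions are conditioned on who is asking, not on what is true.
    return [s for s in SUGGESTIONS.get(user_region, []) if s.startswith(prefix)]

print(autocomplete("climate change is", "region_a"))
print(autocomplete("climate change is", "region_b"))
```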

“The order in which search engines present results has a powerful impact on users’ political opinions,” a 2015 National Academy of Sciences study stated regarding the influence of search engines on political opinions and elections. “Experimental studies show that when undecided voters search for information about political candidates, more than 20% will change their opinion based on the ordering of their search results. Few people are aware of bias in search engine results or how their own choice of political candidate changed as a result.”
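The mechanism behind that finding is often modeled as position bias: attention decays sharply with rank, so ordering alone steers which claims get seen at all. A minimal sketch, with an assumed decay constant rather than one taken from the cited study:

```python
# Position-bias sketch: the probability that a user even examines a
# result decays with rank. The decay constant of 0.6 is illustrative.
def examine_probability(rank: int, decay: float = 0.6) -> float:
    return decay ** (rank - 1)  # rank 1 is examined most often

for rank in range(1, 6):
    print(f"rank {rank}: examined ~{examine_probability(rank):.0%} of the time")
```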

The implications? These tech companies’ methods are startling when one considers how rogue states like Russia or China may use these resources to spread misinformation and spark controversy. “The manipulation by third parties is not a hack,” says Roger McNamee, an early investor in Facebook, in the documentary. “The Russians didn’t hack Facebook, they used the tools that Facebook created for legitimate advertisers and legitimate users, and used it for a nefarious purpose.”

“One of the problems with Facebook is that, as a tool of persuasion, it may be the greatest thing ever created,” McNamee continued. “Now, imagine what that means in the hands of a dictator or an authoritarian. If you want to control the population of your country, there has never been a tool as effective as Facebook.”

Daniel Palmer, professor of Computer Science and Mathematics at John Carroll, was struck by the revelations. “It [the documentary] is the first thing that I have seen that actually helps to explain the current state of politics in this country,” he said. “My area of expertise is swarm intelligence, designing programs to solve problems by mimicking the behaviors of biological groups — ant colonies, flocks of birds, schools of fish. It deals with intelligent decisions being made by large collections of simple ‘agents.’ There were parts of the documentary that overlapped with some of these ideas, like changing the opinions and behaviors of a portion of the population to get more profit or generate conflict.”
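The overlap Palmer notes can be made concrete with a toy agent-based model: many simple agents each conform slightly to a random peer, while a small targeted “push” is applied to a fraction of them at every step. All parameters here are invented for illustration; this is a sketch in the spirit of his remark, not his research.

```python
import random

def step(opinions: list[float], push: float, pushed_frac: float) -> list[float]:
    updated = []
    for op in opinions:
        neighbor = random.choice(opinions)  # local social influence
        op += 0.1 * (neighbor - op)         # drift toward a random peer
        if random.random() < pushed_frac:   # targeted nudge, e.g. a ranked feed
            op += push
        updated.append(max(-1.0, min(1.0, op)))
    return updated

opinions = [random.uniform(-1, 1) for _ in range(1000)]  # opinions in [-1, 1]
for _ in range(200):
    opinions = step(opinions, push=0.02, pushed_frac=0.1)
print(f"mean opinion after sustained nudging: {sum(opinions) / len(opinions):+.2f}")
```

Even a tiny, persistent nudge applied to a minority of agents shifts the whole population’s mean opinion, which is precisely the dynamic the documentary warns about.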

As election news and information rolls in, think about the validity of the information you encounter, most notably on platforms like Facebook. Don’t trust social media and technology companies to have your best interests at heart when their algorithms feed you only the most profitable information, regardless of veracity. These companies are putting our democracy at risk by following this algorithm-driven model of operation. For more information, visit the Ledger of Harms page on the Center for Humane Technology’s website, which details the studies discussed in this column and provides additional resources.
