May 4th, 2021:
“If you’re not paying for the product, then you are the product.”
This quote from Tristan Harris, co-founder of the Center for Humane Technology and former Google design ethicist, referring to social media networks, sets the tone for the fairly new Netflix documentary “The Social Dilemma.” Because people of our generation are so fixated on social media, the film has raised concerns among those of us who use it regularly. The documentary explores how tech companies and their social media networks are designed to track users’ behavior, exploiting the psychology of the human brain to sell targeted ads and build algorithms that keep people glued to their phones. The film goes on to show how these platforms have influenced other societal issues such as teenage depression, family relationships, suicide rates and, more notably, misinformation and political polarization.
Understandably, the interviewees seem hesitant before they start talking, even a little embarrassed and sorry about what they have created. Some admit to having fallen prey to their own inventions.
Throughout the film, which puts a heavy emphasis on algorithms, tech experts, researchers, and former employees of Silicon Valley companies such as Google and Facebook elaborate on their own creations and on how their platforms were designed to manipulate and hijack our brains so that we feel a constant need to check our phones and scroll through popular social networks. Social media, while meant to bring positivity and connectivity, has instead controlled us, polarized us, exploited us, and turned our attention into profit, according to this documentary.
While the documentary explains the negative aspects that come with the use of social media, that is all it focuses on. Because humans are naturally social beings, there are plenty of positive aspects to social media as well, such as connecting with others and maintaining long-distance relationships, said Chrysalis Wright, PhD, an associate lecturer and director of the Media and Migration Lab at the University of Central Florida.
“One thing that consumers and users need to be aware [of] is that there's pros and cons to everything,” said Dr. Wright. “[The Social Dilemma] kind of missed out on that opportunity to give viewers a well-balanced view so that they can make the best informed decision for themselves.”
Moreover, these tech experts offer only a few ineffective solutions, such as turning off notifications, deleting overused apps, following people with different opinions to expose yourself to a range of views, and fact-checking what you see online before sharing it. These are superficial, unconvincing fixes that many millennials and Gen-Z’ers will not subject themselves to, especially because most are already aware of the algorithms that drive addiction and still choose to use social media. If social media really is ruining everyone’s lives as quickly as the film suggests, then its experts should have more to say about resolving the very problems they claim to have created, particularly since the film has not stopped most people (myself included) from using social media every day. For example, social media platforms should better understand how fake news makes its way around their platforms, said Dr. Wright.
“I think consumers need to be more educated on social media and misinformation and fake news that can be presented on the platform,” said Dr. Wright. “In addition to that, I think that social media companies need to be more responsible for the algorithms that they have set up [which] can be very problematic.”
Misinformation and political polarization are both worth noting given our current political climate, the COVID-19 pandemic, and the social networking age. Some of the tech experts in the film attribute the spread of misinformation and political polarization to the way their platforms’ algorithms were designed. Specifically, algorithms were designed to “promote content that sparks outrage, hate, and amplifies biases within the data that we feed them,” says the film’s website. According to the film, the algorithms become increasingly accurate and tailored to the user’s interests, which leads to the appearance of similar content in the feed, whether or not it is accurate or truthful.
“I think it really goes into the way and the reason why the technology was designed as it was,” said Jeff Orlowski, the director of the film, at an event hosted by the Los Angeles Press Club. “It really was [that] anybody can put anything into the system, regardless of truth, regardless of authenticity, and built into the system [are] algorithms that were designed to amplify particular content.”
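To make that amplification concrete, here is a rough, purely illustrative sketch in Python (my own toy assumptions, with made-up names such as rank_feed and made-up weights, not any platform’s actual code) of the kind of engagement-driven ranking the film describes: posts are ordered by how likely they are to hold a user’s attention, with provocative posts and posts resembling past clicks scoring highest, and accuracy never entering the calculation.

from dataclasses import dataclass

@dataclass
class Post:
    topic: str        # e.g. "politics" or "sports"
    outrage: float    # 0 to 1: how provocative the post is
    accuracy: float   # 0 to 1: never consulted by the ranker

def rank_feed(posts, clicks_by_topic):
    # Score each post by predicted engagement: affinity for the topic
    # (based on the user's past clicks) plus a bonus for provocative content.
    def score(post):
        affinity = clicks_by_topic.get(post.topic, 0.0)
        return 0.6 * affinity + 0.4 * post.outrage   # made-up weights
    return sorted(posts, key=score, reverse=True)

feed = rank_feed(
    [Post("politics", outrage=0.9, accuracy=0.2),
     Post("politics", outrage=0.2, accuracy=0.9),
     Post("sports", outrage=0.1, accuracy=0.9)],
    clicks_by_topic={"politics": 0.8, "sports": 0.1},
)
print([(p.topic, p.outrage) for p in feed])   # the provocative political post ranks first

Even in a toy like this, the feed drifts toward whatever the user already reacts to, and toward the most provocative version of it, which is the dynamic the film warns about.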
As Orlowski suggests, the tech industry is not regulated the way other industries are, and the result is the spread of exaggerated misinformation and fake news. According to an article from Stanford’s School of Engineering, fake news spreads like a virus: exposure to numerous pieces of fake news can erode a person’s resistance and make them more susceptible to believing and sharing that misinformation. Fake news also spreads faster through bots and trolls, and when it is aimed at a few people who have lots of connections and followers. This misinformation, seen on every social media platform, is customized to fit your views so that you see only what you want to see, intensifying those views until you are less willing to tolerate opposing ones. Combine all of this with the finding of an MIT study that false news spreads roughly six times faster than true stories, and the result is political polarization.
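The virus analogy can be illustrated with an equally small, made-up simulation (my own toy numbers, not the Stanford researchers’ actual model), in which each exposure to a false story wears down a person’s resistance until they believe and share it:

import random

random.seed(0)

def simulate(num_people=1000, num_exposures=5000, erosion=0.8):
    resistance = [1.0] * num_people   # everyone starts fully skeptical
    believers = set()
    for _ in range(num_exposures):
        person = random.randrange(num_people)
        resistance[person] *= erosion             # each exposure erodes skepticism
        if random.random() > resistance[person]:  # lower resistance, higher chance of belief
            believers.add(person)                 # the person now believes and may share
    return len(believers)

print(f"{simulate()} out of 1000 people end up believing the false story")

With these invented numbers, a few thousand exposures are enough to convert a large share of the population, which is roughly the dynamic the Stanford article describes.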
“When we see false information on social media, most of the time it elicits a very emotional reaction from us almost instantly,” said Dr. Wright. “The headlines, the language that's used, the pictures and imagery that's used, all of those things draw our attention to it much easier than a fact-based news story from a reputable news source. They don't tend to use the same type of language and imagery and emotion that a false news story might.”
All of this is rooted in the business model of these social media platforms, which profit from misinformation, as stated in the documentary.
“They have control over what types of ideas are being released and how they curate their algorithms and how they're twisting the dials,” said Orlowski. “Yet they're not willing to tamper down on the bad stuff because it's bad for their business model.”
Misinformation has played a huge role in our understanding of the coronavirus and how we deal with it, and now we are seeing fake news about the new vaccines. It has also contributed to the polarization of the pandemic, visible in the policies being implemented to curb the virus, in our behaviors toward it, and in the extreme partisan politics that shape each person’s reaction to it.
What’s interesting is that the fake news from earlier in the pandemic did not originally come from citizens in the United States, said Dr. Wright.
“The majority of it actually originates from people who are foreign entities who have a different type of goal,” said Dr. Wright. “Their goal in posting divisive content about shutting down businesses, social distancing, and now we're looking at stuff about the vaccine [which] is to make us argue and fight among ourselves.”
According to a report from the Brookings Institution that used data from the Franklin Templeton-Gallup Economics of Recovery Study, political affiliation shapes people’s understanding of the pandemic, their interpretation of facts, and how they respond to and engage in safety measures more than factors that should matter more, such as local exposure and demographic characteristics like age and pre-existing conditions that affect a person’s ability to fight off the virus. The report concludes that the economic damage from the pandemic would have been less severe were it not for partisan politics combined with sensationalism and distortion in the media, and it finds that people who get their news primarily from social media had the most inaccurate perceptions of the virus.
Even scarier is the fact, noted earlier, that social media companies aren’t regulated the way other industries are, even though they have become a main source of information and a means of reach and distribution, said Larissa Rhodes, producer of The Social Dilemma. In keeping with freedom of speech, anyone can publish or post anything, extreme or not. Fixing this issue will start at the individual level, but that alone is not going to be enough, said Rhodes.
“I can go to the grocery store and use reusable bags as long as I want, but if they're still producing plastic, that problem is never going to be fixed,” said Rhodes. “I think we really need to look at what the information ecosystem is.”
Sources:
1. Andrews, Edmund. “How Fake News Spreads Like a Real Virus.” Stanford School of Engineering, 9 Oct. 2019, engineering.stanford.edu/magazine/article/how-fake-news-spreads-real-virus.
2. Dizikes, Peter. “Study: On Twitter, False News Travels Faster than True Stories.” MIT News, Massachusetts Institute of Technology, 8 Mar. 2018, news.mit.edu/2018/study-twitter-false-news-travels-faster-true-stories-0308.
3. Orlowski, Jeff, and Larissa Rhodes. The Social Dilemma. Exposure Labs, 2020.
4. Orlowski, Jeff, and Larissa Rhodes. Los Angeles Press Club event on The Social Dilemma, 1 Feb. 2021.
5. Rothwell, Jonathan, and Sonal Desai. “How Misinformation Is Distorting COVID Policies and Behaviors.” Brookings Institution, 30 Dec. 2020, www.brookings.edu/research/how-misinformation-is-distorting-covid-policies-and-behaviors/.
6. Wright, Chrysalis. Personal interview. 27 Apr. 2021.