Research Poster: Facebook is Causing Detrimental Harm to Humanity
An extended abstract and research poster about the harm that Facebook is causing humanity.
Abstract
The service we now know as Facebook is a multi-billion dollar corporation. For much of the world it is the primary or only news source. It is information and history. It is science and current events. Politics. Religion. Friends. Family. Business. To me, Facebook is something else entirely. It is a series of seemingly never-ending questions. For example, have you ever stopped to ask yourself whether it is truth? By that I mean: is all of what you see, read, hear, and watch on Facebook accurate? [16, 23] Why does it show you the ads it shows you? [1] What about its suggestions for friends to add? Does being on Facebook make you mad? Does it make you feel inferior? What about sad? Have you ever had the impulse to ‘quit’ Facebook? Did you go through with it? Did you come crawling back for fear of missing out?
Author Keywords
Facebook; Social Networks; Data Privacy; Health; Regulation.
ACM Classification Keywords
K.4.1 Public Policy Issues; K.4.2 Social Issues; K.5.2 Governmental Issues.
Introduction
In 2003, a bored college student created FaceMash, a “hot or not” website built from pictures of his fellow students. FaceMash encouraged visitors to compare the photos of two female students side by side and decide who was or wasn’t visually appealing. Within months the concept evolved into something more mature, and in early 2004 “TheFacebook” was launched with the purpose of digitally connecting people on Mark Zuckerberg’s Harvard University campus. TheFacebook would eventually extend membership to its free online service to other colleges, drop the “The” from its name, and open its doors to the general public. Then it went public. And data, privacy, politics, ethics, our health, and even humanity have never been the same.
Harmful for Emotional, Intellectual, Mental, and Social Health
Numerous reports and studies show that using Facebook’s services can contribute to emotional, intellectual, mental, and social health problems. Some of these studies describe increases in anxiety, stress, depression, feelings of inferiority, anger, or all of the above [4, 9, 14, 15, 22].
But even without the backing of detailed studies and research, I believe that many Facebook users have experienced one or more of these feelings. Simply ask them. Before I quit the service and deleted my account, I had certainly felt this range of emotions at various times.
Adults are one thing; now let’s discuss Facebook in terms of younger and more impressionable minds. If you are a parent or a teacher, you already know how
Facebook’s three core software offerings (Facebook, Instagram, and WhatsApp) can be the conduit through which teenagers’ thoughts, behaviors, and moods are heavily influenced. Have you considered whether it is helpful or harmful for our kids to have these things in their lives? If permitted, at what age should they be allowed to use them? When they break rules, do they get grounded from these services as punishment? What is their reaction? Do they create accounts that you don’t know about? Do they lie about their age to sign up? Who is to blame when kids under the age of 13 (Facebook’s stated minimum age) create an account? Why is there not better verification in the account creation process?
On the afternoon of May 15, 2019, a 16-year-old girl shared a poll on Instagram asking her followers to help her choose life or death. 69% of the respondents voted for “D,” meaning death. Five hours after posting the poll, she was found dead [26].
Young, impressionable, still-forming minds are not equipped to constantly consume the images and words that their peers, movie stars, or supermodels post [25]. Our kids are caught in a cycle of constant comparison and inferiority. Compounding this cycle is the fear of missing out (FOMO), which drives the need to constantly “check our phones” to see who has posted what, who replied to whom, or how many likes one of our own postings received. When you get positive feedback on social media, you experience tiny doses of dopamine that light up your pleasure center, and those feelings bring you back again and again. When the feedback you receive is constantly negative or harmful, it can destroy your self-confidence and sense of self-worth. Amplify little words of hurt and hate over and over, and any human being, whether child, teenager, or adult, could lose their sense of purpose.
A News Feed of Propaganda and Lies
Another primary way that Facebook is causing harm to the world comes in the form of disinformation. You didn’t hear much about this damaging aspect of Facebook until the United States’ presidential election of 2016 was in full swing [20]. As we learned in more detail in the years following the election, America’s Facebook users were trolled by Russia’s Internet Research Agency, the Kremlin-backed enterprise whose primary purpose is to spread disinformation and fake news throughout the Internet. Posing as Americans on Facebook, its operatives created fictitious accounts, group pages, and ads, which they used to sow discord and unrest by posting politically charged content that was entirely fabricated [2, 16]. The fake news created by these accounts spread even more quickly through Facebook’s “boosted posts” feature, which mixed and mingled the ads in with friends’ posts in users’ news feeds. These posts normally had a call to action such as “Share if you agree” or a request to join the fake group.
These fake accounts, fake groups, and fake ads might never have appeared if Facebook did actual, proper vetting and had a rigorous system of checks and balances to ensure users and content are real. A simple way to do this would be to require that any account, post, or ad be verified and/or moderated before it can be seen on the site, as sketched below.
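To make that gate concrete, here is a minimal, purely hypothetical sketch (in Python) of “verify and moderate before anything is visible”: every submission starts hidden, a reviewer must approve it, and only approved items are ever shown. None of the names or checks below reflect Facebook’s actual systems; they are invented solely for illustration.

# Hypothetical sketch of a "verify before it is visible" gate.
# Nothing here reflects Facebook's real systems; all names are invented.
from dataclasses import dataclass
from enum import Enum
from typing import Callable, List

class Status(Enum):
    PENDING = "pending"      # submitted, not yet visible to anyone
    APPROVED = "approved"    # passed review, may appear in feeds
    REJECTED = "rejected"    # failed review, never shown

@dataclass
class Submission:
    author_verified: bool    # did the author pass identity verification?
    content: str
    status: Status = Status.PENDING

class ModerationQueue:
    def __init__(self) -> None:
        self._items: List[Submission] = []

    def submit(self, item: Submission) -> None:
        # Every account, post, or ad enters the queue unpublished.
        self._items.append(item)

    def review(self, approve: Callable[[str], bool]) -> None:
        # A reviewer (human or automated) decides before anything goes live.
        for item in self._items:
            if item.status is Status.PENDING:
                ok = item.author_verified and approve(item.content)
                item.status = Status.APPROVED if ok else Status.REJECTED

    def visible(self) -> List[Submission]:
        # Only approved items are ever shown on the site.
        return [i for i in self._items if i.status is Status.APPROVED]

if __name__ == "__main__":
    queue = ModerationQueue()
    queue.submit(Submission(author_verified=True, content="Photo from a family trip"))
    queue.submit(Submission(author_verified=False, content="Share if you agree!"))
    queue.review(approve=lambda text: "share if you agree" not in text.lower())
    print([s.content for s in queue.visible()])  # -> ['Photo from a family trip']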
But Facebook doesn’t do it this way. Its moderation is so lax that it even lets you post video in real time. Earlier this year, the New Zealand mosque shootings were live-streamed on Facebook, and video of the attack was left up for almost an hour afterward [8].
The Arguments for Regulation and a Breakup
With all of the damage that Facebook, and other social media and technology companies, can cause or contribute to, we need our lawmakers (along with leaders in the tech industry, law enforcement, data scientists, ethicists, and others) to begin laying the groundwork of rules and regulations for how companies may impact our digital and physical livelihood. Other jurisdictions (notably the European Union) have taken large steps in the right direction when it comes to their citizens’ data privacy and protections. With each new data breach, privacy gaffe, or “I’m sorry, we need to do better” statement by Facebook, it becomes increasingly clear that the United States should be striving to do the same, and more, to protect its citizens from these digital dangers. As the country that birthed the Internet, Facebook, and other tech giants, the United States should be taking the lead on standing up for the digital rights of its inhabitants, not taking the back seat. We need impactful legislation that will protect citizens’ digital lives, starting with a digital consent law. Such laws should protect users’ data, rights, privacy, and ownership of their information, and should not allow that information to be sold, shared, or transmitted without full consent.
In early 2019, Facebook set aside $3 billion for an expected fine from the Federal Trade Commission (FTC) over privacy violations, violations that broke a consent decree reached with the agency in 2011 (and finalized in 2012) stating that Facebook was not to collect personal data and share it without user consent. A fine of this size would be the largest the FTC has ever levied against a technology company, but for an organization that took in over $15 billion in revenue in the very quarter it budgeted for this fine, is it really a penalty or a deterrent? Extrapolate that $15 billion over four quarters and the fine equals 5% of the company’s annual revenue. That is not the right punishment for the crime. Financial penalties to rein in Facebook need to be much, much more severe for them to make an impact.
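As a quick sanity check of that percentage, using only the figures cited above (a $3 billion set-aside against roughly $15 billion of revenue per quarter):

\[
  \frac{\text{fine}}{\text{annualized revenue}}
  \approx \frac{\$3\ \text{billion}}{4 \times \$15\ \text{billion}}
  = \frac{3}{60}
  = 5\%
\]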
If you think the calls to regulate Facebook are overblown, you need look no further than the CEO of Facebook himself, Mark Zuckerberg, who in late March of this year wrote an op-ed in The Washington Post essentially begging the United States government to step in when it comes to Internet regulation and data privacy laws. The company cannot — or more accurately, will not — police itself.
High-Ranking Facebook Employees Are Leaving, Speaking Out
While public calls to regulate or break up Facebook are getting louder, a handful of voices should be heard more clearly than the rest: those of former Facebook employees. Mike Krieger and Kevin Systrom founded Instagram in 2010, and in 2012 they sold the company to Facebook for $1 billion. Facebook was losing users to Instagram, so rather than trying to beat it with better features or a better user experience, Facebook simply bought it. After the FTC approved the purchase, Instagram’s two originators joined the Facebook team. Nearly six years later, in late 2018, both resigned [3].
A similar set of circumstances played out when Brian Acton and Jan Koum sold WhatsApp, their messaging application, to Facebook for $16 billion in 2014. In March 2018, after Acton had left the company, he posted a short tweet that said simply “It is time. #deletefacebook” [18].
There are numerous other ex-Facebook employees who are starting to speak out against what the company is doing and how it operates, including the former head of growth, Chamath Palihapitiya. He said in a late 2017 talk at the Stanford Graduate School of Business: “I think we have created tools that are ripping apart the social fabric of how society works” and encouraged the audience to take a “hard break” from social media [24].
Other high-ranking people to leave the company in early 2019 include WhatsApp VP Chris Daniels and Chris Cox, who served as Chief Product Officer and had been with Facebook since 2005 [12]. And just recently, on May 6, 2019, Chris Hughes, one of Facebook’s original co-founders, wrote an extensive op-ed in The New York Times titled “It’s Time to Break Up Facebook” [10].
Conclusion
After years of security-related missteps, data privacy issues, user account breaches and thefts [5, 6, 7], broken promises, and questionable ethics [21], we have more than enough proof that Facebook and its subsidiaries Instagram and WhatsApp are harming humanity in numerous ways. We have no reason to keep trusting an organization that does these things, and we cannot willingly and mindlessly continue to feed it all of our thoughts, pictures, videos, live-streams, and emotions. Facebook captures this information, organizes it, and ultimately sells it as advertising, where it becomes a weapon aimed at our senses, used either to tear down what we believe or, on the flip side, to build an echo chamber so deep that the mind goes numb to analytical thought. If we see the same headlines in Facebook’s “News” feed (and I use that term loosely) over and over again, we are conditioned to consume them as reality. We scroll, scroll, scroll our way through life, looking for what? Knowledge? Entertainment? Certainly not truth [23].
We all need to stop and consciously think about how much influence this company has on our lives and, more importantly, how much control it has over our digital footprint, our digital privacy, and our minds. We need to wake up. We need to pay attention and understand that Facebook went from being a social tool intended to keep people in contact with one another to an advertising juggernaut, or more accurately, a data-collecting, privacy-demolishing, critical-thinking-destroying wrecking ball that cannot or will not be stopped. It chooses not to regulate itself. It is asking the government to do it. Former employees speak about the harm the company has caused and continues to cause.
From an ethical standpoint, Facebook went from bad to worse when it went public and the need for revenue began to drive the decisions behind what the service is and what it does. When user growth stalled or the competition was too tough to beat, Facebook simply bought the competition. With Facebook, Instagram, and WhatsApp, one company has its hooks into humanity’s thoughts and emotions in the form of text posts, chats, pictures, videos, ads, comments, likes, replies, and any other data that people give up freely in the name of sharing, and it uses that data to make money. This is a dangerous business model on many fronts, and Facebook is causing detrimental harm to humanity by operating this way.
References
- Ali, Muhammad, Piotr Sapiezynski, Miranda Bogen, Aleksandra Korolova, Alan Mislove, and Aaron Rieke. “Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes.” arXiv preprint arXiv:1904.02095 (2019).
- Cadwalladr, Carole, and Emma Graham-Harrison. “Revealed: 50 million Facebook profiles harvested for Cambridge Analytica in major data breach.” The Guardian (2018).
- Dwoskin, Elizabeth “Instagram co-founders resign in latest Facebook shake-up” The Washington Post (2018).
- Flick, Catherine. “Informed consent and the Facebook emotional manipulation study.” Research Ethics 12, no. 1 (2016): 14-28.
- Frier, Sarah, Day, Matt, and Eidelson, Josh “Millions of Facebook Records Found on Amazon Cloud Servers” Bloomberg (2019).
- Glaser, April “Another 540 Million Facebook Users’ Data Has Been Exposed” Slate (2019).
- Glaser, April “Why Facebook’s Latest Privacy Snafu Is Particularly Gross” Slate (2019).
- Hoyle, Rhiannon, and Mandhana, Niharika “Facebook Left Up Video of New Zealand Shootings for an Hour” The Wall Street Journal (2019).
- Hoyle, Roberto, Das, Srijita, Kapadia, Apu, Lee, Adam J., and Vaniea, Kami. Was my message read?: Privacy and Signaling on Facebook Messenger. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI ‘17). ACM, New York, NY, USA, 3838-3842. DOI: https://doi.org/10.1145/3025453.3025925.
- Hughes, Christopher “It’s Time to Break Up Facebook” The New York Times (2019).
- Hull, Gordon. “Successful failure: what Foucault can teach us about privacy self-management in a world of Facebook and big data.” Ethics and Information Technology 17, no. 2 (2015): 89-101.
- Jowitt, Tom “Facebook Hit By More Senior Resignations” MSN (2019).
- Lazer, David M. J., Matthew A. Baum, Yochai Benkler, Adam J. Berinsky, Kelly M. Greenhill, Filippo Menczer, Miriam J. Metzger et al. “The science of fake news.” Science 359, no. 6380 (2018): 1094-1096.
- Lee-Won, R. J., Herzog, L., and Park, S. G. “Hooked on Facebook: The Role of Social Anxiety and Need for Social Assurance in Problematic Use of Facebook.” Cyberpsychology, Behavior, and Social Networking 18, no. 10 (2015): 567-574. https://doi.org/10.1089/cyber.2015.0002.
- Marder, Ben, Adam Joinson, Avi Shankar, and David Houghton. “The extended ‘chilling’ effect of Facebook: The cold reality of ubiquitous social networking.” Computers in Human Behavior 60 (2016): 582-592.
- Martin Flintham, Christian Karner, Khaled Bachour, Helen Creswick, Neha Gupta, and Stuart Moran. 2018. Falling for Fake News: Investigating the Consumption of News via Social Media. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ‘18). ACM, New York, NY, USA, Paper 376, 10 pages. DOI: https://doi.org/10.1145/3173574.3173950.
- Meza, Radu Mihai, and Șerban Nicolae Meza. “A Triadic Formal Concept Analysis Approach to Analyzing Online Hate Speech in Facebook Comments.” BRAIN. Broad Research in Artificial Intelligence and Neuroscience 10, no. 1 (2019): 73-81.
- Newton, Casey “WhatsApp co-founder tells everyone to delete Facebook” The Verge (2018).
- Poulsen, Kevin “‘Beyond Sketchy’: Facebook Demanding Some New Users’ Email Passwords” The Daily Beast (2019).
- Rosenberg, Matthew, Nicholas Confessore, and Carole Cadwalladr. “How Trump consultants exploited the Facebook data of millions.” The New York Times (2018).
- Schechner, Sam, and Secada, Mark “You Give Apps Sensitive Personal Information. Then They Tell Facebook.” The Wall Street Journal (2019).
- Settanni, M., & Marengo, D. (2015). Sharing feelings online: studying emotional well-being via automated text analysis of Facebook posts. Frontiers in psychology, 6, 1045.
- Smith, Aaron. “Many Facebook users don’t understand how the site’s news feed works.” Pew Research Center (2018). https://www.pewresearch.org/fact-tank/2018/09/05/many-facebook-users-dont-understand-how-the-sites-news-feed-works/
- Vincent, James “Former Facebook exec says social media is ripping apart society” The Verge (2017).
- Walker, M., Thornton, L., De Choudhury, M., Teevan, J., Bulik, C. M., Levinson, C. A., & Zerwas, S. (2015). Facebook use and disordered eating in college-aged women. Journal of Adolescent Health, 57(2), 157-163.
- Webb, Kevin “A teen in Malaysia reportedly killed herself after posting a poll that asked her Instagram followers to help her choose life or death” Business Insider (2019).
- Zimmer, Michael. “‘But the Data Is Already Public’: On the Ethics of Research in Facebook.” Ethics & Information Technology 12, no. 4 (2010): 313-325. https://doi.org/10.1007/s10676-010-9227-5.