By Claire Lareau
On May 23, 2023, U.S. Surgeon General Dr. Vivek Murthy called for “urgent action on a pressing danger for America’s youth.” He was not talking about vaping, alcohol, or fentanyl. He was warning the nation about social media. Dr. Murthy stated, “We are in the middle of a national youth mental health crisis, and I am concerned that social media is an important driver of that crisis—one that we must urgently address.”
The mental health crisis Dr. Murthy warns of affects more than 1 in 10 of America’s youth. According to Mental Health America, over 16% of youth experienced at least one major depressive episode in the past year, and 11.5% suffer from severe depression. Suicide has become the second-leading cause of death among children ages 10-14. According to CTRLCare, a mental health treatment center in New Jersey, 90% of teens use social media, and excessive use can lead to both short-term and long-term problems.
In March 2021, Meta CEO Mark Zuckerberg defended his company against charges of contributing to the mental health crisis in testimony before Congress at a hearing on children and mental health. He stated, “The research that we’ve seen is that using social apps to connect with other people can have positive mental health benefits.” Two months later, in May 2021, a recently resigned Facebook employee, Frances Haugen, began meeting with lawyers to release 20,000 internal Facebook documents that suggested the contrary. One of these documents, labeled “Teen Mental Health Deep Dive,” displayed a colorful bar chart showing that Instagram had made over one-third of teen girls feel worse about their bodies. Of the documents, Haugen said, “The most shocking disclosure was the extent to which Facebook knew its products, Instagram in particular, were harming our children and chose to do nothing about it.”
While Facebook and other social media giants may be contributing to the crisis, they did not set out to make teens feel worse about themselves. How did apps built to connect people end up alienating so much of America’s youth? The problem is the algorithms. Social media companies have knowingly evolved their underlying algorithms in ways that addict young users to their products.
Algorithms were introduced to social media in 2009 through Facebook’s program EdgeRank. EdgeRank sorted each user’s feed by a simple formula: Rank = Affinity × Weight × Decay. Affinity measured how often a user interacted with a page. Weight ranked the type of post on a hierarchy, with photos and videos scoring highest. Decay measured how old a post was. This three-part system gave users a cleaner, more personalized feed: it filtered out the “low quality,” less relevant content and gave each user a unique experience. Facebook quickly outgrew EdgeRank, replacing it with an evolving set of algorithms that remain in place and are regularly revised today. These algorithms utilize a complex profile of each user to manipulate, engage, and entice them.
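The idea is simple enough to sketch in a few lines of Python. The post-type weights and the 24-hour half-life below are invented for illustration; Facebook never published its actual constants.

```python
# EdgeRank-style scorer. The weights and half-life are illustrative assumptions.
TYPE_WEIGHTS = {"photo": 3.0, "video": 3.0, "link": 2.0, "status": 1.0}

def edgerank_score(affinity, post_type, age_hours, half_life_hours=24.0):
    """Score one post as affinity x weight x decay."""
    weight = TYPE_WEIGHTS.get(post_type, 1.0)      # post-type hierarchy
    decay = 0.5 ** (age_hours / half_life_hours)   # older posts fade out
    return affinity * weight * decay

# A tiny feed: a fresh photo from a close friend outranks everything else.
feed = [
    {"from": "close friend", "affinity": 0.9, "type": "photo",  "age_hours": 2},
    {"from": "acquaintance", "affinity": 0.2, "type": "status", "age_hours": 1},
    {"from": "close friend", "affinity": 0.9, "type": "link",   "age_hours": 30},
]
feed.sort(key=lambda p: edgerank_score(p["affinity"], p["type"], p["age_hours"]),
          reverse=True)
for post in feed:
    score = edgerank_score(post["affinity"], post["type"], post["age_hours"])
    print(f'{post["from"]}: {post["type"]} ({score:.2f})')
```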
This algorithm-driven engagement is core to social media companies’ business model: they profit from advertisements. However, these advertisements are not as harmless as those on billboards or television. A sophisticated set of algorithms targets ads to users based on their searches, likes, and clicks. Microsoft’s Jaron Lanier, often called the “father of virtual reality” for his pioneering VR work, is one of the fiercest critics of social media’s targeted advertising. In a 2018 interview with the UK’s Channel 4 News, Lanier said, “When you watch the television, the television isn’t watching you. When you see the billboard, the billboard isn’t seeing you.” Lanier continued, “Society has been gradually darkened by this scheme in which everyone is under surveillance all the time, and everyone is under this mild version of behavior modification all the time. It’s made people jittery and cranky. It’s made teens especially depressed.”
Bo Burnham, comedian, actor, writer, and director of the 2018 coming-of-age movie Eighth Grade, said in a 2022 technology discussion with the Child Mind Institute, “It’s not that [teenagers] think the world in their phone is real. It is that they think the world in the real world is virtual.” He commented that younger generations do not just live moments; they plan moments to look back on. Media companies use their algorithms to push users’ real world further into the virtual one. The more people live through social media, the more money these companies make.
After seeing Haugen’s whistleblower documents, South Carolina law firm Motley Rice began representing plaintiffs in litigation against Meta and other social media giants, including TikTok, Google, YouTube, and Snapchat. The central claim is that these platforms knowingly and willingly allowed their products to addict children, leading to severe mental health harms.
Motley Rice associate attorney Jessica Carroll works closely with clients to give them a voice against social media companies. In an interview with The Match, Carroll explained that many of her clients were Division I athletes who turned to social media for workout routines and recipe ideas. Social media algorithms laid a click path that started with these harmless searches and led to content promoting restrictive eating. These young, formerly healthy athletes developed eating disorders, battled depression, and, in some cases, died by suicide.
Social media is not trying to lead youth down that dark path. Carroll says, “It’s not that the negative is more salacious or is promoted more than the positive, but the way that people naturally respond to negative content is different than positive.” Human nature has a way of elevating harmful content, and companies know that. The algorithms are judgment-free robots responding to user clicks in a way that increases the odds that the user will continue to click. Unfortunately, the click patterns that engage users are too often dangerous.
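Carroll’s point can be made concrete with a toy simulation. The topics and click probabilities below are invented, but they capture the mechanism she describes: a recommender that reinforces whatever gets clicked, with no notion of harm, drifts toward whichever content is even slightly “stickier.”

```python
import random

# Toy feedback loop: the recommender tracks only past clicks per topic --
# it has no notion of which content is healthy and which is harmful.
click_counts = {"recipes": 1, "workouts": 1, "extreme dieting": 1}

# Assumed user behavior: negative content draws slightly stronger reactions.
click_probability = {"recipes": 0.50, "workouts": 0.50, "extreme dieting": 0.60}

def recommend():
    # Sample a topic in proportion to past clicks: pure engagement optimization.
    topics = list(click_counts)
    weights = [click_counts[t] for t in topics]
    return random.choices(topics, weights=weights)[0]

random.seed(0)
for _ in range(2000):
    topic = recommend()
    if random.random() < click_probability[topic]:
        click_counts[topic] += 1  # each click makes the topic more likely next time

print(click_counts)  # the slightly stickier topic tends to crowd out the others
```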
As Carroll and her coworkers have expanded their litigation, they have been blown away by the extent of the harm that companies were aware of. She says, “The more we’ve gotten into it, the more we’ve come to understand how much they knew—and that’s just prediscovery.”
Social media’s current business model relies on knowing its users and manipulating them into using the products even more. When asked why social media companies do not change their products, even knowing how dangerous they are, Carroll said, “To restrict their algorithm in any way means to reduce engagement. Engagement is their term for addiction. They hired people who designed casino spaces, where everything is curated, to get people to stay longer and do more. It’s very bright, they pump in fresh oxygen, and they give you free drinks. The whole place is curated to get you to stay there.”
Carroll emphasizes that media companies must rein in these algorithms because the general public does not understand the danger. One of Carroll’s most heartbreaking cases involved the 15-year-old daughter of a math teacher. The girl’s mother helped her create an Instagram account and followed her, thinking she could protect her. She “thought that was enough.” After battling an eating disorder, the girl died by suicide. Carroll says, “These companies experimented on a generation of young people, and we are just now seeing the long-term effects of that.”
The U.S. government has begun attempting to address the epidemic of harmful media content. U.S. Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) introduced the Kids Online Safety Act (KOSA) in February 2022 and reintroduced it in May 2023. KOSA would provide parents with tools to protect their children on social media, create accountability for media’s harm to kids, and open up critical datasets on algorithms to academic researchers. The bill would prevent companies from using even basic information, like a user’s location, to curate the user’s feed; it has gained support from over 200 technology-advocacy organizations, according to the American Psychological Association.
U.S. attempts to regulate social media are weak compared to those of the European Union (EU). In 2022, the EU passed the Digital Services Act, a single set of rules regulating social media across the entire EU. The act works both to create a safer digital environment and to establish “a level playing field to foster innovation, growth, and competitiveness.” It requires social media companies to be transparent with their statistics: any company with more than 45 million active EU users must provide an annual risk assessment.
Europe is ahead of America in restricting social media’s harm to youth, and, correspondingly, Google and other social giants earn significantly less money from European users than from American ones. One reason European regulation is more advanced is cultural: in America, users more willingly trade privacy for convenience. Upper School technology coach Rachael Rachau states, “That’s a trade many of us are willing to make—myself included.”
Rachau emphasized that as social media companies grow, humans become less and less involved in the creation process. With the growing popularity of artificial intelligence (AI), algorithms are quickly diverging from human insight. Where humans might see a feature as unethical, the robots cannot make that distinction. Rachau comments, “I live in this world right now, where I am fascinated and obsessed with what artificial intelligence is going to do for us… and what it is going to do to us.”
Rachau suggests that to stay safe on social media, students should first turn off location tracking. She shares that companies use this location information to appeal to users: “Snapchat knows where you live, where you go to school, and where you like to shop. It can make all kinds of assumptions based on that.” Platforms use this information to personalize ads and further entice users. Second, students should find ways to limit their screen time. For Rachau, that means setting a limit and sticking to it. “It’s easy to confuse knowing yourself with lying to yourself,” she says. Students often set limits but ignore them, so she recommends giving the time-limit passwords to someone else. With limited and intentional usage, people can use social media for good. Instead of mindless scrolling, Rachau challenges students to think about how they are using a platform to facilitate change in the world.
Like Rachau, Elizabeth Seward (‘25) works to manage her screen time; she uses the app Opal: Screen Time for Focus. She found that traditional screen time limits did not work for her and noticed herself wasting valuable time on social media platforms. She says, “I was addicted to TikTok. I could not even finish my homework, and I realized it was time to make a change.” Opal blocks apps for a set time each day and limits the time she can spend on each app, so she cannot scroll mindlessly for hours. She recommends that other students download Opal or a similar app to help with self-control.
While some students work to battle their screen addiction, others admit to being addicted without attempting to fix it. Madison Lewis (‘25) comments, “I am addicted. It’s hard to get off my phone when I need to do my work. I scroll and scroll for hours, and I usually don’t realize or don’t care how much time I am wasting.” Lewis also admits that her feed often manipulates her into staying on the app: “Today on TikTok, for example, I liked one video of Christmas list inspiration. Within five minutes, I had already seen three more Christmas list inspiration videos, because TikTok knew that was what I wanted to see. I guess that’s a harmless example, but it’s scary how quickly the algorithm works and how tirelessly it works to keep me scrolling.”
Lewis continues, “I have had platforms like Snapchat since I was ten years old. They keep me connected with peers and entertained, but there are some downsides to using them. For example, my highest Snapstreak is 1,382. That means I haven’t gone a day without Snapchat in almost four years. That’s kinda crazy.” When asked why she checks her platforms daily, Lewis said, “It is really a necessity. If you aren’t on Snapchat, people will wonder what is wrong with you. There have been times when I have taken a nap in the middle of the day, and I will have had friends calling me worried when they realize how long it has been since I have texted them.” Students seem to rely on one another, a reliance that manifests in constant communication and constant awareness of each other’s lives. Lewis says, “I basically know what my friends are doing all the time. That can be hard, because sometimes you don’t want to know they’re out having fun without you.” Snap Map and other social media features have caused Lewis and others to feel this fear of missing out (FOMO).
My father, Mark Lareau, says, “My algorithm on X reflects the things I am interested in. Unlike TV, where you have to jump between channels to find one that interests you, my algorithm creates a channel that is customized for me.” While tailored content is convenient, that convenience is precisely what keeps my father scrolling, and precisely what social media companies want from their product.
My sister, Eva Lareau (‘23), a first-year student at the University of Virginia, has noticed unhealthy patterns in her algorithm. She says, “You have to be aware of what social media is showing you. I’ve had moments where I look for oatmeal recipes, and within twenty minutes, my feed has gone from oatmeal to thinspo (thin inspiration), where girls are flaunting that they’ve only eaten 300 calories all day. I’m pretty aware of when this is happening, and I just like other videos to get out of it, but imagine a ten-year-old looking for oatmeal recipes getting trapped in a pit of dangerous content. There are pretty terrifying aspects to the way social media works.”
Although teens can attempt to regulate their own usage, a more structural change is needed to govern how social media companies use algorithms. Lanier emphasized in his Channel 4 News interview that there are many alternative models for social media that do not depend on such high levels of addiction and harm: apps could be paid services, like Netflix, or publicly funded, like a library. He says, “Once you can see that there are alternatives, you realize how strange it is, and how unsustainable [the current model] is.” Unfortunately, social media companies are so wedded to a revenue-rich model of addiction that they will not change on their own. So long as their models are built on engagement-maximizing algorithms, social media apps will be addictive. As with every other addictive substance on Surgeon General Vivek Murthy’s priority list, the time for action is now.
Featured image credit: NARA & DVIDS PUBLIC DOMAIN ARCHIVE.