A landmark legal case unfolding in Los Angeles is scrutinizing the potentially addictive nature of social media platforms. K.G.M., a 20-year-old woman, is suing Meta (Instagram and Facebook), Google (YouTube), Snap Inc. (Snapchat) and TikTok, alleging that the design of these platforms intentionally fostered addiction, contributing to her mental health struggles. The case, considered a bellwether for hundreds of similar lawsuits across the United States, centers on whether social media companies should be held liable for the psychological harm experienced by users.
K.G.M. began using social media at a young age – YouTube at six, Instagram at nine, Musical.ly at ten, and Snapchat at eleven – and claims features like infinite scrolling and algorithmic content recommendations led to anxiety, body dysmorphia, and depression. Her lawsuit alleges that the companies created “traps” for young users, prioritizing engagement over well-being. This trial is testing the legal boundaries of platform responsibility in the digital age, potentially reshaping how social media companies design and market their products.
Snapchat and TikTok Settle, Meta and Google Prepare to Defend
Prior to the start of jury selection, Snapchat and TikTok reached settlements with K.G.M., the terms of which remain confidential. Meta and Google, however, are proceeding to trial, vigorously defending their platforms. Meta CEO Mark Zuckerberg is scheduled to testify this week, according to reports. The outcome of this case could have significant financial and regulatory implications for the tech industry, drawing parallels to past litigation against tobacco companies, where firms ultimately paid billions in health-related costs and restricted advertising.
The core of K.G.M.’s argument rests on the assertion that the platforms’ design choices actively encouraged compulsive behavior. She testified that Instagram filters contributed to body image issues, and that algorithms provided harmful advice, such as recommending a diet of only one cucumber per day for weight loss. Her attorney, Joseph VanZandt, emphasized that the pervasive use of social media fundamentally altered the course of her childhood.
Section 230 and the Question of Platform Liability
A key legal hurdle in the case revolves around Section 230 of the Communications Decency Act, a law that generally shields social media platforms from liability for content posted by their users. Section 230 has been instrumental in the growth of the internet, allowing platforms to operate without being held responsible for the actions of individuals using their services. However, Judge Carolyn Kuhl ruled that Section 230 does not protect companies from liability stemming from the design of their features, stating that it doesn’t preclude accountability for harm resulting from those design choices. This ruling is a significant departure from previous interpretations of the law.
Meta argues that K.G.M. faced pre-existing challenges and that her mental health issues were not solely caused by social media use. The company contends that evidence will demonstrate she experienced significant difficulties prior to using these platforms. Instagram chief Adam Mosseri testified, according to The New York Times, that social media platforms do not make users “clinically addicted,” but can be habit-forming, similar to watching a television series. Meta also points to measures implemented to protect young users, such as age-appropriate accounts and parental controls.
European Scrutiny and Global Concerns
The legal battle in the U.S. mirrors growing concerns internationally. The European Union is currently investigating TikTok for potentially addictive mechanisms, including personalized recommendations and autoplay features. Preliminary findings suggest the platform violates EU law by constantly “rewarding” users with new content, encouraging endless scrolling, as reported by the BBC. Australia has implemented a strict ban on social media for individuals under the age of 16, effective mid-December.
YouTube’s legal team has attempted to distance the platform from the “social media” label, arguing it functions more as a video streaming service akin to Disney+ or Netflix. This distinction is a strategic move to potentially avoid the same level of scrutiny as platforms focused on social interaction and user-generated content.
The outcome of the K.G.M. v. Meta et al. case will likely set a precedent for future litigation against social media companies. Even if K.G.M. doesn’t prevail, the trial has already brought significant attention to the potential harms of social media and the responsibility of platforms to protect their users. The legal landscape surrounding social media is rapidly evolving, and further regulatory action and legal challenges are anticipated as policymakers and advocates grapple with the complex issues of addiction, mental health, and platform accountability.