
Landmark trial puts social media addiction theory to the test

By Maria Dinzeo

January 29, 2026, 20:57 GMT
The theory that tech giants intentionally targeted and manipulated young minds for profit will soon be put to the test, with the first in a series of trials set to begin next week in a downtown Los Angeles courtroom.
For years, lawsuits against the major social media platforms have hit the same wall — a federal law that shields them from liability for what their users post. 

But a new legal playbook has taken shape, one that shifts the focus away from content and toward what plaintiffs say are deliberate design choices made by Big Tech to get kids hooked on their products, despite internal warnings about the risks. 

The companies have long maintained that such claims rest on third-party content protected by Section 230 of the Communications Decency Act. The trials will instead turn on the platforms’ design choices, and whether those choices can support negligence claims tied to foreseeable harms.

Jurors will begin grappling with that question next week, when the first of those trials opens in downtown Los Angeles.

At the center is a young woman known as K.G.M., who was six when she began using YouTube, eight when she started uploading videos, and nine when she joined Instagram, according to court documents. Snapchat followed at 11 and TikTok at 14.

By age 10, she was checking social media from the moment she woke up — while eating meals, in the bathroom, and late into the night. In high school, she would ask to be excused from class to use the restroom, where she would scroll through Instagram and Snapchat. Her mother observed that her daughter seemed unable to live without her phone, and any effort to curb her use would trigger a confrontation, or, as her sister put it, “a meltdown as if someone had died.”

K.G.M. says she realized by the age of nine or 10 that she was addicted to social media. “I was on it all the time when I didn’t want to be, and I would freak out if my mom tried to take my phone away,” she testified.

The incessant social media use coincided with a cascade of mental health problems. Her anxiety spiked and she developed body dysmorphia. She became obsessed with grooming and hygiene, as she was served a steady diet of posts featuring people “way skinnier” and “way prettier.” Healthy food also became a fixation, along with “what I eat in a day” videos that encouraged eating as little as possible. Cutting videos continually surfaced on her feed, despite her attempts to block that sort of content, and she struggled through suicide attempts in 2019 and 2020. She was also bullied relentlessly online and at school, to the point where a vice principal recommended that she “remove herself from social media to decrease frequency of bullying by peers.”

Now 20 years old, K.G.M. is taking the platforms to court, where her lawyers intend to show that social media’s manipulative algorithms and design features were the wellspring of these harms. “It’s impossible to tell K.G.M.’s story without mentioning social media,” said her lawyer Joseph VanZandt.

— The plaintiff —

K.G.M.’s trial is the first in a succession of bellwethers involving personal injury plaintiffs who accuse Meta Platforms, YouTube, Snap and TikTok of developing products designed to maximize engagement while negligently disregarding foreseeable psychological harms to kids.

Snap and TikTok reached confidential settlements with K.G.M. just before jury selection got underway this week, leaving Meta and YouTube as the sole remaining defendants.

“Two down, two to go!” lead plaintiff counsel Mark Lanier told MLex in an email response to a request for comment on the TikTok agreement.

Eight other jury trials have been scheduled for later this year in Los Angeles, as well as a set of three federal jury trials set for this summer in Oakland. TikTok and Snap remain defendants in those cases, along with thousands of other suits brought by state attorneys general, school districts and individuals.

The claims mirror state lawsuits brought against tobacco companies in the 1990s, which alleged that the industry targeted young people, concealed known health risks, and fueled public health problems that burdened the healthcare system.

In November, Los Angeles Superior Court Judge Carolyn B. Kuhl rejected arguments that the companies are shielded by the First Amendment and Section 230 of the Communications Decency Act, finding that features like endless scroll, autoplay, likes and comments, beauty filters and push notifications are not content-based, but design choices that must be weighed by a jury.

Meta Platforms, Google, Snap and TikTok have argued that social media isn’t any more to blame for K.G.M.’s mental health struggles than the brutal social dynamics that marred her high school experience. K.G.M. also witnessed domestic violence in her home at the age of four, according to court documents, an early trauma that could have made her more susceptible to certain social media content.

Lawyers have also sparred in court over whether jurors should be exposed to graphic testimony about sexually suggestive gossip that circulated online about K.G.M.

Kuhl recently ruled that the platforms are entitled to present evidence of alternative causes of K.G.M.’s emotional distress and suicidal ideation, particularly one event involving a rumor that she had engaged in an embarrassing sexual act at school.

Those evidentiary fights underscore what may prove decisive at trial: whether compulsive use of Instagram, YouTube and TikTok caused the anxiety, sleep loss and emotional distress plaintiffs allege, or whether jurors will instead attribute those harms to familiar adolescent conflicts and turmoil at home.

— The evidence —

The six-week trial, scheduled to open on Feb. 3, will feature a bevy of blockbuster evidence, including testimony from Meta CEO Mark Zuckerberg and Instagram CEO Adam Mosseri.

Recently unsealed internal documents and sworn testimony from corporate insiders could be especially damaging as the plaintiffs look to prove that the companies knew for years about the addictive properties of their platforms, yet remained hyper-focused on maximizing engagement.

A 2018 report commissioned by YouTube warned that short-form videos deliver a “quick fix” of dopamine, and cautioned that overexposure could erode attention spans, disrupt sleep and increase social isolation.

Another internal document, this one from Facebook, details the company’s strategy for dealing with what it calls the “well-being problem.” The 2017 document says people tend to use Facebook more than they would ideally want and can’t easily stop, and that Facebook features enable “mindless usage.” The document pushes back against the “narrative” that Facebook is bad for well-being, recasting it as a situational and user-specific problem that can be mitigated through targeted product tweaks rather than fundamental changes to its engagement model.

A former Meta executive turned whistleblower is also expected to testify that the company had long-standing internal evidence showing its platforms could cause serious harm to children, including bullying, mental health deterioration and suicide risk. Arturo Bejar, who spent six years working on safety and integrity at Facebook and Instagram, said in unsealed deposition testimony that Meta conducted internal research documenting those risks and that the findings were known at the highest levels of the company.

Bejar also testified that he spoke with parents who believed Instagram had contributed to their children’s deaths, and that in some cases the warning signs were visible on the platform and “eminently preventable.”

— The defense —

Meta and Google have forcefully rejected the claim that they prioritize profit over kids’ safety. On the contrary, they say they’ve deliberately developed product features that limit the amount of time children spend online, to the detriment of their bottom lines.

Meta says it has spent $30 billion on safety and security in recent years and currently has a team of 40,000 employees working on safety issues.

The companies also dispute that their products were the cause of plaintiffs’ injuries, pointing to research that says there’s no direct link between social media use and mental health harms.

A National Academies consensus report found that social media has the potential to both harm and benefit adolescent mental health, but noted the lack of robust evidence linking social media use to mental health outcomes and the difficulty of measuring that link. The committee concluded that there isn’t enough evidence to say social media causes changes in adolescent health at the population level, and it declined to recommend specific limits on teens’ social media access.

Research published in the Journal of Public Health tracking more than 25,000 UK adolescents over three years found no evidence that greater time spent on social media leads to higher levels of anxiety or depression.

Studies of this kind are expected to play a central role in Big Tech’s defense, alongside insider testimony that the companies implemented parental controls, time limits and safety measures based on their own internal research into teen usage. They’ll also highlight the benefits of social media, especially the opportunity to find supportive communities, cultivate hobbies and build connections with peers.

— The impact —

Still, the stakes could not be higher. Meta’s quarterly securities filing last year notes the potential for damages in the “high tens of billions of dollars” should the jury find in favor of the plaintiffs. In addition to staggering financial losses, the platforms could be forced to change how they operate, including restricting features that drive compulsive use among kids.

This could significantly undermine their reach. YouTube currently ranks highest in popularity among teens in a Pew Research survey, in which roughly one in five teens said they use the platform “almost constantly.” Instagram wasn’t quite as popular, but 12 percent of teens described their use as almost constant. YouTube and Snapchat also accounted for nearly half the time eight- to 14-year-olds in the UK spent online in 2025, according to an annual report by Ofcom.

Meta has warned investors in its quarterly securities filings that the trials “may ultimately result in a material loss” to the company, a point Meta Chief Financial Officer Susan Li reiterated on an earnings call Tuesday.

But the biggest risk may be that the jury ultimately diagnoses social media itself as the primary driver of a widespread youth mental health crisis.

Please e-mail editors@mlex.com to contact the editorial staff regarding this story, or to submit the names of lawyers and advisers.
