(Bloomberg/Madlin Mekelburg and Maia Spoto) — Meta Platforms Inc., ByteDance Ltd., Alphabet Inc. and Snap Inc. must face trial over claims that they designed social media platforms to addict youths, a judge ruled, clearing the way for the first of thousands of cases to be presented to juries.
Los Angeles Superior Court Judge Carolyn B. Kuhl late Wednesday ruled against the companies on their last chance to avoid trial. Kuhl trimmed a negligence allegation from one case but allowed other claims to proceed after lawyers spent years poring over evidence and dueling over legal theories.
The flood of lawsuits that started about three years ago targets Meta’s Instagram and Facebook, ByteDance’s TikTok, Alphabet’s YouTube and Snap’s Snapchat. A trio of trials starting in January will mark the first time that platform users testify in court about their addiction and suffering.
Thousands of cases have been filed by individuals, school districts and state attorneys general. One group of suits is being supervised by Kuhl. Another is pending in federal court in Oakland, California. If the companies ultimately lose, they could cumulatively face billions of dollars in damages and be forced to change how children use the platforms.
Lawyers for the social media companies argued during a hearing before Kuhl in October that plaintiffs had failed to present enough evidence showing that the designs of each platform – including algorithms that curate content, endless scrolling and personalized notifications – directly caused a litany of harms alleged in the suits.
Young users claim excessive screen time has led them to suffer depression, anxiety, insomnia and eating disorders, while others have engaged in self-harm and even died by suicide.
“We strongly disagree with these allegations and are confident the evidence will show our longstanding commitment to supporting young people,” a spokesperson for Meta said in a statement. “For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most.”
José Castaneda, a Google spokesperson, said his company has developed safeguards to give families more control over platform use.
“These lawsuits fundamentally misunderstand how YouTube works and the allegations are simply not true,” Castaneda said in a statement Thursday. “YouTube is a streaming service where people come to watch everything from live sports, to podcasts to their favorite creators, primarily on TV screens, not a social network where people go to catch up with friends.”
Lawyers for Snap at law firm Kirkland & Ellis said they look forward to explaining at trial why the allegations against the company are wrong.
“Snapchat was designed differently from traditional social media; it opens to the camera, allowing Snapchatters to connect with family and friends in an environment that prioritizes their safety and privacy,” the lawyers said in a statement.
A representative of TikTok didn’t immediately respond to a request for comment.
“These rulings affirm that tech companies must face accountability for the design choices they make — choices that can profoundly affect the mental health of young users,” a group of lawyers representing the youths and families who are suing said in a statement. “We are grateful that the court recognized the importance of letting a jury decide whether these platforms caused harm to these plaintiffs.”
A jury is scheduled to be chosen in January for the first so-called bellwether trial, an opportunity for each side to test the strengths of its arguments and gauge how other cases featuring similar claims might play out. The outcomes of these early trials sometimes spur settlement talks.
The first trial will feature a 19-year-old California woman who says the sites’ designs led to her addiction and caused anxiety, depression and body dysmorphia.
The trial is scheduled to begin on Jan. 27. Meta Chief Executive Officer Mark Zuckerberg, Instagram head Adam Mosseri and Snap CEO Evan Spiegel are expected to testify.
The early trials will test the limits of Section 230 of the Communications Decency Act, a federal law that has shielded some social media platforms from past lawsuits over user harms.
Lawyers for the companies have argued that they are not responsible for third-party content posted on their apps that may harm users. In her decision in one of the bellwether cases, Kuhl said it would be appropriate for a jury to weigh whether design features like “infinite scroll” contribute to the problem.
“The fact that a design feature like ‘infinite scroll’ led a user to harmful content does not mean that there can be no liability for harm arising from the design feature itself,” Kuhl wrote.
The case is Social Media Cases JCCP, 5255, California Superior Court (Los Angeles).