Blackview A60 Android Go smartphone in a dedicated folio case with magnetic clip.
Photo: Acabashi | CC BY-SA 4.0 | Wikimedia Commons

Back‑to‑Back Verdicts Find Meta and YouTube Negligent for Youth Harms, Raising Questions About Algorithmic Promotion of Harmful and Hateful Content

Two back-to-back jury verdicts this week found major platforms liable for youth harms. In New Mexico, a jury concluded Meta violated state consumer-protection law and ordered the company to pay $375 million after finding it misled users and "unconscionably" endangered children, relying in part on a state undercover probe. In Los Angeles, a separate jury held Meta and YouTube negligent under a product-design theory, awarding $6 million to a plaintiff who said algorithms and platform features addicted her as a teen. Both cases centered on engagement-driven algorithms, infinite scroll, and other design features aimed at keeping users hooked, a legal strategy intended to sidestep Section 230. Observers warn the rulings could spur appeals, regulatory changes such as age verification and content-safety mandates, and closer scrutiny of algorithmic promotion of harmful and hateful content.

Social Media and Child Safety | Courts and Consumer Protection | Meta and Social Media Liability | Child Online Safety and Consumer Protection | Meta and Child Online Safety

📌 Key Facts

  • A New Mexico jury found Meta violated state consumer‑protection law—ruling the company misled consumers, engaged in 'unconscionable' trade practices, and harmed children’s mental health and safety—and ordered Meta to pay $375 million in civil penalties tied to thousands of violations; the case relied in part on a 2023 state undercover investigation in which agents posed as minors to document sexual solicitations and platform responses.
  • A Los Angeles jury separately found Meta and YouTube/Google negligent in the design of their platforms, concluding engagement‑driven algorithms and features substantially contributed to a plaintiff’s depression and anxiety; the jury awarded $6 million ($3 million compensatory and $3 million punitive), apportioning roughly 70% of liability to Meta and 30% to YouTube, with a 10–2 split on key questions of negligence and causation.
  • Plaintiffs framed the suits as product‑liability/design‑defect cases—targeting algorithms, infinite scroll, beauty filters and other design features to sidestep Section 230—an approach experts call a potential watershed that could open the door to thousands of similar lawsuits against social‑media and AI firms.
  • Meta and Google say they will appeal (Meta’s legal team vowed to 'aggressively' pursue appeals), dispute that the platforms are to blame for the youth mental‑health crisis, and defend their safety efforts; Meta publicly stated confidence in its teen‑safety record, and its stock rose modestly after the New Mexico verdict, suggesting investors largely shrugged off the penalty.
  • The rulings raise broader legal and policy questions—potentially challenging platforms’ reliance on the First Amendment and Section 230 and prompting demands for design changes such as stronger age verification, more aggressive removal of bad actors, and legislative action (including calls to pass the Kids Online Safety Act); New Mexico’s attorney general is also pursuing a second phase seeking public‑program remedies and a public‑nuisance finding.
  • Some companies settled related litigation before trial (Snap and TikTok quietly settled in the Los Angeles matter), while the larger litigation landscape includes more than 40 state attorney‑general suits, parallel federal cases, lawsuits by school districts and local governments, thousands of family suits, and separate wrongful‑death and AI‑related claims.
  • Public‑health figures and advocates reacted by likening certain platform designs to addictive products (with former U.S. Surgeon General Jerome Adams comparing them to cigarettes), urging stricter regulation (including age limits and classroom restrictions), and warning that AI‑generated content may accelerate the spread of hateful content that is already being monetized and evasively circulated.
  • Trial evidence and testimony emphasized internal research and the undercover probe to show companies knew about harms and misled the public; Meta executives rejected the term 'social‑media addiction' at trial but acknowledged 'problematic use,' while plaintiffs argued the platforms prioritized engagement and profits over children’s safety.

📊 Analysis & Commentary (3)

Social Media Hurts Kids’ Brains. Or Maybe Not?
City Journal by Robert VerBruggen March 30, 2026

"The City Journal essay questions headline claims that social media 'hurts kids’ brains,' arguing the science is weaker than portrayed, that recent legal pushback against platforms is driven partly by moral panic, and that more measured, evidence‑based policy — not sweeping liability or censorship — is the right response."

AIs Are Dumb and Sexist
Stevestewartwilliams by Steve Stewart-Williams March 31, 2026

"An opinion critique arguing that contemporary AI systems are both intellectually shallow and prone to reproducing sexist biases, using recent liability verdicts against major platforms as evidence that algorithmic design and commercial incentives—not just individual bad actors—drive harms that require legal and regulatory remedies."

The Courts Should Treat Social Media Like Cigarettes
The Wall Street Journal by Mark Weinstein April 01, 2026

"The WSJ opinion uses two recent jury verdicts against Meta and YouTube to argue courts should treat social media like cigarettes—holding platforms liable for addictive, harmful design and pushing for legal and legislative remedies to protect minors."

📰 Source Timeline (17)

Follow how coverage of this story developed over time

March 29, 2026
11:57 PM
Ruling against Meta and Google could set the stage for changes in handling hate content
Fox News
New information:
  • Fox article emphasizes that the Los Angeles jury case against Meta and Google/YouTube was structured to sidestep Section 230 by targeting product design and engagement‑driven algorithms rather than user‑generated content itself.
  • StopAntisemitism founder Liora Rez calls the ruling 'monumental' and argues that platforms are 'specifically designing systems that actively spread and, most importantly, monetize and incentivize' hateful content, including antisemitism.
  • The piece highlights how influencers and hate actors use code words (e.g., saying 'unalive' instead of 'kill') to circumvent moderation and notes that advocacy groups with large youth followings are often early to detect these evasions and alert platforms.
  • Rez warns that AI‑generated content is already 'helping to feed antisemitic content across the platforms' with 'very little, if any, oversight,' framing AI‑driven hate dissemination as the next major battleground.
7:33 PM
Ex-Surgeon General says social media needs to be regulated "similar to cigarettes"
https://www.facebook.com/FaceTheNation/
New information:
  • Former U.S. Surgeon General Jerome Adams said on CBS' 'Face the Nation' that social media platforms are 'specifically designed to addict' adolescents and teenagers 'similar to cigarettes' and should be regulated accordingly.
  • Adams explicitly linked current social media use in youth to increased anxiety, depression, reduced sleep, mental health problems, and obesity, citing Surgeon General Vivek Murthy’s prior advisory.
  • Adams cited Australia's under‑16 social media ban as a model and said more U.S. states should restrict social media and cell phone use in classrooms, noting about 25 states are discussing or have such legislation.
  • The piece ties Adams's comments directly to the recent jury decisions against Meta and YouTube, framing those lawsuits as evidence that platforms were allegedly designed to addict children, akin to historic cigarette marketing.
March 27, 2026
3:10 PM
Meta vows to 'aggressively' fight after landmark verdicts find tech giant liable for addicting kids
Fox News
New information:
  • Meta Chief Legal Officer C.J. Mahoney says the company will ‘aggressively’ pursue appeals of both the California and New Mexico verdicts and calls them ‘vulnerable on appeal.’
  • Mahoney publicly rejects the premise that Meta is responsible for the teen mental‑health crisis, arguing that responsibility also lies with parents and schools and that blaming tech alone ‘simplifies the problem.’
  • Plaintiffs’ lead attorney Mark Lanier, speaking on Fox, characterizes the California verdict as a ‘major victory’ and says companies must be held accountable for ‘purposefully addicting children’ to enrich themselves.
  • The Fox piece reiterates that TikTok and Snap settled out of the California case before it went to the jury.
3:38 AM
Woman whose son died from drugs bought on social media celebrates verdicts against Meta, YouTube
ABC News
New information:
  • Identifies Colorado mother Kimberly Osterman, whose 18‑year‑old son Max died in 2021 after buying a fentanyl‑laced pill via Snapchat, and notes she has filed a separate wrongful‑death lawsuit (not part of the LA or New Mexico cases).
  • Confirms the dealer, Sergio Guerra‑Carrillo, was sentenced in 2023 to six years in prison on two distribution charges for the pill that killed Max.
  • Reports that Snap Inc. and TikTok both quietly settled in connection with the Los Angeles case just before or as trial began, for undisclosed sums.
  • Quotes Osterman, a member of Parents for Safe Online Spaces (ParentsSOS), explicitly tying the verdicts to platform design choices and calling for strong age‑verification and broader guardrails, and for passage of the Kids Online Safety Act.
  • Adds that Snapchat says it uses technology to proactively find and shut down drug dealers’ accounts and blocks drug‑related search terms, though it declined comment on this specific case.
March 26, 2026
7:44 PM
Why this week's social media verdicts could hold tech giants to account
https://www.facebook.com/CBSMoneyWatch/
New information:
  • Reports that a New Mexico jury on Tuesday ordered Meta to pay $375 million in civil penalties for failing to protect young users from predators and misleading them about the safety of its apps.
  • Clarifies that the Los Angeles case is the first time plaintiffs have won a judgment against social‑media platforms based on product‑design features (algorithms, infinite scroll, etc.) rather than third‑party content, with experts explicitly describing it as a product‑liability theory.
  • Quotes Public Citizen’s J.B. Branch calling this a “watershed moment” and “the crack that could potentially open the floodgates” for accountability Americans have been seeking.
  • Quotes plaintiff lawyer Matthew Bergman saying product‑liability is “the path forward” and that his firm has filed about 1,500 similar cases on behalf of families alleging social‑media harms.
  • Quotes law professor Jess Miers saying we should expect most future cases against online services and generative‑AI firms to be framed as product‑liability suits.
  • Notes that multiple families have already filed lawsuits alleging AI chatbots contributed to users’ suicides, and that the verdicts may increase scrutiny of AI tools from companies such as OpenAI and Anthropic.
12:12 PM
Jury finds Meta and YouTube liable in landmark social media addiction trial
https://www.facebook.com/CBSMornings/
New information:
  • CBS explicitly describes the platforms as being found liable for creating services designed to be addictive for kids and for failing to warn them.
  • The segment reiterates that Meta and Google intend to appeal the verdict, framing it as a planned legal step rather than a possibility.
  • Confirms the $6 million total damages figure in the Los Angeles case in the context of a 'landmark' social‑media addiction trial.
11:16 AM
Iran rejects U.S. peace plan. And, jury finds Meta, Google to blame in addiction trial
NPR by Brittney Melton
New information:
  • NPR reports that a California jury found Meta and Google negligent in a case brought by a young woman, identified as Kaley, who alleged their platforms contributed to her depression and anxiety via compulsive use starting in adolescence.
  • The verdict awarded Kaley $6 million total—$3 million in compensatory damages and $3 million in punitive damages—with Meta held responsible for 70% of the total.
  • The article emphasizes that plaintiffs’ lawyers framed the case around specific product‑design features—algorithms, infinite scroll, and beauty filters—arguing that these made the apps "defective products" that kept her "glued" to her phone.
  • NPR underlines that this verdict is seen as a rare instance of a jury holding Silicon Valley companies legally accountable for their role in a broader youth mental‑health crisis, potentially signaling exposure to similar suits.
  • The piece reiterates that this approach sought to sidestep Section 230 by focusing on design and product‑defect theories rather than third‑party content, a legal strategy being closely watched in other jurisdictions.
March 25, 2026
9:33 PM
Juries find social media platforms are harming teens’ health
The Christian Science Monitor by Stephen Humphries
New information:
  • Within 24 hours of the New Mexico verdict, a separate Los Angeles jury found Meta and YouTube negligent in the design of their platforms and concluded that negligence was a substantial factor in harming the mental health of a 20‑year‑old plaintiff identified as Kaley.
  • The LA jury split 10–2 on questions of Meta’s negligence, causation, and knowledge of Instagram’s dangers to children, and awarded $3 million in compensatory damages, assigning 70% of liability to Meta and 30% to YouTube.
  • The article provides additional detail on the New Mexico case framing it as the first time a state has prevailed against a tech company for harming minors, with New Mexico’s AG calling it a 'historic victory,' and includes Meta’s on‑the‑record commitment to appeal both verdicts.
2:52 PM
What's next in social media legal battles after a New Mexico jury finds Meta platforms harm children
PBS News by Kaitlyn Huamani, Associated Press
New information:
  • The PBS/AP piece explicitly frames the New Mexico verdict as the first jury result in a series of social‑media child‑safety trials scheduled this year across state and federal courts.
  • It details that lawsuits have been filed not just by states but also by school districts, local governments, the federal government and thousands of families, all seeking to hold companies liable for alleged harms to children.
  • The article emphasizes that outcomes in this and related cases could challenge platforms’ reliance on the First Amendment and Section 230 of the Communications Decency Act, potentially forcing design changes that could cost users and ad revenue.
  • New Mexico Attorney General Raúl Torrez is quoted as seeking stronger age‑verification and more aggressive removal of bad actors from Meta’s platforms, clarifying the types of remedies states are likely to push for beyond monetary penalties.
  • Meta’s reaction is updated: the company issues a statement saying it will appeal, arguing it works hard to keep people safe and is transparent about the difficulty of identifying and removing bad actors and harmful content.
1:09 PM
New Mexico jury finds Meta violated consumer protection law in landmark trial
https://www.facebook.com/CBSMornings/
New information:
  • CBS piece reiterates that Meta has formally said it will appeal the verdict, emphasizing this as the company’s immediate response.
3:27 AM
New Mexico jury says Meta harms children's mental health and safety, violating state law
ABC News
New information:
  • Article emphasizes that jurors explicitly found Meta engaged in 'unconscionable' trade practices that took advantage of children’s vulnerabilities and inexperience.
  • Juror Linda Payton, 38, is quoted saying the panel compromised on the estimated number of affected teens but chose the maximum $5,000 penalty per violation because she believed each child was worth the maximum.
  • Story underlines that the second phase of the case in May will ask the judge to decide whether Meta created a public nuisance and should fund public programs, not just pay penalties.
  • The article notes Meta's stock was up about 5% in early after-hours trading after the verdict, suggesting investors are largely shrugging off the $375 million award.
  • The piece further details New Mexico’s undercover investigation in which agents posed as children, documenting sexual solicitations and Meta’s response, as core evidence in the case.
March 24, 2026
11:34 PM
New Mexico jury finds Meta violated consumer protection law in child safety trial
https://www.facebook.com/CBSNews/
New information:
  • CBS segment format emphasizes that this is viewed as a 'landmark' child-safety trial victory against Meta by a state Department of Justice.
  • It reinforces that the New Mexico Department of Justice (Attorney General’s office) is framing this as a test case for using state consumer-protection laws against social media platforms over harms to children.
  • The piece highlights that Meta disputes the ruling and plans to appeal, signaling that this case will likely move into a prolonged appellate fight.
11:11 PM
New Mexico jury finds Meta committed thousands of violations that put children at risk
MS NOW by Ebony Davis
New information:
  • Specifies Meta must pay $375 million in civil penalties tied to thousands of violations.
  • Clarifies that the jury found Meta liable on all counts and that the trial lasted more than six weeks.
  • Details that the case stemmed in part from a 2023 New Mexico undercover investigation creating minor accounts that were quickly exposed to explicit content and predatory behavior.
  • Includes fresh Meta response from spokesperson Andy Stone stating the company will appeal and claiming confidence in its record of protecting teens.
  • Notes a separate California jury in a similar case is currently deadlocked over allegations that Meta’s platforms are addictive and harmful to children’s mental health.
10:46 PM
New Mexico jury finds Meta violated protection law over exploitation claims
https://www.facebook.com/CBSNews/
New information:
  • This CBS piece emphasizes that jurors explicitly found Meta to be 'harmful to children's mental health' and that the company prioritized profits over safety.
  • It clarifies that the jury agreed Meta made false or misleading statements and engaged in 'unconscionable' trade practices by exploiting children’s vulnerabilities and inexperience.
  • The article notes that New Mexico’s undercover investigation involved agents posing as children to document sexual solicitations and Meta’s responses.
  • It highlights that Meta executives at trial refused to concede 'social media addiction' exists but acknowledged 'problematic use.'
  • It situates the New Mexico verdict alongside an ongoing, sequestered federal jury deliberation in Southern California over Meta and YouTube and references more than 40 state AG suits accusing Meta of contributing to a youth mental-health crisis.
10:12 PM
New Mexico jury says Meta harms children's mental health and safety, violating state law
NPR by The Associated Press
New information:
  • The NPR/AP piece explicitly states jurors found that Meta 'knowingly harmed children's mental health' and concealed what it knew about child sexual exploitation on its platforms.
  • It clarifies that the jury agreed Meta’s conduct was 'unconscionable' because it unfairly took advantage of children’s vulnerabilities and inexperience, and that Meta made false or misleading statements about both mental‑health impacts and sexual‑exploitation dangers.
  • The article explains that New Mexico’s case relied heavily on a state undercover investigation in which agents posed as children on Meta platforms to document sexual solicitations and the company’s response.
  • It notes that Meta executives at trial rejected the term 'social media addiction' but acknowledged 'problematic use,' while prosecutors argued Meta prioritized profits over safety.
  • The story situates the verdict alongside more than 40 other state attorneys general lawsuits against Meta and a parallel federal trial in California where a jury is already deliberating Meta and YouTube’s liability.
10:12 PM
Meta Endangered Children, Jury Finds in Landmark Verdict
The Wall Street Journal by Zlati Meyer
New information:
  • WSJ explicitly frames the case as 'among the first' to test whether social-media companies can be held responsible under state consumer-protection laws for harms from content on their platforms.
  • WSJ emphasizes that the jury found Meta liable for 'misleading consumers about the safety of its platforms and endangering children,' highlighting deceptive safety messaging as central to the verdict.
  • WSJ notes Meta has publicly stated it disagrees with the verdict and plans to appeal, via a company spokesperson.