In 2017, Matthew Herrick sued the dating app Grindr after his ex-boyfriend used fake profiles to harass him and send hundreds of strangers to his home. Herrick’s lawyer, Carrie Goldberg, argued that Grindr was a defective product: the company itself had said it could not stop the harassment. The case was dismissed under Section 230 of the Communications Decency Act, a decades-old federal law that has shielded online platforms from liability for users’ posts.
Goldberg and other lawyers appealed repeatedly and lost. But in the nine years since Herrick’s suit, courts have begun to entertain a different argument: that tech companies can be held responsible for how they design and monetize their products. Those design choices, advocates say, can cause harm and should be subject to product-liability-style claims — an approach inspired in part by the legal campaign against Big Tobacco.
There have been early signs of change. In 2021, Goldberg sued Omegle over alleged child sexual exploitation facilitated by the site; the company later shut down after a settlement. An appeals court that year also allowed a suit against Snapchat to proceed, rejecting the company’s Section 230 defense in a case tied to deadly car crashes involving a speed filter. Snapchat settled the case in 2023.
Last week brought the highest-profile victories yet for this legal strategy, in two separate jury trials. In Los Angeles, a jury found that Instagram owner Meta and Google’s YouTube deliberately designed their apps to be addictive, and that this contributed to the mental-health struggles of a young woman who began using the platforms as a child. The jury awarded her $6 million. In New Mexico, a jury ordered Meta to pay the state $375 million for failing to protect young users from child predators; a second phase of that trial will consider whether Meta created a public nuisance, and the attorney general has said he will seek court-ordered changes to Meta’s apps.
Advocates and plaintiffs’ lawyers say the verdicts mark a turning point. “This is the dawn of a new era, with people finally getting to hold tech platforms responsible for the harms they cause,” Goldberg said. Sarah Gardner of the Heat Initiative, which focuses on online child safety, said the verdicts change the playing field and could create momentum both in courts and in policymaking.
Meta and Google plan to appeal. Meta argues that teen mental-health problems can’t be tied to a single app; Google contends that YouTube is a video service, not social media. Many observers expect the legal theory to end up before the Supreme Court.
Meanwhile, thousands of related cases are winding through state and federal courts. Moody’s has identified more than 4,000 pending suits targeting 166 companies that allege addictive software design. Plaintiffs have expanded the approach beyond social media: suits are being filed against video game makers, online gambling apps, and makers of AI chatbots.
One case filed in Massachusetts accuses the sports-betting apps DraftKings and FanDuel of designing products that encourage compulsive betting, using personalized bonuses and nudges to lure users back. “It’s personalized itself to you,” said Jennifer Hoekstra, who is representing the plaintiff. DraftKings said it will vigorously defend against the lawsuits; FanDuel did not respond to requests for comment.
Lawyers representing victims and public-interest advocates say these legal victories, even with modest financial awards relative to the tech giants’ valuations, send an important signal. “If you grab them by the pocketbook, their hearts and minds will follow,” said Matthew Bergman of the Social Media Victims Law Center, which represented the plaintiff in the Los Angeles trial. His firm has also sued OpenAI and other AI chatbot makers, alleging the products have contributed to mental-health crises and suicides. OpenAI has said it is working with mental-health experts to improve chatbot responses to signs of distress.
Advocates also hope the courtroom wins will spur legislative action on long-stalled tech regulation. Gardner noted that changing an industry often requires multiple pressures at once: litigation, regulation, public pressure and shifts in market incentives. The central goal, many plaintiffs say, is to “internalize the cost of safety” so companies change product design to reduce harm.
The lawsuits could reshape legal doctrines that have long limited platform liability, forcing courts to grapple with whether and how design choices — algorithms, engagement features, personalized prompts and monetization strategies — translate into legal responsibility when they cause foreseeable harm to users, especially children.
Meta and Google maintain their defenses and will pursue appeals. But the cascade of new cases, together with the recent jury findings, has already altered expectations for how courts, regulators and companies may approach accountability for digital harms. For plaintiffs, advocates and lawmakers, the question now is whether these early verdicts will lead to broader changes in corporate behavior, industry practices and public policy.
Google is a financial supporter of NPR.