After the $375M Verdict: How Social Media Addiction Lawsuits Are Changing the Legal Landscape
Need a Personal Injury Attorney?
Get matched with pre-screened attorneys in your area. Free consultation, no obligation.
Get Matched Free
Something shifted in late March 2026. A jury in Santa Fe ordered Meta to pay $375 million. A day later, a jury in Los Angeles awarded $6 million to a young woman who said Instagram had stolen her adolescence. Two verdicts, two days, two states. Suddenly, a legal theory that courts had long struggled to sustain had prevailed at trial, with consequences for millions of families. The social media addiction lawsuits are no longer a legal long shot. They are a movement, and they are reshaping how the law treats platform design.
Two Trials, Two Verdicts, One Message
The first trial took place in New Mexico's First Judicial District Court in Santa Fe. New Mexico Attorney General Raúl Torrez sued Meta in 2023, arguing that Facebook and Instagram had violated the state's Unfair Practices Act by misleading the public about the safety risks of its platforms and enabling sexual predators to target children. On March 24, 2026, the jury agreed — finding Meta willfully violated state consumer protection law and ordering the company to pay the maximum penalty of $5,000 per violation, totaling $375 million in civil penalties. New Mexico became the first state in the nation to win at trial against a major tech company over child safety.
The following day, a California jury reached a verdict in a personal injury case brought by a young woman identified as K.G.M. She argued that Meta and Google had deliberately designed their platforms — through features like infinite scrolling, face-altering filters, and algorithmically amplified content — to be addictive, and that this addiction had caused her severe anxiety, depression, and thoughts of self-harm beginning in her teenage years. The jury awarded $6 million in total damages, with Meta ordered to pay $4.2 million and Google ordered to pay $1.8 million.
That California case was a "bellwether" trial — a carefully selected test case run ahead of a large volume of similar claims to see how juries respond to the theory of liability. A strong plaintiff verdict in a bellwether case is a signal to the legal system: this argument works. It is now on the record that at least one jury, presented with the full set of facts, found social media companies liable for the addictive design of their own products.
Why These Cases Are Different From What Came Before
Social media companies had long believed they were protected from personal injury claims under Section 230 of the Communications Decency Act (47 U.S.C. § 230), a federal law that generally immunizes online platforms from liability for content created by third parties. For years, lawsuits targeting social media platforms over user-generated content were dismissed under this doctrine.
But the addiction lawsuits are built on a different legal theory. They do not allege that third-party users posted something harmful. They allege that the companies themselves — through deliberate, proprietary design choices — created and deployed features engineered to keep users, including children and teenagers, compulsively engaged. Infinite scrolling that never reaches a natural stopping point. Notification systems calibrated to trigger anxiety and anticipation. Algorithms that amplify emotionally destabilizing content because it generates more engagement. These are not user-generated harms — they are product design decisions.
That distinction matters enormously under Section 230. If a harm flows from the platform's own design rather than a third party's content, the immunity defense weakens significantly. The California verdict reinforced what plaintiff attorneys had long argued: juries understand this distinction, and they are willing to hold companies accountable for it.
Who Is Filing These Lawsuits?
The litigation is happening at multiple levels simultaneously. Individual personal injury suits — like the K.G.M. case in California — are being filed by parents and young adults who allege that platform addiction caused diagnosable mental health conditions including anxiety disorders, depression, eating disorders, and self-harm. These cases are brought in both state and federal court.
State attorneys general are also active. In addition to New Mexico, attorneys general from dozens of states have filed suit against Meta and other social media companies alleging violations of consumer protection laws, children's privacy laws, and public nuisance doctrines. These government-led cases target the platforms' practices as a matter of public policy, not just individual harm.
Hundreds of school districts across the country have filed separate actions, arguing that social media addiction has created a public health crisis in their communities — driving up demand for mental health services, counseling, and administrative intervention at a cost that districts argue the platforms should bear.
Many of the federal individual cases have been consolidated in the U.S. District Court for the Northern District of California under a Multi-District Litigation (MDL) proceeding. State court cases are being coordinated separately in several jurisdictions.
What Families Should Know About Their Legal Rights
If your child or a family member has suffered a diagnosed mental health condition — including anxiety disorder, depression, an eating disorder, or self-harm behaviors — that you believe was caused or significantly worsened by social media use, you may have a viable legal claim. The strength of any individual case depends on a range of factors, including the age at which use began, the platforms involved, the nature and severity of the harm, documented medical treatment, and how the timing of the harm relates to applicable statute of limitations deadlines.
Several factors tend to strengthen individual claims in this type of litigation. Evidence of prolonged, compulsive platform use beginning in early adolescence is relevant. Medical documentation — including therapy records, psychiatric evaluations, or hospitalization — helps establish the severity of harm. Any evidence connecting the onset or worsening of symptoms to specific platform behaviors, such as increased Instagram use following the introduction of a new feed format, can be significant.
Statute of limitations deadlines vary by state and by the nature of the claim. In many states, personal injury claims must be filed within two to three years of discovery of the harm. For minors, the clock may not start running until they reach the age of majority — but this varies by jurisdiction. Speaking with an attorney sooner rather than later is important.
The New Mexico Case: What Happens Next
The New Mexico verdict is not final. Meta has announced it will appeal, and the company released a statement saying it "respectfully disagrees with the verdict" and that it has worked hard to keep users safe on its platforms. A second phase of the New Mexico case is scheduled for May 2026, in which a judge will determine what structural changes Meta must make to its operations — potentially including more robust age verification systems, enhanced mechanisms for removing predators, and limits on encrypted communications that can shield harmful activity from law enforcement.
During the trial, the state presented internal Meta messages showing that a 2019 decision by CEO Mark Zuckerberg to make Facebook Messenger end-to-end encrypted by default would affect the company's ability to share approximately 7.5 million child sexual abuse material reports with law enforcement. That disclosure has heightened pressure on Meta not just from the courts, but from child safety advocates and legislators watching the case.
A Turning Point for Platform Accountability
The spring of 2026 represents something genuinely new in the relationship between social media platforms and the law. For most of the internet's history, platforms operated with a kind of legal exceptionalism — the rules that applied to product manufacturers, publishers, and service providers did not apply to them in the same way. Section 230 was their shield, and it largely held.
What the New Mexico and California juries did was refuse to accept that exceptionalism as absolute. They looked at the specific design choices these companies made — choices that internal research, as presented in multiple trials, suggested were known to cause harm — and they decided those choices had consequences. That is how product liability law has always worked for physical goods. The question now is whether courts and legislatures will allow it to work the same way for digital products.
For the millions of families who have watched children struggle with anxiety, depression, or disordered eating tied to social media use, the verdicts are not just legal news. They are validation. And for many, they are the beginning of a legal process that they did not know they had the right to pursue.
Frequently Asked Questions
What are social media addiction lawsuits?
These are civil lawsuits alleging that social media companies like Meta and Google deliberately designed their platforms with features — such as infinite scrolling, push notifications, and engagement-maximizing algorithms — that are psychologically addictive, particularly for children and teenagers, causing diagnosable mental health harm including anxiety, depression, eating disorders, and self-harm.
What was the $375 million New Mexico verdict about?
On March 24, 2026, a New Mexico jury found Meta liable under the state's Unfair Practices Act for endangering children on Facebook and Instagram and misleading consumers about the safety of its platforms. The jury ordered the maximum penalty of $5,000 per violation, totaling $375 million in civil penalties. It was the first time a state had prevailed at trial against a major tech company over child safety.
What was the California bellwether case?
A California jury awarded $6 million in damages to a young woman who argued that Meta and Google's addictive design features had caused her to develop severe anxiety, depression, and thoughts of self-harm as a teenager. Meta was ordered to pay $4.2 million, and Google $1.8 million. As a bellwether case, the verdict signals how future similar lawsuits may be decided.
Does Section 230 protect social media companies from these lawsuits?
Section 230 of the Communications Decency Act (47 U.S.C. § 230) provides broad immunity for third-party content on platforms but does not necessarily protect companies from liability for their own design decisions. The California verdict found liability based on Meta's proprietary design choices, not user-generated content, which limits the applicability of Section 230 as a defense.
Who can file a social media addiction lawsuit?
Individuals — including adults filing on their own behalf or parents filing on behalf of their minor children — who have suffered a diagnosed mental health condition they believe was caused or significantly worsened by social media addiction may have a potential claim. Eligibility depends on individual circumstances, the platforms involved, the nature of the harm, and applicable statutes of limitations.
What mental health conditions are at the center of these lawsuits?
Claims have primarily centered on anxiety disorders, clinical depression, eating disorders (including anorexia and bulimia), body dysmorphia, and self-harm behaviors, with many cases focusing on harm experienced during adolescence when users were most developmentally vulnerable to addictive design features.
What design features are being targeted in the lawsuits?
Plaintiff attorneys have focused on specific features including infinite scrolling (which removes natural stopping cues), algorithmically amplified emotionally destabilizing content, face-altering filters linked to body image harm, notification systems calibrated to create anxiety and compulsive checking, and engagement-driven recommendation systems that prioritize time-on-platform over user wellbeing.
Are there lawsuits beyond individual cases?
Yes. Dozens of state attorneys general have filed suits against Meta and other platforms alleging violations of consumer protection, children's privacy, and public nuisance laws. Hundreds of school districts have also filed claims arguing that social media addiction has created a public health burden in their communities that the platforms should be required to address financially.
What happens next in the New Mexico case?
A second phase of the New Mexico trial is scheduled for May 2026. A judge will determine what operational and structural changes Meta must implement, which could include stronger age verification, improved predator removal systems, and restrictions on encrypted communications. Meta has said it will appeal the verdict.
How do I know if I or my child has a viable claim?
The strongest indicators include: social media use beginning in early adolescence; prolonged, compulsive use of platforms like Instagram or Facebook; a diagnosed mental health condition with documented medical treatment; a timeline connecting the onset or worsening of symptoms to platform use; and a claim filed within the applicable statute of limitations. Consulting an attorney is the best way to evaluate your specific situation.
Disclaimer
This content is for general informational purposes only, is not legal advice, and does not create an attorney-client relationship. Readers should consult a qualified attorney licensed in their jurisdiction.
If you believe you or your child may have a claim related to social media addiction, speaking with an attorney is the first step. Search for a personal injury attorney on AttorneyReview.com, or use our Get Matched feature to connect with a qualified lawyer who can evaluate your case.