Social Media’s Big Tobacco Moment: Public Citizen Works to Hold Big Tech Accountable

By Nishita Nekkanti, Congress Watch Advocacy Intern

On March 25, 2026, a California jury found Google and Meta civilly liable for causing the depression and anxiety Kaley G.M. experienced because of compulsive social media use as a small child. The companies were held accountable for their role in “fueling a youth mental health crisis.” 

Google and Meta were ordered to pay $3 million in compensatory damages and an additional $3 million in punitive damages for developing addictive algorithms and product features – like infinite scroll and autoplay – that intentionally hook users and fail to protect the companies' youngest consumers. Meta was responsible for 70% of that amount for Facebook and Instagram, while Google was responsible for the remaining 30% for YouTube. Both companies have said they disagree with the verdict and plan to appeal it to higher courts.

There are currently thousands of consolidated cases against large Silicon Valley social media companies, and the verdict in this case has the potential to influence the outcomes for countless plaintiffs seeking justice. Many have compared the volume of litigation to the movement against Big Tobacco that played out in the courts in the 1990s, which ultimately resulted in stricter regulations on advertising targeted at minors and mandatory warnings on all cigarette packaging.

In fact, a trend is already emerging of courts holding social media companies liable for product designs that contribute to the development of mental health conditions and enable illegal and predatory activities. This includes Meta being held liable in another case, from New Mexico, for “misleading consumers about the safety of its platforms.” Brought by New Mexico Attorney General Raúl Torrez in 2023, the case followed an undercover operation in which investigators created a fake social media profile of a 13-year-old girl to document the experience of young teenage users on the app. The profile received inappropriate images and targeted solicitations from known child abusers. The jury ultimately found Meta liable for $375 million in damages – $5,000 for each violation – after internal documents revealed that Meta employees and outside child safety experts had warned Meta executives of “the dangers present on Meta’s platforms.”

Both cases show how juries and judges are answering the question of whether social media feeds should be treated as protected publishing spaces or as products whose designs must be evaluated for consumer safety, with courts increasingly classifying platforms like Instagram and TikTok as products. That classification is vital to establishing industry standards that require safety regulations to be taken seriously. It matters because, right now, many of the engagement-driving features that harm young users – infinite scroll and autoplay among them – have never been subject to safety reviews, and the safety protections that did exist were stripped out through internal product development processes.

As lower courts begin to demand more from large media conglomerates, it will be up to the appellate courts to decide whether these verdicts survive legal scrutiny. While the court outcomes will not immediately change the social media landscape, they are a first step toward stricter safety standards that protect users, especially vulnerable youth, from being exploited and overexposed to unsafe conditions.

And while the court process crawls along, Public Citizen continues to fight Big Tech by speaking out publicly on the harmful impacts of social media addiction and by supporting legislation that imposes strict regulations on media companies to ensure child and user safety. Recently, for example, Public Citizen's J.B. Branch spoke at a press conference with Senator Hawley (R-Mo.), Senator Durbin (D-Ill.), and New Mexico Attorney General Torrez about the negative effects of social media use and how AI companies are now walking a similar path by rushing unsafe products onto the market.

Without intentional and effective regulation, young users are being hooked on platforms deliberately engineered to maximize engagement at a formative time in their lives, and many are now struggling with mental health and eating disorders, radicalization, and more as a result. To counter the emerging threat from AI, Public Citizen is pushing for strong protections like the chatbot legislation now being drafted and introduced across the states.

Now more than ever, it is important that the American people remain proactive in holding Big Tech companies accountable.