The Daily

Social Media on Trial

By The New York Times

Summary

Social media giants are facing an existential threat from a novel cluster of lawsuits that shift the legal focus from content moderation (which is protected by the First Amendment and Section 230) to product design and addiction. Unlike previous attempts to regulate these companies based on false content or competition, these new cases are framed as personal injury claims, arguing that the platforms' engineered features (like infinite scrolling, autoplay, and Snapstreaks) are intentionally addictive and directly cause mental health harms, including anxiety, depression, eating disorders, and suicidal ideation.

This litigation is being likened to the 'Big Tobacco moment' for social media. The plaintiffs, who include thousands of individuals, school districts, and state attorneys general, must prove a direct causal link between the platforms' design features and the alleged injuries—a difficult task given the multifactorial nature of mental health. A key piece of evidence for the plaintiffs comes from internal company documents, which allegedly show that companies like Meta knew about the toxic effects of features (such as beauty filters on Instagram) on young users, yet prioritized engagement and revenue over safety.

While some companies, like Snap and TikTok, have settled early bellwether cases (like the KGM case), others, notably Meta and YouTube, are determined to fight the claims in court. The defendants plan to rely heavily on existing legal shields, particularly Section 230 of the Communications Decency Act, and argue that mental health issues are caused by a variety of factors, not just social media. If the plaintiffs prevail, the outcome could force fundamental changes to the platforms' business models, demanding stronger age verification, parental controls, and the removal of lucrative addictive features.

Key Takeaways

1. The 'Big Tobacco' Moment (social media companies are facing personal injury lawsuits much as tobacco companies once did)
2. The Legal Shield (new legal theories aim to bypass or crack the Section 230 protections that have long shielded the platforms)
3. The Addiction Trap (features like infinite scrolling are designed to keep users caught in an endless loop)
4. Internal Document Leak (documents produced in discovery allegedly show the companies knew their features were toxic)
5. The Engagement Machine (addictive features drive engagement, which in turn drives advertising revenue)

Full Transcript

Speakers: Host, Guest
Host0:00

We all have moments when we could have done better. Like cutting your own hair. Yikes. Or forgetting sunscreen so now you look like a tomato. Ouch. Could have done better. Same goes for where you invest. Level up and invest smarter with Schwab. Get market insights, education, and human help when you need it.

Guest0:22

Learn more at Schwab.com. From The New York Times, I'm Rachel Abrams, and this is The Daily. For years, social media companies have relied on an impenetrable First Amendment protection to shield them from legal claims that their products are dangerous to children. But now a new cluster of plaintiffs is trying a different tack. Today, my colleague Cecilia Kang explains why these lawsuits pose an existential threat to social media giants and how those companies are likely to defend themselves. It's Thursday, January 29th. Trouble for TikTok, as a group of attorneys general in several states look into whether the video-sharing platform TikTok is harmful for children. Internal research at Facebook found that its photo-sharing app Instagram can harm the

Host1:35

mental health of millions of young users.

Guest1:37

Research shows 95% of teens are on social media. More than a third say they're on constantly.

Host1:43

For young people, the TikTok platform is like digital nicotine. One chart showed 21% of girls in the US felt somewhat worse or much worse after using Instagram. Social media taught me things about myself that I didn't even know, like how

Guest1:59

I had an ugly nose, or how

Host2:01

my weight wasn't the proper weight.

Guest2:03

Social media said the solution to these

Host2:05

things wasn't self-acceptance.

Guest2:07

Social media said the solution to these

Host2:09

things was products, and sometimes even surgeries. Unregulated social media is a weapon of mass destruction that continues to jeopardize the safety, privacy, and well-being of all American youth. It's time to act. As a dad of three, I'm angered and horrified. As an Attorney General, I, along with my colleagues across the country, am taking action to do something about it.

Guest2:43

Cecilia, welcome to the Daily.

Host2:45

Thanks for having me.

Guest2:46

So, Cecilia, we've talked a lot on this show about the claims that social media is harmful for children, that it can lead to mental health disorders and social isolation. And there have been all sorts of attempts over the years to really curb the reach and influence of these social media platforms. Now we have this new crop of lawsuits, and I want to understand: how are these lawsuits any different from previous attempts that we've seen to regulate or rein in these companies?

Host3:11

So these social media companies have for years faced really tough scrutiny and criticism for being too powerful and crushing competition, for hosting content that is false, for all kinds of harms related to the content hosted on these platforms. But the cases that are about to go to trial this week are really different, in that thousands of individuals, school districts, and state attorneys general have come together in a series of lawsuits that all argue the same thing: that social media is addictive, and that the addictive nature of these platforms has led to a bevy of personal injuries, including anxiety, depression, suicidal thoughts, and eating disorders. So what's really different is that this is less about the content they host and more about the nature of the technologies themselves. And this is a really novel legal theory. It's essentially social media's Big Tobacco moment; the tobacco litigation, as you know, went on for many years and ultimately led to the decline of smoking. And so many in social media see this as a really existential moment.

Guest4:34

So basically, the crux of this is that these are personal injury claims, right? And that effectively allows the plaintiffs to sidestep what has traditionally shielded these companies from liability, which is their free speech defense.

Host4:47

That's exactly right, Rachel. What the lawyers in these cases and the plaintiffs are trying to do is to get around that legal shield that the social media companies have been able to use to protect themselves in court. And they're saying, no, this is actually not about speech at all. This is about you companies creating and engineering technologies to be harmful and that those are violations of state and federal consumer laws.

Guest5:15

So let's walk through these cases. How are they making that claim specifically?

Host5:20

So this year we will see two big batches of trials begin in all of these cases that have been filed. The first batch, which takes place in Los Angeles, includes nine plaintiffs and nine separate trials. They're all individuals, all claiming that when they were young, when they were minors, they became addicted to social media and suffered these harms. These nine cases are known as bellwethers because they've been picked out of thousands of lawsuits filed by individuals against the social media companies, and they're seen as very representative of the many different claims and experiences that individuals say they have suffered by becoming addicted to these platforms. So the first case and trial that begins is that of an individual who goes by the initials KGM. She is now a 20-year-old from Chico, California, and she has said that she created her first social media account, on YouTube, at the age of eight. She then joined Instagram at the age of nine, Musical.ly, which is now known as TikTok, at the age of 10, and Snapchat at 11. So she's been using all the social media platforms for a long time. Her mom said that she had no idea that these platforms could be dangerous and could become so addictive to her child, and that she only figured that out after watching a news program where she learned about the potential harms of social media. Her mom said that if she had known how potentially harmful these sites were, she would have prevented her daughter from perhaps even having a phone and using the apps. And what KGM, the plaintiff, is arguing is that the social media platforms were incredibly alluring to her and that she got hooked. These very addictive products use features like infinite scrolling, meaning it's just so easy to keep scrolling and scrolling; autoplay videos, where right after you finish a video the next one's queued up before you even think about it; and algorithms that direct you toward and recommend particular content that she has found to be very toxic. She argues that all these features led her to overuse social media and become addicted, and that, in turn, led to lots of mental health problems, including anxiety, depression, suicidal thoughts, and body image issues for her.

Guest8:10

Mm. So these are the kinds of claims that I think a lot of people have become familiar with by now. The idea that young people can develop any number of mental and emotional conditions from repeated exposure to social media platforms. What is some of the other litigation that you're watching?

Host8:30

So the next big wave begins around June in federal court. Those cases are all bundled together, and they're brought by attorneys general in dozens of states as well as school districts. And those are really interesting, Rachel, in that they're charging the companies with being public nuisances: they argue that, as school districts and states, they have had to shoulder the costs of mental health services, phone programs within schools, and all kinds of other programs to deal with a youth mental health crisis. And so they are suing the companies for monetary damages. And they're also saying that they would like to see big changes within the companies, that the platforms have to give up some of these addictive technology features.

Guest9:19

Given that these are all personal injury claims, what do the plaintiffs actually need to prove in order to prevail in court?

Host9:26

What these plaintiffs have to prove is that social media is linked to addiction. And that's going to be hard. It's going to be a new sort of argument that hasn't been tested before. And so they're going to have to show, with expert evidence, that tools like infinite scrolling on TikTok and on Instagram and autoplay of videos are features that have led to compulsive use, and that there is a direct link between the technology and behavior. And they'll also have to show that these companies knew all along that their products were harmful and that they withheld what they knew from the public.

Guest10:20

So what's the best evidence that the plaintiffs have to show what you're describing as a causal link between the technology and the harm?

Host10:30

So there have been numerous studies done on the mental health effects of social media. But what the plaintiffs are really going to rely on is hundreds of thousands of documents that they've collected in discovery ahead of these trials, which the plaintiffs' lawyers say show that the companies knew there was a problem and found internally a lot of troubling evidence about their products and how they affected young people. For example, in 2018, Meta began studying how beauty filters on Instagram...

Guest11:04

Beauty filters, just to be clear, those are the filters you can put on your face or somebody else's face to make them more beautiful, to just alter the image, right?

Host11:13

Yes. And they began studying that in 2018 and decided in 2019, after a lot of public backlash, that they would ban the filters. But that same year, in 2019, Mark Zuckerberg, the CEO, considered bringing the filters back to Instagram. These were big drivers of engagement, and young people liked to use them. Employees within the company, including one executive, implored him not to, because, she said, they're really just so toxic, particularly for young girls. She said that her own daughter suffered from body dysmorphia, and she sent an email directly to Zuckerberg asking him to reconsider. He ignored the email and decided in 2020 to reinstate the beauty filters. And so lawyers for KGM are going to point to these internal documents and say that this is really the proof that the company not only studied the problem and recognized there was a problem, but did not tell the public about it and allowed the tools to continue operating.

Guest12:21

And what are the plaintiffs asking for specifically? Obviously, money. But can you just give us a little bit more specifics on their demands?

Host12:28

The plaintiffs are asking, as you said, for monetary damages, and they are also asking for changes to the designs of these platforms. So they're going to ask for stronger age verification and tools to make sure that underage users are no longer able to evade the terms of service and use the platforms. They'll probably also ask for more parental controls and that the companies remove addictive features like infinite scroll, autoplay of videos, and Snapstreaks.

Guest13:01

I'm really going to show my age here, Cecilia. But what is a snapstreak?

Host13:06

So a Snapstreak is kind of a game, and this is why it's been accused of being addictive. It's messaging between two people, and the idea is to create a streak of messages between them. You maintain a streak by communicating every day and sending snaps, which are usually visuals, like a photo, a video, or some sort of message. You keep your streak going if you communicate every day, and you lose your streak if you stop even for one day.

Guest13:38

I see. And that does seem very clearly like an example of a tool that is designed to keep you on the platform as much as possible, which is part of the business model. Right. That's what these companies are trying to do with their users. So it makes sense that if you take those features away, that could pose, as you said, kind of an existential threat to the entire business model.

Host13:56

That's right. And it's important to keep in mind that the business model is advertising. And what really fuels advertising revenue is engagement.

Guest14:05

Right.

Host14:06

Engagement is at the heart of this, and these tools are meant to keep people more engaged. So you can see why these trials are potentially so damaging for these companies. And that's why we've seen two companies, Snap and TikTok, settle the very first case with KGM. We don't know the terms of those settlements, but Meta and YouTube are still scheduled to go to trial as defendants in KGM's lawsuit and appear very determined to take this to trial.

Guest14:56

We'll be right back.

Host15:00

We all have moments when we could have done better. Like cutting your own hair. Yikes. Or forgetting sunscreen. So now you look like a tomato. Ouch. Coulda done better. Same goes for where you invest. Level up and invest smarter with Schwab. Get market insights, education, and human help when you need it. Learn more at Schwab.com. Cecilia, if these lawsuits are

Guest15:27

so existential, potentially, for some of these social media companies, why would some of them not settle the way that TikTok and Snap did with that first case? Presumably, the money that they would have to pay to settle is nothing compared with having to alter an entire business model. Right. So why even take the risk and go to trial?

Host15:45

Well, there are many trials that are scheduled, first of all. So even though two companies were able to settle with KGM in this first case, there are numerous more in state court as well as in federal court going forward. The other thing to keep in mind is that the companies, especially Meta and YouTube, really feel strongly that they have a good case on their side, and they will bring up speech protections, like you mentioned, Rachel. They're going to say that there is a law known as Section 230 of the Communications Decency Act that shields Internet companies from liability for the content they host. Because Section 230 has been interpreted so broadly and used so strongly in their favor in so many different instances, they're feeling pretty confident that they can rely on that legal shield once again. In addition, they reject the idea that social media can be linked to personal injury. The companies' lawyers are expected to argue that there are many factors that go into mental health issues. They're going to say that it's multifactorial: it could be school problems, stress with friends. There could be all kinds of factors that lead to anxiety, depression, and other mental health disorders, and not social media alone.

Guest17:03

Right. And the causal link does, in fairness, feel like something worth grappling with. Right. Because how do you distinguish the impact, for example, of social media from a culture that promotes certain beauty standards and certain body types? Right. Like, is it actually possible to isolate and prove causation back to a specific social media platform?

Host17:23

What the plaintiffs' lawyers are going to try to do is, again, draw from all the internal documents they've collected, and they will try to show how the companies pushed to increase engagement and to make their products sticky and even addictive. But ultimately it comes down to a jury in these California cases, and juries will decide the subsequent cases as well. That might be favorable for the plaintiffs, because everyone has a story about social media. We know, for example, that the majority of American parents see social media as a problem, and yet the companies have so far escaped scrutiny.

Guest18:04

Cecilia, if this does end up being social media's Big Tobacco moment, and they lose these cases in court, and a jury decides that this is in fact an addictive product, that means we have an entire generation of kids who are now addicted. And so I wonder, we've been talking this whole conversation a lot about what happens to the social media companies, but what happens to these children who have essentially been the guinea pigs for this massive social experiment?

Host18:33

Remember, decades ago, when the trials began against Big Tobacco, it seemed crazy and really far-fetched to accuse the companies of creating an addictive and harmful product. But they did. And with social media, with all of these young people who have been blamed for years for being unable to regulate their use of these apps, the conversation might change. The blame could now lie in a different place: with the social media companies. That won't take back the experiences of so many young people who say they've been harmed by these platforms, but it could profoundly change the conversation in our society.

Guest19:20

Cecilia Kang, thank you so much for your time.

Host19:23

Thanks for having me, Rachel.

Guest19:33

We'll be right back.

Host19:40

We all have moments when we could have done better. Like cutting your own hair. Yikes. Or forgetting sunscreen, so now you look like a tomato. Ouch. Coulda done better. Same goes for where you invest. Level up and invest smarter with Schwab. Get market insights, education, and human help when you need it. Learn more at Schwab.com.

Guest20:07

Here's what else you need to know today. On Wednesday, the Federal Reserve voted to keep interest rates at their current levels, despite enormous pressure from President Trump to cut rates. Two Fed governors, both appointed by President Trump, cast dissenting votes. But Fed Chairman Jerome Powell continues to reject Trump's demands for a rate cut, even after the administration opened an unusual criminal investigation this month into Powell's conduct. And our founders debated extensively over which branch of government should have the power

Host20:41

to declare or initiate war. Virtually unanimously, they decided: what was entered into the Constitution was that the declaration or initiation of war would be the power of Congress.

Guest20:52

In a series of pointed exchanges on Wednesday, senators of both parties, including Republican Rand Paul of Kentucky, pressed Secretary of State Marco Rubio to explain why neither he nor President Trump consulted with Congress before sending US troops into Venezuela to arrest and remove the country's president.

Host21:11

So I would ask you, if a

Guest21:12

foreign country bombed our air defense missiles, captured and removed our president, and blockaded

Host21:18

our country, would that be considered an act of war? Would it be an act of war? We just don't believe that this operation comes anywhere close to the constitutional definition of... But would it be an act of war if someone did it to us?

Guest21:30

Of course it would be an act of war. During the hearing, Rubio refused to rule out future US military action in Venezuela, but said that President Trump has no desire to send American troops back to the country. Today's episode was produced by Rochelle Bonja and Shannon Lin. It was edited by Lexie Diao and Michael Benoist, contains music by Rowan Niemisto and Dan Powell, and was engineered by Chris Wood. That's it for The Daily. I'm Rachel Abrams. See you tomorrow.

Host22:24

We all have moments when we could have done better. Like cutting your own hair. Yikes. Or forgetting sunscreen. So now you look like a tomato. Ouch. Coulda done better. Same goes for where you invest. Level up and invest smarter with Schwab. Get market insights, education, and human help when you need it. Learn more at Schwab.com.

Generated by Podcast TLDR · January 30, 2026