From the New York Times, I'm Rachel Abrams, and this is The Daily. For years, social media companies have relied on an impenetrable First Amendment protection to shield them from legal claims that their products are dangerous to children. But now, a new cluster of plaintiffs is trying a different tack. Today, my colleague, Cecilia Kang, explains why these lawsuits pose an existential threat to social media giants and how those companies are likely to defend themselves. It's Thursday, January 29.
Trouble for TikTok.
As a group of attorneys general in several US states look into whether the video sharing platform TikTok is harmful for children.
Internal research at Facebook found that its photo sharing app, Instagram, can harm the mental health of millions of young users. Research shows 95% of teens are on social media.
More than a third say they're on constantly. For young people, the TikTok platform is like digital nicotine. One chart showed 21% of girls in the US felt somewhat worse or much worse after using Instagram.
Social media taught me things about myself that I didn't even know, like how I had an ugly nose or how my weight wasn't the proper weight. Social media said the solution to these things wasn't self-acceptance. Social media said the solution to these things was products and sometimes even surgeries.
Unregulated social media is a weapon of mass destruction that continues to jeopardize the safety, privacy, and well-being of all American youth. It's time to act. As a dad of three, I'm angered and horrified. As an attorney general, I, along with my colleagues across the country, are taking action to do something about it.
Cecilia, welcome to The Daily.
Thanks for having me.
So, Cecilia, we've talked a lot on this show about the claims that social media is harmful for children, that it can lead to mental health disorders, social isolation. There have been all sorts of attempts over the years to really curb the reach and influence of these social media platforms. Now, we have this new crop of lawsuits, and I want to understand, how are these lawsuits any different from previous attempts that we've seen to regulate or to rein in these companies?
These social media companies have, for years, faced really tough scrutiny and criticism for being too powerful and crushing competition, for hosting content that is false, all kinds of harms related to the content that is hosted on these platforms. But the cases that are about to go to trial this week are really different in that thousands of individuals, school districts, and state attorneys general have come together in a series of lawsuits arguing the same thing, which is that social media is addictive, and that the addictive nature of these platforms has led to a bevy of personal injuries, including anxiety, depression, suicidal thoughts, and eating disorders. What's really different is this is less about the content they host, and more about the nature of the technologies. This is a really novel legal theory. It's essentially social media's big tobacco moment, which led, as you know, to many years of litigation against the tobacco companies and ultimately led to the decline of smoking. So many in social media see this as a really existential moment.
Basically, the crux of this is that these are personal injury claims, right? That effectively allows the plaintiffs to sidestep what has traditionally shielded these companies from liability, which is their free speech defense.
That's exactly right, Rachel. What the lawyers in these cases and the plaintiffs are trying to do is to get around that legal shield that the social media companies have been able to use to protect themselves in court. They're saying, No, this is actually not about speech at all. This is about you companies creating and engineering technologies to be harmful, and that those are violations of state and federal consumer laws.
Let's walk through these cases. How are they making that claim specifically?
This year, we will see two big batches of trials begin in all of these cases that have been filed. The first batch, which takes place in Los Angeles, includes nine plaintiffs and nine separate trials. They're all individuals claiming that when they were young, when they were minors, they became addicted to social media and they suffered these harms. These nine cases are known as bellwethers because they've been picked out of thousands of lawsuits filed by individuals against the social media companies. They're seen as very representative of the many different charges and experiences that individuals have had and suffered, as they claim, by becoming addicted to these social media platforms. So the first case to go to trial is that of an individual who goes by the initials KGM. She is now a 20-year-old from Chico, California, and she has said that she created her first social media account on YouTube at the age of eight. She then joined Instagram at the age of nine and Musical.ly, which is now known as TikTok, at the age of 10, and Snapchat at 11. She's been using all the social media platforms for a long time.
Her mom said that she had no idea that these platforms could be dangerous and could become so addictive to her child. She only figured that out after watching a news program where she learned about the potential harms of social media. Her mom said that if she had known how potentially harmful these sites were, she would have prevented her daughter from perhaps even having a phone and using the apps. What KGM, the plaintiff, is arguing is that the social media platforms were incredibly alluring to her and that she got hooked. And that these very addictive products use features like infinite scrolling, meaning it's just so easy to keep scrolling and scrolling, and things like autoplay videos, where right after you finish a video, the next one's cued up before you even think about it. And algorithms that direct you and recommend particular content that she has found to be very toxic. That all these features led her to overuse social media and become addicted. And that in turn led to lots of mental health problems, including anxiety, depression, suicidal thoughts, and body image issues for her.
These are the kinds of claims that I think a lot of people have become familiar with by now, the idea that young people can develop any number of mental and emotional conditions from repeated exposure to social media platforms. What is some of the other litigation that you're watching?
So the next big wave begins around June in federal court. They're all bundled together, and they're brought by attorneys general in dozens of states, as well as school districts. Those are really interesting, Rachel, in that they are charging the companies with being a public nuisance. They argue that, as school districts and states, they have had to shoulder the costs of mental health services, phone programs within schools, all kinds of programs to deal with a youth crisis. They are suing the companies for monetary damages. They're also saying that they would like to see big changes within the companies, that the platforms have to give up some of these addictive technology features.
Given that these are all personal injury claims, what do the plaintiffs actually need to prove in order to prevail in court?
What these plaintiffs have to prove is that social media is linked to addiction. That's going to be hard. It's going to be a new argument that hasn't been tested before. They're going to have to show that there is expert evidence that the use of tools like infinite scrolling on TikTok and on Instagram and autoplay video are features that have led to compulsive use, and that there is a direct link between the technology and behavior. They'll also have to show that these companies knew all along that their products were harmful and that they withheld what they knew from the public.
What's the best evidence that the plaintiffs have to show what you're describing as a causal link between the technology and the harm?
There have been numerous studies done on the mental health effects of social media. But what the plaintiffs are going to really rely on is hundreds of thousands of documents that they've collected and discovered ahead of these trials, which the plaintiffs' lawyers say show that the companies knew that there was a problem, and they found internally that there was a lot of troubling evidence about their products and how they affected young people. For example, in 2018, Meta began studying the effects of beauty filters on Instagram.
Beauty filters, just to be clear, those are the filters you can put on your face or somebody else's face to make them more beautiful, to just alter the image, right?
Yes. They began studying that in 2018 and decided in 2019, after a lot of public backlash, that they would ban the filters. But that same year, in 2019, Mark Zuckerberg, the CEO, considered bringing the filters back to Instagram. These were big drivers of engagement, and young people liked to use them. Employees within the company implored him not to, including an executive, because she said they were really just so toxic, particularly for young girls. She said that her own daughter suffered from body dysmorphia. She sent an email directly to Zuckerberg asking him to reconsider. He ignored the email and decided in 2020 to reinstate the beauty filters. And so lawyers for KGM are going to point to these internal documents and say that this is really the proof that the company not only studied the problem, they recognized there was a problem, and yet they did not tell the public about the problem. They allowed the tools to continue operating.
What are the plaintiffs asking for, specifically? Obviously, money, but can you just give us a little bit more specifics on their demands?
The plaintiffs are asking, as you said, for monetary damages, and they are also asking for changes to the designs of these platforms. They're going to ask for stronger age verification and tools to make sure that underage users are no longer able to escape the terms of service and use the platforms. They'll probably also ask for more parental controls, and that the companies remove addictive features like infinite scroll, autoplay of videos, and Snapstreaks.
I'm really going to show my age here, Cecilia, but what is a Snapstreak?
A Snapstreak is... It's a game, and this is why it's been accused of being addictive. It's messaging between two people. The idea is to create a streak of messages between two people. You maintain a streak by communicating every day and sending snaps, which are usually visuals, like a photo, a video, or a message. You keep your streak going if you communicate every day. You lose your streak if you stop even for one day.
I see.
That does seem very clearly like an example of a tool that is designed to keep you on the platform as much as possible, which is part of the business model, right? That's what these companies are trying to do with their users. It makes sense that if you take those features away, that could pose, as you said, an existential threat to the entire business model.
That's right. It's important to keep in mind that the business model is advertising. What really fuels advertising revenue is engagement. Engagement is at the heart of this, and these tools are meant to keep people more engaged. You can see why these trials are really so potentially damaging for these companies. That's why we've seen two companies, Snap and TikTok, settle the very first case with KGM. We don't know the terms of those settlements, but Meta and YouTube are still scheduled to go to trial as defendants in KGM's lawsuit and appear very determined to continue to take this to trial.
We'll be right back. Cecilia, if these lawsuits are so existential, potentially, for some of these social media companies, why would some of them not settle the way that TikTok and Snap did with that first case? Presumably, the money that they would have to pay to settle is nothing compared with having to alter an entire business model, right? Why even take the risk and go to trial?
Well, there are many trials that are scheduled, first of all. Even though two companies were able to settle with KGM in this first case, there are numerous more in the state court as well as in federal court going forward. The other thing to keep in mind is that the companies, especially Meta and YouTube, really feel strongly that they have a good case on their side, and they will bring up speech protections. Like you mentioned, Rachel, they're going to say that there is a law known as Section 230 of the Communications Decency Act that shields internet companies from liability for the content they host. Because Section 230 has been interpreted so broadly and used so strongly in their favor in so many different instances, they're feeling pretty confident that they can rely on that legal shield once again. In addition, they reject the idea that social media can be linked to personal injury. The companies' lawyers are expected to argue that there are many factors that go into mental health issues. They're going to say that it's multifactorial. It could be school problems, stress with friends. There could be all kinds of factors that lead to anxiety, depression, and other mental health disorders, and not social media alone.
The causal link does, in fairness, feel like something worth grappling with, right? Because how do you distinguish the impact, for example, of social media from a culture that promotes certain beauty standards and certain body types, right? Is it actually possible to isolate and prove causation back to a specific social media platform?
What the plaintiffs' lawyers are going to try to do is to, again, draw from all the internal documents they've collected, and they will try to show how the companies pushed to increase engagement and to make their products sticky and even addictive. But ultimately, it comes down to a jury in these California cases. Juries will decide the subsequent cases as well. That might be favorable for the plaintiffs because everyone has a story about social media. We know, for example, that the majority of American parents see social media as a problem, and yet the companies have so far escaped scrutiny.
Cecilia, if this does end up being social media's big tobacco moment, and they lose these cases in court, and a jury decides that this is, in fact, an addictive product, that means that we have an entire generation of kids who are now addicted. I wonder, we've been talking this whole conversation a lot about what happens to the social media companies, but what happens to these children who have essentially been the guinea pigs for this massive social experiment?
Remember decades ago when the trials began against big tobacco, it seemed crazy and really far-fetched to accuse the companies of creating an addictive and harmful product. But they did. With social media, with all of these young people who have been blamed for years for being unable to regulate their use of these social media apps, the conversation might change. The blame could lie in a different place with the social media companies. Now, that won't take back the experiences of so many young people who say they've been harmed by these social media platforms, but it could profoundly change the conversation in our society.
Cecilia Kang, thank you so much for your time.
Thanks for having me, Rachel.
We'll be right back. Here's what else you need to know today. On Wednesday, the Federal Reserve voted to keep interest rates at their current levels despite enormous pressure from President Trump to cut rates. Two Fed governors, both appointed by President Trump, cast dissenting votes. But Fed chairman Jerome Powell continues to reject Trump's demands for a rate cut, even after the administration opened an unusual criminal investigation this month into Powell's conduct. And...
Our founders debated extensively over which branch of government should have the power to declare or initiate war. Virtually unanimously, they decided, and what was entered into the Constitution was that the declaration or initiation of war would be the power of Congress.
In a series of pointed exchanges on Wednesday, senators of both parties, including Republican Rand Paul of Kentucky, pressed Secretary of State Marco Rubio to explain why neither he nor President Trump consulted with Congress before sending US troops into Venezuela to arrest and remove the country's president.
So I would ask you if a foreign country bombed our air defense missiles, captured and removed our President, and blockaded our country, would that be considered an act of war? Would it be an act of war?
We just don't believe that this operation comes anywhere close to the constitutional definition of war.
But would it be an act of war if someone did it to us? Of course, it would be an act of war.
During the hearing, Rubio refused to rule out future US military action in Venezuela, but said that President Trump has no desire to send American troops back to the country.
Today's episode was produced by Rochelle Bonja and Shannon Lynn.
It was edited by Lexie Diao and Michael Benoist, contains music by Rowan Niemisto and Dan Powell, and was engineered by Chris Wood. That's it for The Daily. I'm Rachel Abrams. See you tomorrow.
For years, social media companies have relied on an impenetrable First Amendment protection to shield them from legal claims that their products are dangerous to children. But now, a cluster of plaintiffs is trying a different tack. Cecilia Kang, who covers technology, explains why these new lawsuits pose an existential threat to social media giants, and how those companies are likely to defend themselves.

Guest: Cecilia Kang, a reporter covering technology and regulatory policy for The New York Times.

Background reading: Here's what to know about the social media addiction trials. TikTok reached an agreement to settle a lawsuit, avoiding the first in a series of landmark trials.

Photo: David Gray/Agence France-Presse — Getty Images

For more information on today's episode, visit nytimes.com/thedaily. Transcripts of each episode will be made available by the next workday.