
Seattle, Kent schools' lawsuits against social media giants have a 'scientific basis,' expert says


Seattle Public Schools and the Kent School District are suing social media companies for the role their platforms have played in a youth mental health crisis. The lawsuits claim Facebook, Instagram, Snapchat, TikTok and YouTube have violated Washington's public nuisance law and exploited students.

The districts seem to be the first in the nation to take on such a case.

Dr. Dimitri Christakis, director of the Center for Child Health Behavior and Development at Seattle Children's Research Institute, says the scientific basis of the complaints is sound.

"I can't speak to the legal grounds for the suit," Christakis says. "But many of us who have been researching the effects of media on children have been concerned for a long time about the negative effects on children's emotional and mental health, as well as other developmental outcomes."

RELATED: Washington superintendent has an eye on Seattle schools' social media lawsuit

Like the school districts, Christakis argues the companies have known the harm they cause but have chosen profits over public health.

Now, the districts want compensation to cover the costs of addressing mental health in schools.

Christakis says that makes sense when you consider the role schools play in addressing students' mental health needs.

"If you look at spikes in emergency department visits, for example, for mental health crises, there's a very clear, seasonal distribution with spikes in the fall and spikes in the early winter in January," he says. "Not coincidentally, that represents a return to school from summer vacation and a return to school from winter break.

"It's perfectly reasonable to suggest that [schools] need additional resources, not just to treat these children but to identify them early and perhaps even to put preventive strategies into place."

The social media platforms and their parent companies have broadly argued that they have safety measures in place for young people using their apps.

Christakis acknowledges the districts' lawsuits address only part of the problem, intervening after the alleged harm to students has already been done.

He says families should try to be as aware of how their kids spend their time in the virtual world as they are of how they spend it in the real world. But even that approach has limitations.

RELATED: Could the U.S. ban TikTok for everyone?

"It's very difficult for a parent to know what their child is looking at online," Christakis says. "But Instagram knows it, Facebook knows it, TikTok knows it. And instead of alerting parents or caregivers or teachers or pediatricians that, hey, this child is looking at a lot of suicide videos, they just feed that child more of the same content."

Still, the school districts will have a big obstacle to clear in their lawsuits against major social media networks.

A provision of the federal Communications Decency Act, known as Section 230, currently protects internet companies from legal liability for what users post on their sites.

"The danger would be trying to attribute to Facebook the bad content that people post there," says Ryan Calo, a law professor at the University of Washington who specializes in technology law. "That's going to be a significant hurdle, even to a nuisance lawsuit, because this is a federal law."

Calo says the school districts will have to show that their nuisance claim can overcome that federal law.

However, a separate case currently before the United States Supreme Court could work in the school districts' favor.

RELATED: Could now be the time to consider a post-social media future?

That case concerns whether internet companies can be held liable for matching users with harmful content, including recruitment ads from extremist organizations. The plaintiffs argue that recommending extremist videos to users violates anti-terrorism laws.

"The basic theory is the companies shouldn't get immunity for that process of matching vulnerable people with highly problematic content," Calo says. "And who knows what the Supreme Court will do."

That case will go before the high court next month.
