


Instagram’s and Facebook’s features poisoned a generation of online children, more than 40 states claimed in lawsuits filed against the platforms’ parent company, Meta, on Tuesday.
The lawsuits allege that Meta knowingly fueled the youth mental health crisis with an algorithm that encourages kids to scroll mindlessly.
“Meta’s design strategy exploits [children’s] vulnerabilities: from a dopamine-inducing personalization algorithm that gives kids the same feeling as gambling, to consistent alerts that interfere with their schoolwork and sleep. Meta content capitalizes on children’s fear of missing out and urges them to constantly engage with the Platforms,” New Hampshire attorney general John M. Formella said. “Meta’s decision to do so despite its knowledge of significant links between excessive use of social media and increased instances of serious health problems such as depression, anxiety, and insomnia is unacceptable and unlawful. Not unlike Big Tobacco a generation ago, Meta has chosen profits over public health, particularly the health of the youngest among us.”
Thirty-three states filed the federal lawsuit in a California court on Tuesday, and other state attorneys general filed similar suits in state courts. The lawsuits seek changes to app features that would make Meta’s platforms safer for young users. The tech company, which has previously denied that its platforms harm young users, expressed its “disappointment” with the lawsuits.
“Instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company said.
A Meta whistleblower released documents in 2021 that provided evidence the company knew Instagram was addictive and harmful to youth mental health. Since then, the company has faced numerous lawsuits from school districts and states. Hundreds of school districts are currently suing Meta over concerns that its platforms encourage cyberbullying and distract students.
“It is very clear that decisions made by social media platforms, like Meta, are part of what is driving mental health harms, physical health harms, and threats that we can’t ignore,” Colorado attorney general Phil Weiser said.
To defend itself from the lawsuits, Meta will likely invoke Section 230 of the Communications Decency Act, a 1996 law that shields online platforms from liability for content posted by third parties. The apps’ “alleged defects,” Meta lawyers said in July, “are inescapably linked to the publication of third-party content.” Because the state lawsuits claim that Meta violates child safety laws, and do not take issue with specific pieces of published content, the state attorneys general hope that a Section 230 defense won’t stand up to scrutiny.
“I think that it’s a very close call as to whether Meta would succeed with a Section 230 defense,” Jeff Kosseff, a law professor and author of a book on Section 230, told NPR. “Courts are increasingly willing to conclude that Section 230 is not a defense in lawsuits arising from claims about product design, though the line is not always clear.”