(Bloomberg) — Meta Platforms Inc. and Snap Inc. are to blame for the suicide of an 11-year-old who suffered from depression and sleep deprivation after becoming addicted to Instagram and Snapchat, the girl’s mother alleged in a lawsuit.
The woman claims her daughter Selena Rodriguez struggled for two years with an “extreme addiction” to Meta’s photo-sharing platform and Snap’s messaging app before taking her life last year. Instagram and Snapchat have deliberately designed algorithms that keep teens hooked on their platforms and “promote problematic and excessive use that they know is indicative of addictive and self-destructive use,” according to the lawsuit in San Francisco federal court.
The complaint appears to be the first of its kind against Meta, formerly known as Facebook Inc., said attorney Matthew Bergman, who founded Social Media Victims Law Center in Seattle and represents Rodriguez’s mother.
“There is a mental health epidemic among American teens,” Bergman said. He added that he anticipates a significant number of similar cases will be filed after a former Facebook employee turned whistle-blower testified in Congress in October that the company knew about, but didn’t disclose, the harmful impacts of services like Instagram.
In November, a group of U.S. state attorneys general announced an investigation of Instagram over its efforts to draw children and young adults, taking aim at the risks the social network may pose to their mental health and well-being.
The backlash against social media isn’t limited to the U.S. The father of a 14-year-old in the U.K. touched off a firestorm when he blamed her 2017 suicide partly on Instagram. The company told the BBC that it doesn’t allow content that promotes self-harm.
“We are devastated to hear of Selena’s passing and our hearts go out to her family,” a Snap spokesperson said Friday in an emailed statement. “While we can’t comment on the specifics of active litigation, nothing is more important to us than the wellbeing of our community.”
Meta and Snap knew or should have known that “their social media products were harmful to a significant percentage of their minor users,” according to Thursday’s lawsuit.
Meta representatives didn’t respond to an email seeking comment.
A Meta spokesperson said in November that allegations the company puts profit over safety are false and that “we continue to build new features to help people who might be dealing with negative social comparisons or body image issues.”
Snap said in May it was suspending projects with two app makers “out of an abundance of caution for the safety of the Snapchat community” in light of a wrongful-death and class-action suit filed in California that accused the companies of failing to enforce their own policies against cyber-bullying. That case was brought by the mother of a 16-year-old boy who killed himself in 2020.
Tammy Rodriguez, who lives in Connecticut, said that when she tried to limit her daughter’s access to the platforms, the girl ran away from home. She took her daughter to a therapist who said “she had never seen a patient as addicted to social media as Selena,” according to the suit.
The lawsuit levels its harshest criticism at Snapchat, saying the platform rewards users in “excessive and dangerous ways” for engagement. The mother brings claims of product defect, negligence and violations of California’s consumer protection law.
In a separate complaint that Bergman filed Thursday, a mother in Oregon blames Meta and Snap for her 15-year-old daughter developing “numerous mental health conditions including multiple inpatient psychiatric admissions, an eating disorder, self-harm, and physically and mentally abusive behaviors toward her mother and siblings.”
Social media companies have been largely successful in fending off lawsuits blaming them for personal injuries, thanks to a 1996 federal law that shields internet platforms from liability for what users post online.
“Snapchat helps people communicate with their real friends, without some of the public pressure and social comparison features of traditional social media platforms, and intentionally makes it hard for strangers to contact young people,” the Snap spokesperson said. “We work closely with many mental health organizations to provide in-app tools and resources for Snapchatters as part of our ongoing work to keep our community safe.”
The case is Rodriguez v. Meta Platforms Inc. f/k/a Facebook Inc., 3:22-cv-00401, U.S. District Court, Northern District of California (San Francisco).