Long Island family sues Meta for ‘harming’ daughter through Instagram use
A family from Long Island is suing Meta for “harming” their teenage daughter through her use of Instagram.
Alexis Spence, now 20, of Yaphank, started using Instagram on her phone when she was 11 years old.
Spence said she created an account so she could play with Webkinz, stuffed animals that have online counterparts in games.
“It was cute and innocent,” Spence told reporters.
Spence said she started clicking on fitness pictures and pictures of models. She said Instagram began "showing me pictures of young people struggling with eating disorders and their bodies."
She said Instagram suggested she follow users like “skin_and_bones” and “apple core anorexic.”
Spence said she was able to join chat groups in which people had to log their calories, weigh in every day and post pictures of their weight on a scale.
“If you went over, everyone in the group would, like, bully you,” she said. “It was awful.”
Spence said some of the chats took place on the Instagram platform itself, while others happened in separate apps recommended by Instagram users.
Spence said she did what many young people do: she made a separate Instagram account using a generic email address so she could keep track of the posts and pictures she did not want her parents to see. She even hid the Instagram app inside the calculator app on her iPhone.
“My mental health was really, really struggling,” Spence said.
Spence's parents said they put all the parental controls and protections on Alexis' phone, but she found ways around them. When they hid the phone from her, she would wait until they were asleep and find it.
"As much as I would do my research on how to protect her, she was 10 steps ahead of me on how to evade us," Kathleen Spence, Spence's mother, said. "We did everything that we should have done as parents, but we were fighting a computer algorithm. We were outnumbered, we were outpowered."
By the time she was 15, Spence said, she had developed several eating disorders and was struggling with her mental health.
“My parents were like, we’re taking the phone, you need to go to the hospital,” Spence recalled.
Spence spent about two weeks in a psychiatric ward.
“After 14 days of having no device, when I was allowed to see my daughter for the first time, she was a completely different person,” recounted Jeffrey Spence, Spence’s father.
Spence's parents said they had long suspected that her struggles stemmed from her social media use. But, they said, when Facebook whistleblower Frances Haugen came forward in 2021 with evidence that Facebook knew Instagram was toxic for teenage girls, they put two and two together.
“Social media is the silent killer of our children’s generation,” Kathleen Spence said.
The Spences are now among more than 1,000 families across the United States and Canada suing the social media giants for causing personal injuries to their loved ones.
The families are represented by the Social Media Victims Law Center based in Seattle.
“Parents are saying enough is enough,” Matthew Bergman, founder of the law center, said. “These products have to be held accountable.”
Bergman said he knows the lawsuits face an uphill battle.
"Currently, these social media platforms operate in this ether land where they don't have any of the responsibility that any other responsible company in America does," he said. "All we're saying is that social media companies should operate under the same rules."
Last week, the Seattle Public School District announced it is filing a lawsuit against the social media giants, including Meta, for “creating a youth mental health crisis.”
"Until these companies have to bear the financial consequences of their unsafe product design, they're unlikely to want to change their behavior," Bergman said. "Currently, the costs of these dangerous products, as you saw in Alexis' case, are not being borne by the platform – they're being borne by Alexis' parents, insurance companies, clergy, teachers, everybody other than the platform."
Antigone Davis, Global Head of Safety at Meta, told reporters Meta has removed nearly all Instagram hashtags and users that promote self-harm, suicide or eating disorders.
"We don't allow content that promotes suicide, self-harm or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it's reported to us. We'll continue to work closely with experts, policymakers and parents on these important issues," she said.
Meta said it has developed various tools to help teens stay safe on Instagram, including age verification through a video selfie, nudges that prompt teens to look at a different topic if they have been scrolling on one for too long, and supervision features that let parents see how long their child has been on Instagram and whom they are following.
The company also said that when anyone under 18 signs up for Instagram, the platform defaults to its most restrictive settings.
Meta also provided instructional videos to assist parents and teens in staying safe on Instagram.
Spence said she no longer uses Instagram and she sets time limits for herself on other social media platforms.
Because she suffers from anxiety, Spence has a therapy dog who assists her. He's trained to interrupt anxious behaviors, such as when Spence taps her arm or leg.
“I’m not allowed to go into the bathroom by myself, so if I go into the bathroom, he scratches the door and barks,” she said.
Spence said she hopes to be able to speak in front of Congress about the dangers of social media.
“When are our politicians going to stand up?” Kathleen Spence asked.
On Thursday, Instagram announced new features to help people – especially teens – manage their nighttime use of the platform. Users can put their accounts in "Quiet Mode," which silences notifications and lets others know the user is unavailable. While all users will be able to opt in to Quiet Mode, teens will be proactively prompted to turn it on when they spend time on Instagram at night.