The parents of a young woman afflicted with a severe eating disorder are blaming Instagram in a lawsuit. Instead of blaming the content itself, however, they are aiming their accusations at the platform’s algorithms in an attempt to sidestep the protections of Section 230.
The lawsuit filed in California federal court alleges Instagram’s parent company Meta purposely crafted products to addict young users, steering one 11-year-old girl down a years-long path of physical and psychological harm.
The case, brought on behalf of now-19-year-old Alexis Spence, asserts that Instagram “consistently and knowingly” targeted its product at young children while ignoring internal warnings about its worsening effects on the mental health of its users.
The parents argue that because of Alexis’ prolonged addiction to Instagram, she has required extensive counselling, outpatient consultations, overnight hospital stays, and eating disorder programs, and will likely need a service dog and continual mental health monitoring for the rest of her life.
The lawsuit directly cites the internal Facebook documents leaked in 2021, which included confidential reports and presentations portraying Instagram as a blight on the mental health of adolescents.
The suit is also the most recent attempt to find a way around the liability shield given to website owners and operators under Section 230 of the Communications Decency Act, passed in 1996. It effectively means that the owners of a website where third parties can post content are not responsible or liable for the effects of that content.
The online world is an entirely different landscape now than it was when that law was passed, and one could argue that the huge tech and social media companies have benefitted from that bulletproof law at considerable cost to not just their users, but to their moderators as well.
“There’s a concerted effort across the country to re-frame lawsuits against internet services as attacking their software tools, not attacking the content that’s published using them,” Eric Goldman, a law professor at Santa Clara University, told Gizmodo.
This is, of course, not the first lawsuit brought against Meta since the report was leaked. The sad stories of teens taking their own lives have filled the news, with distraught parents lashing out at the social media giant and begging it to take more responsibility towards its younger users. Harmful messaging on social media has even become the target of awareness campaigns designed to counter it.
If successful, it may become a landmark case, because the lawyers are targeting Instagram’s algorithm rather than the specific content consumed. It has been reported for some time that the algorithm skews towards serving viewers more of the same content and is deliberately designed to be psychologically addictive.
“Again, you’re talking about the algorithm and the way that the complaint may be framed is really more about the overall service, that everything about the service was designed to encourage usage and that encouraged amount of usage is what caused the problem,” Goldman says. “It doesn’t mean they’ll win, but they may have found a way to get around Section 230.”
For parents, it’s difficult to always keep an eye on what your teens are watching, and it’s far more difficult now that so many have their own devices with multiple channels available. Social media is new enough that for some it can be difficult to grasp how seriously it can affect young, pliable minds.
I was 26 years old when Facebook started, already an adult. I cannot fathom growing up in today’s online environment, or the distinct challenges it poses. The mind of a 10-year-old, or even a 19-year-old, is still developing and far more susceptible to manipulation than the adult brain. Parents cannot monitor their children’s every activity 24/7. It is time the big tech companies stepped up and were held accountable as well.