Social Media Trap Gets Marched to the Courthouse
Meta and Google must now prove that their social media apps aren't addictive and detrimental to adolescent mental health.
The first bellwether trial is underway over Big Tech's role in creating apps so addictive that they're hurting the mental health of America's youth.
On Monday in Los Angeles, the trial involving Meta and Google began. One plaintiff, referred to as “KGM,” is a 20-year-old who believes that social media addiction led to her psychological issues, including depression, body dysmorphic disorder, and self-harm. She began using YouTube, owned by Google, at the tender age of six and had access to Instagram, owned by Meta, at age nine. According to the Associated Press, “Before she graduated elementary school, she had posted 284 videos on YouTube.”
Meta CEO Mark Zuckerberg will personally testify on February 18, and the parents of children who have died because of their addiction to social media are not going to make it easy for the tech billionaire. They have already lined the courthouse steps with pictures of their dead children, making their pain unavoidable.
Specifically, the jury will decide whether Meta and Google are liable for the deterioration of KGM's mental health due to the addictive nature of their platforms.
There are several problems with the plaintiffs' line of thinking. "The first problem with these cases is that Section 230 of the 1996 Communications Decency Act says internet platforms can't be held liable for user-generated content," The Wall Street Journal editorial board points out. Social media companies love to hide behind Section 230 to justify censorship (which, by the way, is itself an admission that social media content shapes behavior).
Furthermore, nothing definitively proves that social media features (e.g., scrolling through short-form videos, auto-play, or the "like" button) cause mental illness in teens. Researchers have found that adolescents struggle with mental health issues regardless of social media use. What is undeniable, though, is that the algorithms often lead people down terrible rabbit holes.
Instagram, in particular, is notorious for creating the conditions for pedophiles to wreak havoc on its platform. In a chilling 2023 exposé, The Wall Street Journal uncovered yet another pedophile ring using Instagram to curate content and arrange meet-ups with exploited children. As I noted then:
According to the Journal, Meta accounts for 85% of the child pornography reported to the National Center for Missing & Exploited Children, a nonprofit that helps identify and take down such images of children.
Meta has sophisticated algorithmic tools in place to track and stop this explicit, exploitative, and illegal activity on Instagram. The algorithms have, in fact, previously taken down 27 similar pedophile networks and actively suppress hashtags that are linked to pedophilic content. However, three WSJ journalists were able to tap into this evil filth with relative ease.
That network literally had menus of content available for viewing and purchase. It even offered "meet-ups" with some of these exploited kids. Thanks to Instagram, this content is basically out in the open, easy to stumble upon, and continually added to users' feeds. But it's even worse than that. As Not the Bee points out, "Instagram's algorithm also curated pedophilic content and connected people with similar interests, essentially facilitating the sharing of child sexual abuse material."
On the other hand, “These trials serve as a critical inflection point,” New York-based trial attorney James Rabinowitz explained to National Review. “Section 230 will not save the huge tech companies that we love and hate, because the plaintiffs are not suing over what users posted. They are suing over how the product itself was engineered. That is product liability.”
Many have compared these cases to lawsuits against tobacco companies in the 1990s. However, that comparison leaves out an important element of this case: this young woman was given access to social media at an extremely young age. Parents can sue the companies for making their product too addictive, but couldn't the defendants simply countersue for parental neglect? Why was KGM allowed to have access to YouTube at age six or Instagram at age nine? What parent in their right mind believes that's a good idea?
It's an interesting case, and the result will guide the rest of the pending cases nationwide. Should jurors rule in favor of the young woman, social media's Section 230 protections are well and truly gone, and the companies as we know them will likely be sued into the ground. But if the verdict goes in favor of Big Tech, adolescent social media use will remain a problem, and the child "protections" companies like Meta have put in place will prove little more than useless.
Ultimately, Congress should repeal Section 230 and even consider a social media ban for children similar to Australia’s. Moreover, parents should be held accountable for their children’s social media use. Either way, something’s got to give vis-à-vis this social media trap.