Meta Looks to Appease Parents
The Big Tech giant rolls out new harmful content restrictions for teens as it faces a massive lawsuit from more than 40 states.
Social media is not good for kids. This is not a news flash to anyone who has paid attention to its development over the last 15 years.
Indeed, Big Tech has been aware of this reality for years, but Meta, the parent company of Instagram and Facebook, is only now taking action.
Meta is rolling out new restrictions on teen accounts intended to keep certain harmful and damaging content from reaching users under age 18.
According to Meta, the new restrictions will automatically filter certain content out of a teen's feed, specifically videos and posts that depict or discuss self-harm, graphic violence, and eating disorders. The restrictions will be applied automatically to existing and new accounts based on the user's birth date, and, unlike in the past, they cannot be removed before the user's 18th birthday.
It would appear that Meta has been motivated to act only now because more than 40 states are suing the Big Tech company, alleging that it knew about social media's negative impact on minors and yet publicly denied it.
When the bipartisan suit was filed last October, led by attorneys general from Colorado and Tennessee, Meta initially responded by arguing that it had been investigating the problem and seeking to work with states to improve its platforms for young people. As a Meta spokesman stated, “We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”
However, according to the lawsuit, Meta had long known that its platforms were especially toxic for teen girls.
The issue again boils down to questions surrounding Section 230 of the Communications Decency Act of 1996 and its protections for social media platforms. We have long argued that Big Tech has abused this provision, which is designed to shield a platform provider from liability for content its users post. Meta, however, has sought to have it both ways by creating its own acceptable-content rules, which it uses to censor viewpoints with which it disagrees.
Our humble little publication has repeatedly suffered under Facebook's arbitrary, opaque, and constantly evolving content moderation standards. The vast majority of our 750,000 followers are simply not shown most of our content. Facebook wants to play the role of publisher when it wishes to suppress and censor speech it finds objectionable, most often conservative speech.
The interesting thing here is that by caving and imposing new age restrictions on its site, Meta is effectively admitting that it is indeed a publisher rather than a mere open platform. That admission would presumably undermine the Section 230 protections Meta has been hiding behind.
How this lawsuit is eventually decided should have a significant impact on the social media world in general. While preventing children and teenagers from accessing harmful content on social media is a desirable goal, there are inherent problems at play, specifically regarding the role of parents and the implications for free speech.
Congress needs to intervene on this issue, both to aid parents and to protect free speech. The current problem is that Big Tech is failing on both of these fronts.
- Tags:
- parents
- free speech
- Big Tech