Meta takes steps to address teen safety concerns by restricting content amid growing backlash

Meta, formerly known as Facebook, announced on Tuesday that it will implement new measures to limit the type of content visible to teenagers on Facebook and Instagram. The move comes in response to increasing claims that Meta’s products are addictive and have detrimental effects on the mental well-being of younger users. The company aims to provide age-appropriate experiences for teens on its platforms, addressing concerns raised by lawmakers, attorneys general, and mental health advocates.

In a blog post, Meta detailed the upcoming changes designed to give teenagers more age-appropriate experiences on Facebook and Instagram. These protective measures include defaulting teenage users to the most restrictive content settings, limiting their ability to search for certain topics, and prompting teens to update their Instagram privacy settings. The update, expected to be completed in the coming weeks, will prevent users under the age of 18 from seeing content related to self-harm, eating disorders, restricted goods, or nudity, even when it is shared by someone they follow.

The decision to implement these new protections follows legal challenges faced by Meta, including a lawsuit filed by a bipartisan group of 42 attorneys general in October. The lawsuit alleges that Meta’s products contribute to mental health problems, such as body dysmorphia and eating disorders, among teenagers. The attorneys general claim that Meta knowingly designed its platforms with manipulative features that contribute to addiction and lower self-esteem in young users.

Meta whistleblower Arturo Bejar, in Senate subcommittee testimony in November, said the company was aware of the harm its products cause to young users but failed to take adequate action to address the issues. These concerns have been ongoing, with similar complaints dating back to 2021, when Facebook, before rebranding as Meta, faced criticism following revelations from whistleblower Frances Haugen about Instagram's harmful impact on teenagers.

While Meta did not explicitly state the catalyst for the latest policy change, it emphasized in the blog post that the company regularly consults with experts in adolescent development, psychology, and mental health. The goal is to create safe and age-appropriate platforms for young people, acknowledging the need to address specific types of content that may be deemed inappropriate for teens.

The ongoing criticism and legal challenges have put Meta under heightened scrutiny, prompting the company to reassess its approach to content moderation and user safety. As the update to restrict content for teenagers is rolled out in the coming weeks, Meta will likely face continued scrutiny over the effectiveness of these measures and whether they sufficiently address the concerns surrounding the impact of social media on the mental well-being of young users.
