Meta said it will limit content for teenagers due to growing concerns about negative impacts on children.

Meta said it will default teenage Facebook and Instagram users to the most restrictive content settings.


Facing growing allegations that its platforms are addictive and harmful to the mental health of younger users, Meta said on Tuesday that it will restrict the kinds of content teenagers on Facebook and Instagram can see.

The additional safeguards are intended “to give teens more age-appropriate experiences on our apps,” Meta said in a blog post. The changes will prompt teen users to update their Instagram privacy settings, block them from searching for certain topics, and place them on the most restrictive content settings by default.

Meta said it expects to complete the rollout in the coming weeks, hiding from users under the age of 18 “content that discusses struggles with eating disorders and self-harm, or that includes restricted goods or nudity,” even when it is posted by an account they follow.

The change follows an announcement in October by a bipartisan coalition of 42 attorneys general that they intend to sue Meta, alleging that its products harm teenagers and exacerbate mental health problems such as eating disorders and body dysmorphia.

Attorney General Letitia James of New York said in a statement announcing the lawsuits that “social media companies like Meta are to blame for the record levels of poor mental health that kids and teenagers are suffering from.” “Meta has benefited from the suffering of children by purposefully incorporating manipulative elements into its platforms, which cause children to become dependent on them and lose confidence in themselves.”

Arturo Bejar, a Meta whistleblower, testified before a Senate panel in November, telling senators that the company knew about the risks its products posed to minors but did not take adequate steps to address them.

The company has faced similar criticism since 2021, when it was still known as Facebook, before it rebranded as Meta. That September, an explosive Wall Street Journal investigation based on internal documents leaked by the whistleblower Frances Haugen revealed that Facebook’s own research had repeatedly found that Instagram was harming large numbers of teenagers. Haugen later testified at a Senate hearing that Facebook routinely prioritizes its profits over the safety and well-being of its users, in part because of algorithms that steer people toward highly engaging content.

In response to the outcry, Facebook halted development of a version of Instagram for kids, which was intended for children between the ages of 10 and 12. The company has not offered an update on those plans since.

In a blog post on Tuesday, Meta stated that it regularly confers “with experts in adolescent development, psychology, and mental health to help make our platforms safe and age-appropriate for young people, including improving our understanding of which types of content may be less appropriate for teens.” The company did not specify what led to the most recent policy change.
