Meta, the parent company of Facebook and Instagram, has announced plans to show teenagers only age-appropriate content on both platforms.

In a recent blog post, Meta said teenagers will be placed into “the most restrictive” content control settings on Instagram and Facebook.

The social media giant added that it will hide certain types of content and restrict additional search terms in its ‘Search’ and ‘Explore’ features.

It said content relating to suicide, self-harm, and eating disorders will become harder to find and will not be recommended.

Meta said the changes are intended to make the platforms safer and more age-appropriate for young people.

This, it said, is in line with the guidance of experts in adolescent development, psychology, and mental health.

“We want teens to have safe, age-appropriate experiences on our apps,” Meta said.

“We have developed more than 30 tools and resources to support teens and their parents, and we have spent over a decade developing policies and technology to address content that breaks our rules or could be seen as sensitive.

“We are announcing additional protections that are focused on the types of content teens see on Instagram and Facebook.

“Now, when people search for terms related to suicide, self-harm and eating disorders, we will start hiding these related results and will direct them to expert resources for help.”
