
Meta is updating its Facebook and Instagram apps to offer teens more online protection. The company said the updates will add to its existing list of more than 30 parental monitoring tools aimed at protecting young users.
The social networks will now hide search results related to suicide, self-injury and eating disorders. When users search for this content in the apps, the platforms will instead direct them to resources from organizations such as the National Alliance on Mental Illness, according to Meta. The company noted that it already excludes this type of content from recommendations in “Reels” and “Explore,” and that the new changes will also prevent teens from seeing it in “Feed” and “Stories,” even when it is shared by someone they follow.
The company launched Instagram’s “Sensitive Content Control” and Facebook’s “Reduce” setting in 2021 for new teen users only; now it will automatically place all teens, both current and new users, into the most restrictive content control setting.
Other updates Meta previously rolled out for teens include the option to disable location sharing and a privacy setting for users under 16 that lets them hide their friends list and the posts they are tagged in, as well as choose who can comment on their public posts.
The changes come as Meta faces mounting criticism over teen protections on its platforms. Parents have sued the company multiple times, alleging that their children were bullied on its social networks and developed eating disorders because of them. In October, 33 U.S. state attorneys general also sued Meta over features they say are addictive and designed to hook kids and teens.
Photo via Unsplash