Meta said it will start hiding inappropriate content from teenagers' accounts on Instagram and Facebook, including posts about suicide, self-harm, and eating disorders, the Associated Press (AP) reported.
The social media giant, based in Menlo Park, California, said in a blog post that while it already aims not to recommend such “age-inappropriate” material to teens, it now also won't show it in their feeds, even if it is shared by an account they follow.
“We want teens to have safe, age-appropriate experiences on our apps,” Meta said.
Teen users — provided they did not lie about their age when they signed up for Instagram or Facebook — will also see their accounts placed on the most restrictive settings on the platforms, and they will be blocked from searching for terms that might be harmful.
“Take the example of someone posting about their ongoing struggle with thoughts of self-harm. This is an important story and can help destigmatize these issues, but it’s a complex topic and isn’t necessarily suitable for all young people,” Meta said.
"Now, we’ll start to remove this type of content from teens’ experiences on Instagram and Facebook, as well as other types of age-inappropriate content."
Meta faces lawsuits from dozens of US states that accuse it of harming young people and contributing to the youth mental health crisis by knowingly and deliberately designing features on Instagram and Facebook that addict children to the platforms.