Meta has rolled out its first major update of 2024 to Facebook and Instagram, introducing new tools to protect young users. In a blog post, the parent company of both platforms said it is implementing new policies to safeguard young people.
Under the new policies, users under the age of 18 will find it harder to access sensitive content related to suicide, self-harm and eating disorders through features such as Search and Explore. The company said it will change the settings of all teen Facebook and Instagram accounts to restrict access to such content.
“We want young people to have access to safe and age-appropriate content in our apps, so we’re announcing additional protections for younger users,” the company said.
According to the blog post, if a young user follows an account that posts about sensitive topics, those posts will no longer appear in that user’s feed.
These measures will be implemented in the coming weeks.
The company is taking these steps at a time when it faces criticism from regulators in several countries.
Meta has been accused of causing a mental health crisis among the younger generation with its apps.
In October 2023, 33 US states filed a lawsuit against Meta, alleging that the company had misled the public about the dangers of its social media platforms.
The European Commission has also directed Meta to clarify what measures it is taking to protect children from illegal and harmful content.