
Meta-owned Instagram will soon alert parents if their teenage child uses the app to search for content related to suicide or self-harm, the technology company’s latest effort to shore up safety features as it faces heat over how social media impacts young people.
Meta said that, starting next week, parents who use Instagram's supervision tools will get a message — via email, text or WhatsApp, as well as through an in-app notification — if a teen repeatedly searches for certain terms related to self-harm or suicide within a short time span.
The company said the message will inform parents that their teen repeatedly searched for suicide or self-harm content and will offer resources on how to approach sensitive conversations about mental health.
“The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to