
Instagram to alert parents if teens search for suicide or self-harm content
Newsy
Meta, the parent company of the social media platform Instagram, announced Thursday that it will soon begin notifying parents if their teenage children repeatedly attempt to search for suicide or self-harm content over a short period of time.
The Silicon Valley-based company said the move adds another safeguard to Instagram’s Teen Accounts and parental supervision features. Meta said the feature will begin rolling out "in the coming weeks" and will include expert resources to help parents have potentially sensitive conversations with their children.
RELATED STORY | Study warns about significant mental health risks of giving smartphones to pre-teens
"The vast majority of teens do not try to search for suicide and self-harm content on Instagram, and when they do, our policy is to block these searches, instead directing them to resources and helplines that can offer support," Meta said in a statement. "These alerts are designed to make sure parents are aware if their teen is repeatedly trying to search for this content, and to give them the resources they need to support their teen."
How it will work
Beginning next week, parents and teens enrolled in supervision will be notified about the new alerts. Instagram says searches that would trigger an alert include phrases promoting suicide or self-harm, statements suggesting a teen wants to harm themselves, and terms like “suicide” or “self-harm.”
