Instagram will begin notifying parents when their teenage children repeatedly search for content related to suicide or self-harm, as parent company Meta intensifies efforts to address growing concerns about youth safety on social media platforms.
The new feature, announced on Thursday, will roll out in the United States, Britain, Australia and Canada in the coming weeks, with plans to expand to other regions later in 2026. Alerts will be triggered when a teenager makes multiple searches for suicide- or self-harm-related terms within a short period, signalling potential distress.
Parents who use Instagram’s parental supervision tools will receive notifications through email, text messages or WhatsApp, as well as directly within the app. Alongside the alerts, Instagram will provide access to expert resources designed to help parents initiate sensitive and supportive conversations with their children.
Instagram already restricts searches linked to suicide and self-harm by blocking such terms and redirecting users to helplines and support organisations. According to the company, the new alert system is meant to identify cases where teens persistently attempt to access such content despite existing safeguards.
Meta said it worked closely with its Suicide and Self Harm Advisory Group to determine when alerts should be sent. The company explained that it intentionally set a cautious threshold, even if that means some parents may receive notifications in situations that do not ultimately indicate serious risk.
The announcement comes amid increasing legal and regulatory scrutiny over the impact of social media on young users. Earlier this month, Meta Chief Executive Officer Mark Zuckerberg testified in a landmark trial in California involving allegations that major technology companies knowingly contributed to addictive behaviours among minors. The case marks the first time such claims have been presented before a jury.
Beyond the United States, Meta is also navigating a broader global push to limit children’s access to social media platforms. Australia has already introduced a ban on under-16s using social media, while countries such as France, Denmark, Spain and the United Kingdom are moving to implement similar restrictions.
With mounting pressure from regulators, courts and parents, the new parental alert system reflects Meta’s attempt to demonstrate stronger safeguards for teenage users while balancing concerns around privacy, mental health and online safety.