[Image: Boy on phone. Credit: Adrian Swancar on Unsplash]

Instagram has unveiled a new feature that will alert parents if their teenager repeatedly tries to search for terms related to suicide or self-harm. The feature is being rolled out in the coming weeks and will provide caregivers with information to help support their teen and talk to them about it.

Currently, if someone tries to search for suicide and self-harm content on Instagram, the social media platform's policy is to block these searches and direct the user to resources and helplines that can offer support.

How will the new alert work?

Now, in addition to blocking these searches, if someone using a Teen Account repeatedly tries to search for terms related to suicide or self-harm within a short period of time, their parent will receive a notification. The alerts will be sent via email, text, or WhatsApp, depending on the contact information available, as well as through an in-app notification.

Tapping on the notification will open a full-screen message explaining that their teen has repeatedly tried to search Instagram for terms associated with suicide or self-harm within a short period of time. Parents will also have the option to view expert resources designed to help them approach potentially sensitive conversations with their child.

Attempted searches that would prompt the alert include phrases promoting suicide or self-harm, phrases suggesting a teen wants to harm themselves, and the terms 'suicide' or 'self-harm' themselves.

These alerts will roll out next week to parents who use Instagram's parental supervision tools in the US, UK, Australia, and Canada, and will become available in other regions later this year.

Why is it needed?

The rollout comes one week before the release of the Channel 4 documentary Molly Vs The Machines, which revisits the death of 14-year-old Molly Russell, who took her own life in 2017 after months of seeing content relating to self-harm and suicide online.
The Standard notes that Molly had saved, liked and shared 16,300 pieces of content on Instagram in the six months leading up to her death; of these, 2,100 were about self-harm, depression and suicide. She had also searched for similar content on Pinterest. Both social media platforms now block this type of content from searches, and in cases where content encourages suicide, self-injury or eating disorders, it is removed.

In 2023, the Online Safety Act came into force, introducing a new set of laws to protect children and adults online. Under the act, social media companies and search services have a duty to protect users, especially young people. Platforms have to prevent children from accessing harmful and age-inappropriate content, and must provide parents and children with clear and accessible ways to report problems when they do arise. Companies that don't meet these requirements can be fined up to £18 million or 10% of their qualifying worldwide revenue, whichever is greater.

Vicki Shotbolt, CEO of Parent Zone, said of the latest announcement: "It's vital that parents have the information they need to support their teens.

"This is a really important step that should help give parents greater peace of mind – if their teen is actively trying to look for this type of harmful content on Instagram, they'll know about it."

Meta, which owns Instagram, said it is now working on building similar parental notifications for teens' conversations with AI.

Help and support:

Mind, open Monday to Friday, 9am-6pm, on 0300 123 3393.
Samaritans offers a listening service which is open 24 hours a day, on 116 123 (UK and ROI; this number is free to call and will not appear on your phone bill).
CALM (the Campaign Against Living Miserably) offers a helpline open 5pm-midnight, 365 days a year, on 0800 58 58 58, and a webchat service.
The Mix is a free support service for people under 25.
Call 0808 808 4994 or email help@themix.org.uk.
Rethink Mental Illness offers practical help through its advice line, which can be reached on 0808 801 0525 (Monday to Friday, 10am-4pm). More info can be found on rethink.org.

Related...

UK Social Media Ban For Under-16s: Parents And Experts Share Their Views
I'm A Therapist. I Don't Think Banning Social Media For Kids Is The Only Answer
Instagram's Now 'PG-13 Rated' For Teens – But What Does That Actually Mean?