At 1.45am, a child somewhere in the UK opened an anonymous chatbot on their phone. We don’t know their name. We never will. That is the point.

They typed into it something they could not bring themselves to say to any of us – their parents, teachers, friends. They used Quinly, the safeguarding tool I built. By morning, they had been signposted to their school’s safeguarding lead, to Childline, and to Papyrus. We know this because the conversation log was deleted the moment the tab closed. That is how the product is designed.

If you are a parent reading that and feeling a weight in your chest, please hear this: it is not your failure if your child doesn’t tell you first. Children don’t hide things from their parents because they love us less. They hide them because they love us, and they are terrified of the look on our faces when they reveal what’s wrong.

The scale of what our children are hiding should frighten anyone – from sextortion to substance misuse, eating disorders, bereavement, self-harm, sexual abuse and bullying.

UK police recorded almost 37,000 child sexual abuse image crimes in 2025 – an 8% rise on the year before, with more than 40% of identifiable offences on Snapchat. Sextortion reports involving under-18s rose 34% in a single year; boys aged 14 to 17 now account for 98% of victims. The Internet Watch Foundation identified 3,443 AI-generated child sexual abuse videos in 2025 – a 26,385% rise on the previous year – with 65% classified Category A, the most severe classification in UK law. A deepfake of a real, named child can now be generated from 20 of their existing photos in 15 minutes.

And where do our children turn? Almost one in five British teenagers say they have no one they can talk to. CAMHS has more than 385,000 children waiting for a first contact. Childline is remarkable, but it cannot answer every call.

So, when you read that Technology Minister Liz Kendall is considering banning all AI chatbots for under-16s, you might reasonably think: good. She is right to worry. Companion chatbots engineered to simulate friendship, pretend to be a girlfriend, or keep lonely children online until 4am for ad revenue – those products are dangerous, and parents are right to want them gone.

But that is not all AI chatbots are.

I built Quinly because the child who cannot say it at the dinner table is often the same child who cannot sit through a phone queue. Quinly is anonymous. No account. No email. No stored conversation. When the tab closes, the record is gone. It does not pretend to be your child’s friend. It does not keep them online. It signposts every user to organisations such as Childline, Samaritans and Papyrus, and to the school’s own Designated Safeguarding Lead. It is built to the Department for Education’s generative AI product safety standards.

Six UK schools are using it. In six months, it has handled more than 4,400 anonymous conversations. The peak hour is not the school day. It is late evening into night.

The government is currently consulting on potential age restrictions for social media and other services, including gaming sites and AI chatbots. If the consultation draws no line between companion chatbots and safeguarding tools, Quinly will be banned alongside the products it was designed to be nothing like.

The consultation closes on 26 May 2026. If you are a parent, please read it, please respond, and please write to your MP.
Tell them British children need protection from predatory design, and that they also need the tools already catching children at 1.45am, when no one else is there, to be protected. We can have both, and we must.

Ruth Sparkes is the founder of Quinly.

Help and support: Childline - free and confidential support for young people in the UK - 0800 1111