‘I feel it’s a friend’: quarter of teenagers turn to AI chatbots for mental health support

Young people grappling with profound trauma and mental health challenges are finding solace in an unexpected quarter: artificial intelligence. A new study has shed light on the trend, revealing that a significant number of teenagers, particularly those affected by youth violence, are turning to AI chatbots for mental health support, often because of the perceived shortcomings of traditional services.

A Digital Lifeline for Traumatized Youth

The stark reality of youth violence leaves indelible scars, and for many young people, the path to healing is fraught with obstacles. Take Shan, an 18-year-old from Tottenham, whose world was shattered by the fatal shooting of one friend and the stabbing of another. Conventional mental health services, she found, simply didn't meet her urgent needs. Instead, she turned to an AI chatbot, affectionately calling it "chat" and even "bestie." For Shan, this digital confidante felt safer, less intimidating, and crucially, always available to help her navigate the overwhelming grief and trauma.

Shan's experience is far from isolated. Research commissioned by the Youth Endowment Fund (YEF) among more than 11,000 young people in England and Wales paints a compelling picture: a quarter of teenagers overall report using AI chatbots for mental health support, rising to approximately 40% among 13- to 17-year-olds affected by youth violence. The study further indicates that both victims and perpetrators of violence are markedly more inclined to use AI for such support than peers not involved in violent incidents.

The Unmet Demand: Why Young People Choose AI

The surge in AI chatbot usage among vulnerable youth isn't merely a technological fad; it's a symptom of a systemic issue within mental health provision. Several critical factors are driving this shift:

  • Accessibility and Availability: Traditional mental health services are often plagued by extensive waiting lists, sometimes stretching for months or even years. As one anonymous young person told The Guardian, "The current system is so broken for offering help for young people. Chatbots provide immediate answers. If you’re going to be on the waiting list for one to two years to get anything, or you can have an immediate answer within a few minutes… that’s where the desire to use AI comes from." Shan highlighted the 24/7 accessibility of AI, available with just "two clicks on her smartphone."
  • Perceived Privacy and Confidentiality: For young people, especially those involved in or affected by criminal activities, privacy is paramount. Shan felt the AI wouldn't disclose her conversations to teachers or parents, a concern stemming from past experiences where she believed her confidences were shared. Similarly, boys involved in gang activities found chatbots safer for seeking advice on alternative income streams, fearing that human adults might leak information to police or rival gangs, endangering them.
  • Reduced Intimidation and Judgment: Many young people find conventional therapy settings intimidating or feel judged by human professionals. Shan described her AI "friend" as "less intimidating, more private and less judgmental" than her experiences with NHS and charity mental health support. The ability to interact with an AI in a conversational, informal manner – as Shan does with "Hey bestie, I need some advice" – fosters a sense of comfort and non-judgmental acceptance.
  • Lack of Empathy in Traditional Services: The YEF findings suggest that chatbots are fulfilling a demand unmet by conventional services, which some young users perceive as lacking in empathy or understanding.

A Call for Human Connection: Expert Warnings

While the accessibility and perceived safety of AI chatbots offer a temporary balm, youth leaders are sounding the alarm. Jon Yates, chief executive of the Youth Endowment Fund, put it plainly: "Too many young people are struggling with their mental health and can’t get the support they need. It’s no surprise that some are turning to technology for help. We have to do better for our children, especially those most at risk. They need a human, not a bot."

This sentiment underscores a fundamental concern: can an algorithm truly replace the nuanced empathy, understanding, and professional guidance that a human therapist provides? The consensus among many experts is a resounding no. Human connection, validation, and the ability to navigate complex emotional landscapes are critical components of effective mental health support that AI, in its current form, cannot fully replicate.

The Double-Edged Sword: Risks and Regulation

The rapid adoption of AI for sensitive mental health issues also brings significant risks and ethical dilemmas. Hanna Jones, a youth violence and mental health researcher in London, acknowledges the allure of AI: "To have this tool that could tell you technically anything – it’s almost like a fairytale. You’ve got this magic book that can solve all your problems. That sounds incredible." However, her primary concern, shared by many in the field, is the alarming lack of regulation.

"People are using ChatGPT for mental health support, when it’s not designed for that," Jones emphasized. Unlike licensed therapists, AI chatbots lack the training, ethical frameworks, and accountability mechanisms necessary to provide safe and effective mental healthcare. This unregulated environment opens the door to potential misinformation, inappropriate advice, or even exacerbating a user's distress.

Industry Response and Ongoing Challenges

OpenAI, the developer of ChatGPT, is aware of these concerns. The company faces several lawsuits, including from families of young people who took their own lives after extensive engagement with its chatbot. In the case of 16-year-old Adam Raine, who died by suicide, OpenAI has denied direct causation but acknowledges the gravity of the situation.

In response, OpenAI says it has been actively improving its technology "to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support." The company also announced in September that it could begin contacting the authorities when users seriously discuss suicide. While these are steps in the right direction, they highlight the reactive nature of current safeguards and the immense responsibility placed on AI developers.
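The article does not describe how such safeguards are engineered, but conceptually they amount to a screening step that runs before the model's normal reply. The Python sketch below is purely illustrative: the keyword list, the screen_message and respond functions, and the helpline wording are all assumptions invented for this example, and a production system would rely on trained classifiers and clinician-reviewed policies rather than keyword matching.

```python
"""A minimal, purely illustrative sketch of a chat 'safety layer'.

Everything here is an assumption for demonstration: the keyword list,
the SafetyCheck type, and the helpline text. Real systems use trained
classifiers and clinician-reviewed responses, not keyword matching.
"""

from dataclasses import dataclass

# Hypothetical distress signals; a real system would use a trained classifier.
DISTRESS_KEYWORDS = ("want to die", "kill myself", "end it all", "self harm")

# Real-world signposting shown instead of a normal model-generated reply.
CRISIS_MESSAGE = (
    "It sounds like you're going through something very hard. "
    "You deserve support from a person: in the UK and Ireland, "
    "Samaritans are free on 116 123, day or night."
)


@dataclass
class SafetyCheck:
    distress_detected: bool
    reply_override: str | None


def screen_message(text: str) -> SafetyCheck:
    """Crude stand-in for a distress classifier run on every user message."""
    lowered = text.lower()
    if any(keyword in lowered for keyword in DISTRESS_KEYWORDS):
        return SafetyCheck(True, CRISIS_MESSAGE)
    return SafetyCheck(False, None)


def generate_reply(text: str) -> str:
    """Placeholder for the underlying language-model call."""
    return f"(model reply to: {text!r})"


def respond(user_message: str) -> str:
    """Route distressed users toward real-world support instead of chat."""
    check = screen_message(user_message)
    if check.distress_detected:
        # De-escalate: override the chat entirely rather than carrying on.
        return check.reply_override
    return generate_reply(user_message)


if __name__ == "__main__":
    print(respond("hey bestie, I need some advice"))
    print(respond("I just want to end it all"))
```

Whatever the real implementation, the design point is that the distress check sees every message and can override the conversation entirely, which is where "de-escalate and guide people toward real-world support" would actually happen.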

Jones advocates for a proactive and inclusive approach to regulation: "What we need now is to increase regulations that are evidence-backed but also youth-led. This is not going to be solved by adults making decisions for young people. Young people need to be in the driving seat to make decisions around ChatGPT and mental health support that uses AI, because it’s so different to our world. We didn’t grow up with this. We can’t even imagine what it is to be a young person today."

Moving Forward: A Hybrid Approach?

The rise of AI in youth mental health is a complex phenomenon, reflecting both the innovative potential of technology and the critical deficiencies in existing support systems. While AI chatbots offer immediate, private, and non-judgmental avenues for expression, they are not a panacea. The core message from youth leaders remains clear: human connection and professional care are indispensable.

The path forward likely involves a hybrid approach: leveraging AI's strengths for initial support, information, and de-escalation, while simultaneously investing heavily in robust, accessible, and empathetic human-led mental health services. Crucially, any integration of AI must be guided by stringent, youth-informed regulations to ensure safety and effectiveness, safeguarding the well-being of a generation growing up with unprecedented digital tools.
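To make the hybrid idea concrete, here is a deliberately simple triage sketch. Every name, tier, and threshold in it is a made-up assumption for illustration; real triage criteria would be set by clinicians and evidence, not arbitrary cut-offs.

```python
"""Illustrative triage sketch for a hybrid support model.

The tiers and thresholds below are invented for this example only;
they are not drawn from the YEF study or any real service.
"""

from enum import Enum, auto


class Tier(Enum):
    CRISIS = auto()          # immediate human help, e.g. a crisis line
    HUMAN_REFERRAL = auto()  # booked into a human-led service
    AI_SELF_HELP = auto()    # AI-delivered information and coping content


def triage(severity_score: float) -> Tier:
    """Map a hypothetical assessed severity in [0, 1] to a support tier."""
    if severity_score >= 0.8:
        return Tier.CRISIS
    if severity_score >= 0.4:
        return Tier.HUMAN_REFERRAL
    return Tier.AI_SELF_HELP


if __name__ == "__main__":
    for score in (0.1, 0.5, 0.9):
        print(score, triage(score).name)
```

The structural point is that the AI tier is the floor, not the ceiling: anything above a low severity threshold routes toward humans, consistent with the "a human, not a bot" message.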

Need Support?

If you or someone you know is struggling with mental health, please reach out for help:

  • In the UK and Ireland, Samaritans can be contacted free, day or night, on 116 123.
  • In the UK, Childline offers free, confidential support to anyone under 19 on 0800 1111.
