New tech putting mental health support in the hands of young people
Dr Imogen Bell and young person in VR Lab
Since the COVID-19 pandemic, mental health services have been overwhelmed, inspiring an explosion of more than 10,000 self-help apps that offer instant, anonymous advice, or even a chat with an emotionally intelligent bot. But can any of it really take the place of a human?
Dr Imogen Bell, research fellow at Orygen Digital (the technology arm of Orygen, Australia’s centre for excellence in youth mental health, led by Professor Pat McGorry), understands the potential of AI to support young people’s mental health. A trained psychologist herself, Imogen leads a team that works with experts to develop and test digital tools that can make a positive difference in young lives.
Orygen’s last two projects, a digital mental health service called MOST (now funded by four state and territory governments) and an app called Mello, were both supported by Telstra Foundation. In addition, Telstra Foundation's Young & Connected Fund is now funding Orygen Digital’s latest work testing the use of virtual reality as the next frontier in mental health.
Imogen’s research shows that over 90% of Australian psychologists and other mental health professionals welcome the use of digital technologies. Orygen’s researchers have taken this a step further by creating a vivid mix of interactive experiences called MIND, which enables young people to immerse themselves in various situations from the safety of a virtual reality (VR) headset.
By putting on the headset, the user is transported to a busy social setting like a classroom or train, where they’re confronted with situations that may trigger feelings of anxiety or vulnerability that they can scrutinise and respond to on-screen.
The situations and emotions in MIND have all been inspired by and designed for young people. Users guide their own journeys, can type in their own thoughts, and can take MIND home after using it with a therapist. “The key difference in VR is that you can learn and practise skills in environments that are relevant to you, not just in a clinical setting,” explains Imogen. “It’s possible to create any sort of environment, not just ones that replicate the real world. Imagine a calming environment where you can practise letting go of struggles… the opportunities for therapy go far beyond the text and images other technologies offer.”
Defusing negative thoughts
MIND uses a form of therapy called ‘defusion’, which lets users separate themselves from negative thoughts and look at them objectively, breaking the loops of worry. It’s a more practical way of engaging with your actual thoughts, rather than the abstract skills or strategies you might receive from a therapist.
Orygen’s Young & Connected grant will enable the team to take its prototype through an intensive phase of development and testing, in conjunction with innovation partners Liminal and Superunknown. Over the next 12 months, MIND will pass through two more rounds of development, prior to a nationwide trial that will put MIND 2.0 into the hands of nearly 500 young people, alongside Orygen’s clinical partners in the Headspace network.
“Online therapies are skyrocketing but very few have research backing, particularly in the VR space,” says Imogen. “This is why a partnership with Telstra Foundation is so empowering, because they have the knowledge and connections to really promote our work, create new partnerships, support our business development and co-design processes… we wouldn’t have got here without them.”
This element of collaboration with both young people and service providers underscores the Young & Connected Fund’s focus on helping youth understand and use new technologies to improve their lives. In August and September, the Foundation hosted a series of training events that brought together stakeholders from across the mental health sector to share insights on the responsible use of AI in youth mental health.
The events were facilitated by Gradient Institute, a leading Australian authority on the ethical use of AI, which has embarked on a research project (also backed by Young & Connected) to study the use of AI technology in chatbots for youth mental health. The Australian-first study will identify issues and risks in the use of AI, particularly generative models like the one behind ChatGPT, which are already informing several apps and chatbots offering advice to young people.
A litany of risks
But the rise of generative AI-powered bots is worrying some. “While generative AI chatbots have potential for mental health support, their insidious risks should not be overlooked in the race to adopt them,” says Yaya Lu, Software Engineer at Gradient Institute. “In the youth mental health space, the lack of escalation and tailored, clinically approved treatments could potentially be life-threatening for at-risk individuals.”
In the first few months of their research, Yaya and Gradient CEO Bill Simpson Young have already discovered evidence of bots:
- delivering harmful advice – including advice in grooming situations, and the masking of drug and alcohol consumption
- “hallucinating” or providing fake citations and case studies
- recording users’ information without their explicit consent
- collecting data and IP addresses that could be used for covert advertising
- proclaiming ‘love’ for users and even allegedly encouraging suicidal ideation.
“There are lots of examples of nefarious behaviour from chatbots that are accessible to teenage kids… while it’s likely unintended, this behaviour demonstrates the huge potential for manipulation of content by generative AI,” says Yaya.
By the end of October, the Gradient team will have developed the scope for a study that will consider the implications of generative AI bots for young Australians’ wellbeing.
The study will encourage collaboration between bot developers, researchers and mental health professionals, and aims to reach a conclusion about the need and scope for further action, whether that be training for clinicians, guidelines for developers or formal legislation.
Gradient’s CEO Simpson Young says, “We’ve talked to many of the major Australian organisations working in youth mental health, and pretty much all of them say it’s perfect timing and that there’s an urgent need. A lot of organisations are very attracted to the potential of generative AI chatbots to support their overstretched services, but we want to make sure people understand the risks and go into this with their eyes wide open before they start using them at all.”
“There are clear scenarios where generative AI chatbots can support youth care, like helping children with autism practise conversations, or assisting psychologists or helpline staff with summarising case notes for their patients,” adds Yaya. “But there are also many areas where using generative AI to provide advice to a vulnerable young person is really risky – and it cannot replace the value that comes with interacting with an empathetic and qualified human being.”