
I have Insecurities about AI Therapy

December 19, 2024

Do mental health apps really care or are they merely another way to exploit the vulnerable? ANNA BUTTERFIELD explored the issue and came away concerned.

Let’s face it, our brains are messy. We worry, we stress, and sometimes, we just need someone to listen (preferably someone who isn’t going to judge our questionable life choices). So it’s no surprise that in our hyper-connected, tech-obsessed world, AI-powered mental health apps are popping up faster than vape shops. They promise easy access, affordable rates, and a judgement-free zone, all from the comfort of your pyjamas. But before we hand over our deepest anxieties to algorithms and let silicon chips analyse our dreams, let’s take a cue from author Arthur C. Clarke and ask ourselves: are we staring at a digital deity that will solve all our problems, or are we about to watch the stars go out on our sanity?

AI therapy is undeniably alluring because traditional therapy is expensive and difficult to access. These nifty little apps seem like magic, offering round-the-clock support, potential anonymity, and a price tag that won’t break the bank.

Some popular apps, like Replika, Woebot, Routinery, and Fabulous, have already attracted millions of users, promising to help us manage everything from anxiety and ADHD to productivity.

AI enthusiasts argue that chatbots could revolutionise mental healthcare, especially for those struggling with mild to moderate conditions like anxiety and depression. After all, much of standard therapy involves active listening, validation, and providing practical coping strategies – tasks that AI, in theory, could perform.

But… before we all ditch our therapists for digital shrinks, it’s crucial to acknowledge the limitations and potential pitfalls of relying on AI for mental health. As others much cleverer than I have pointed out, we often fall prey to the hype surrounding new technologies, forgetting that they are rarely the utopian solutions they’re advertised to be.

The issue of care

One of the most pressing concerns is accuracy and care. While AI is getting better at mimicking human conversation, it still lacks the nuanced understanding and clinical judgement needed to provide safe and accurate advice. The American Counseling Association (ACA) has stated, for obvious reasons, that AI should not be used for diagnosis and should always be supplemented by the expertise of a human counsellor.

A chatbot might be able to offer generic coping strategies, but it can’t dig into the complexities of individual experiences or recognise subtle warning signs that a human therapist would pick up on.

Speaking on RNZ, clinical psychologist Louise Cowpertwait described how, when she tested a popular mental health app, the AI’s responses were often irrelevant and frustrating, lacking the empathy and validation a person would expect from a human therapist.

The terrifying issue of data privacy

AI apps thrive on data; they collect information about our thoughts, feelings, and behaviours to personalise their responses. But as the data breach of Muah.ai, an AI girlfriend site, demonstrated, even platforms that promise privacy and encryption can be vulnerable to attacks. Imagine your deepest fears and desires, all neatly packaged with your email address and delivered into the wrong hands.

 

Who’s benefitting?

There’s the creeping sense that the mental health space is just another frontier for tech solutionism and commercialisation. As venture capitalists pour billions into AI startups, and “tech bros” preach the gospel of technological utopia, it’s hard not to wonder if the pursuit of profit is overshadowing the genuine need for ethical and effective mental healthcare. The huge number of apps flooding the market, all promising quick fixes for complex problems, raises concerns about quality control and the potential for users to be misled or even harmed.

 

Putting AI apps to the test

Swamped by ads and fake affiliate recommendations, I thought I’d sign up and check out one of these apps. I did the super generic online quiz that was going to decide just how much help I needed with my executive functioning. I got sent a link to a special 50% discount if I signed up in the next 10 minutes. Nothing like an app meant to help with ADHD using distinctly ADHD-unfriendly dark patterns. USD$40 a month, or sign up for 6 months at $90. That’s a little pricey for something when I don’t even know what it does. I google the names of the ‘professionals’ said to be behind the programme. Founders of tech startups like these ALWAYS have a social media presence. These don’t.


Then I get a second email, signed by a ‘Behavioural Psychologist’ (also not legit) claiming to have personally read the results of said generic test and been so alarmed by them that he is now offering me a 60% discount.


This continues at precisely timed intervals for a week until I unsubscribe, because apparently avoidance behaviour is my thing. So deeply grounded in the care and consideration of their potential customers? Or simply trying to make money through AI-written, bland copy and a boring user interface? I didn’t sign up, so I can’t rightly say. But these people have an opinion. And these. And these. And these.

 

Not so fabulous

Despite my being a tech-savvy woman of reasonably sound mind, another app’s sign-up renewed at $60 after I couldn’t find a way to cancel the free trial. Then, through some form of trickery, they managed to sign me up to another app from their stable of tools, which then charged me another $60. The refund for the first I received easily; I’m in negotiation over the second, and in the meantime have been charged again for the first. This is not helpful for anyone’s mental health.

 

Fundamental questions

Can AI truly replace the human connection that is so essential to therapy? As Navneet Alang points out in The Walrus’ excellent piece AI Is A False God (linked below), much of what drives our thoughts, desires, and motivations is rooted in our physical bodies and lived experiences – things AI simply cannot replicate. A chatbot can process language, but it can’t understand the weight of human emotions, the unspoken nuances of a facial expression, or the comfort of a shared silence. It’s this lack of empathy and genuine understanding that makes many experts sceptical about AI’s ability to provide truly therapeutic care.

So, where does this leave us? AI undoubtedly has the potential to transform mental healthcare, offering tools for research, early intervention, and even assisting human therapists in managing their workloads. Because let’s face it, there simply aren’t enough trained mental health professionals.

But let us please remember that AI is a tool. Trouble is, we don’t actually own or control or understand this tool. Technology should enhance, not replace, the human connection at the heart of mental wellbeing. The future of AI in mental health hinges on finding a balance between innovation and ethical considerations, recognising that while AI may be able to analyse our data, it’s the human heart that truly understands the language of the soul.

 

More reading:

AI Is a False God

Anna Butterfield, writer and web warrior at Witchdoctor

Anna spends too much time on the internet. A multipassionate writer, mother and maker, her most important role nevertheless remains Slayer of Witchdoctor's Web Woes.
