What is AI Therapy and What Is it Used For?

Artificial intelligence is being used in a multitude of different fields, and healthcare is becoming one of them. Heck, I even used AI to help me generate keywords for writing this blog post! It’s a great tool in a variety of ways.

We’re starting to see tools pop up that use AI to make healthcare note-taking easier (huge win!), but we’re also seeing some concern around people using AI in place of their therapists (*red flag*).

I’m a human therapist, and I was recently talking to a friend who uses AI for mental health help. He sees a therapist weekly, but he will occasionally use a therapy chatbot for questions and advice in between sessions. At first I was like, “HOLD up, this can’t be good.” But the more I looked into the AI therapy options out there, the more intrigued I was by the pros and cons. This blog will outline those pros and cons of using AI in healthcare settings.

Where We’re Seeing AI Used in Healthcare

  • Note-taking to help with insurance documentation
  • Creating clinical summaries
  • Creating treatment plans
  • Streamlining billing processes
  • Virtual assistants helping people monitor symptoms and get answers to questions
  • Chatbots offering coping skill ideas and guiding users to professional help

Concerns and Limitations with AI in Mental Health

There’s a lot that AI can do, and it’s continually expanding and changing, but we all know that AI chatbots are computers, not humans.

Lack of Relationship

A large part of the therapeutic process is building a true relationship, something that is not possible with an AI chatbot. A real therapist makes the conversation person-focused instead of popping out a general answer or a list of things that other people find helpful. A real therapist can draw on your past, present, and future, and understand how you may need support, especially during a time of crisis, based on how you have responded before. (Storing that kind of personal history in an AI tool also raises some privacy red flags, which we cover below.)

AI Can’t Replace Human Empathy and Non-Verbal Cues

Humans can read your body language, show empathy, and validate your feelings, and this isn’t something a chatbot can do (this is also a reason we’re not fans of text therapy here at Ellie Mental Health).

Since chatbots are not humans, they are not able to take into account all of the complexities of a human’s mental health and how delicate it can actually be.    

Because humans are complex, a mental health professional is more in tune with the problems a person may be facing. An AI chatbot works from the data it has been trained on; for example, if you say you are struggling with suicidal thoughts, it will generate a general answer that is not specific to you as an individual. A real, human therapist isn’t going to give you a generic answer, and they are able to connect with you on a deeper level through the relationship that has been built.

In more extreme examples, there have been cases where AI chatbots have encouraged someone who is suicidal to take their own life, told a recovering alcoholic to drink alcohol, or, in the case of a chatbot built specifically for people with eating disorders, given harmful eating advice (as mentioned in an article by National Geographic).

Receiving a Professional Diagnosis

According to Chaitali Sinha, senior VP of healthcare and clinical development at the AI therapy app Wysa, AI chatbots are unable to make a diagnosis or prescribe medication because they are not held to the same ethical and liability standards as licensed healthcare professionals. They may suggest that you meet criteria for depression, anxiety, or the like; however, that is not a legitimate diagnosis.

Therapists can give you a real diagnosis, which can help you receive further services through insurance or medication through a prescriber. A diagnosis isn’t just a checklist of symptoms, though; there are more nuanced things that go into a diagnostic assessment that an AI chatbot cannot yet capture. To effectively offer a treatment path, a diagnostic assessment should tell a story, not just list off your problems.

The Ethical Challenges of AI in Mental Health

Lack of Oversight

Are there ethical issues with AI therapy? To answer this question, we need to ask ourselves who defines what is ethical in mental health care… and the answer is the state licensing boards. All therapists, whether they’re a psychologist, social worker, or psychiatrist, have licensure boards that oversee their care. These boards monitor therapists’ work, investigate issues of malpractice, and essentially decide who can and cannot see clients. Obviously, there are no licensing boards monitoring AI’s treatment and care. In fact, there is very little regulation of AI in the United States, period.

Privacy Concerns

In any field, AI is based on data provided by the people who use the system. This means that as you use a chatbot, you are training it. You discuss the things you may be struggling with, and based on your responses, the chatbot works out how to respond back. Those responses are filtered into its system and used again when someone else uses a similar keyword, even though their situation might be entirely different.

That being said, it is difficult to know whether your protected health information (PHI) is actually being protected if chatbots are using it to train themselves for the next individual. This creates a real risk of your health information being leaked, or of information being gathered about you and sold to third parties, something that mental health companies are prohibited from doing (and that some companies like BetterHelp and Grow Therapy have gotten in trouble for in recent years).

With a licensed mental health professional, you are guaranteed confidentiality when it comes to your health information, except in very specific circumstances: if there is a risk of you harming yourself or others, if an elderly adult or a child is in danger, or if your records are subpoenaed in court.

Pros and Cons of Using AI Tools for Mental Health Care

We’re not going to say that AI is all bad, but we’re also not big fans of it being used in place of therapy with a real human therapist, for various ethical and relational reasons.

Pros:

  1. AI chatbots are more accessible for folks who are unable to afford therapy but need advice, an outside perspective, or ideas.
  2. If you’re afraid of judgment, talking to a chatbot might be helpful in getting through that fear.
  3. Mental health concerns have been growing rapidly, and for someone who cannot get services, chatbots can give reasonable advice around coping skills, communication skills, and other ideas for dealing with mental health struggles.
  4. If you were recently diagnosed with a mental health disorder, a chatbot may be helpful in breaking down some of the symptoms and understanding more about the diagnosis (we get it: sometimes an AI summary can be helpful instead of spending hours reading through Reddit or Google).
  5. AI can be a helpful tool for providers dealing with administrative tasks like progress notes and treatment plans, allowing them to reduce the documentation burden and spend more time and energy on the actual treatment.

Cons:

  1. There are errors in the system, and the responses are not individualized; chatbots are just repeating back learned responses.
  2. AI does not offer other modalities like EMDR, ART, or Brainspotting, so it is limited in what it can help you with.
  3. It may also increase the risk of your private health information being leaked.
  4. Licensed professionals are held to certain ethical standards through their licensure and therefore must uphold confidentiality when it comes to their clients. AI chatbots are not held to the same standards.

Final Thoughts

If you are struggling with the idea of talking to a human face-to-face, and after reading this blog you are unsure about using AI chatbots, try virtual or telehealth therapy! You’d be speaking to a human, but you don’t need to leave your couch or go to an actual office (happy medium!).

Ellie Mental Health provides both in-person and virtual options, with insurance accepted as well as out-of-pocket rates. Check out the location nearest you and call to get scheduled as soon as possible!

Sources:

More people are turning to mental health AI chatbots. What could go wrong? (nationalgeographic.com)

What Is AI Therapy? | Built In

About the author

Miranda Barker, MSW, LICSW

Director of Content and Production

Miranda specializes in working with people who have been touched by adoption or foster care (birth parents, adoptees, kids in foster care, etc.). She enjoys working with people of all ages. Prior to joining Ellie, Miranda spent several years in the non-profit adoption field and then as a child protection investigator and case…