Using AI with Therapy

I was on a walk the other day when someone asked, “My daughter isn’t ready to talk to a therapist, so what do you think about her using ChatGPT for a while?” It took all I had to maintain a neutral face because, unsurprisingly, I have a huge bias in this area. That didn’t stop her question from intriguing me, though. I like to have evidence and critical thinking behind any recommendation, so this gave me the kick in the pants to dig deeper and figure out an answer based on client feedback, the positions of the American Counseling Association and the American Psychiatric Association, expert opinions, and some implications that have come to light so far. **Please note that this focuses on clients and the public using AI, not on how this tool may be used by therapists or other mental health providers.**

How Can It Be Helpful?

When AI is treated like a journal, clients have reported positive experiences with it. One person asked ChatGPT to review all her reflections over the past year and summarize themes. She was surprised by what it reflected back, giving her a window into her own thought process. Another person found that it helped them find their words; they prompted ChatGPT to take their verbal “dumps” and arrange their thoughts into something with better grammar and fewer “f-bombs.” The result was something they found more cohesive and usable.

With AI, vulnerability can be tapped into without fear of judgment or pushback. Let’s be real: people are not always great listeners. They interrupt, get distracted, or use your story as a prompt to talk about themselves. Time spent with ChatGPT allows room to put an internal experience into words and can even offer encouragement to dig deeper.

AI/ChatGPT can fill in the gap between therapy sessions, offering exercises, writing prompts, lists of emotional regulation tools, mood tracking, and more. I particularly like it for researching and exploring ideas. For example (please, please, please use reputable sources), AI can offer summaries of research findings or clarify multiple viewpoints on a subject. When I recently used AI to look up a word, I was pleased to see how a complex phenomenon was translated from mind-numbing scientific jargon into more easily understood language. For people who want an initial exposure to an idea, ChatGPT can be a valuable assistant.

Cautions

Like any tool, ChatGPT can be helpful, and it can be harmful. A hammer can help you build a cabinet, but it can also smash a window. It is imperative to proceed with caution and discernment because, ultimately, ChatGPT is shaped by the data it consumes and the biases inherent in its design.

First, remember that humans are full of assumptions, misunderstandings, and passions. When we pour these into ChatGPT, it will echo them. It is understandably amazing to read responses that feel validating and understanding, but remember: this is really you validating you. ChatGPT is tailor-made confirmation bias, the tendency to prefer things that confirm what we already believe. It will not challenge flawed thinking, nor will it offer Truths.

Second, it is unclear what goes into the algorithms and who has access to them. Therapists and counselors are bound by laws, codes of ethics, and best practices. What rules and practices govern ChatGPT? I know of no mandated response when someone shares that they are in a suicidal state, nor a duty to report when abuse is disclosed. Laura Reiley’s article and tragic story in the New York Times offer a glimpse into these implications. ChatGPT’s support is inherently limited and, to date, cannot ensure privacy protections.

There is also uncertainty about how this data is being used. Do marketing firms have access? Could a special interest group be monitoring our conversations? Can a computer genius with too much time on their hands hack into the data? Could a stalker tap into our innermost thoughts? I realize I sound like a conspiracy theorist, but the reality is, most of us just don’t know. The ACA recommends that users at least “ensure the platform has robust security and privacy protections in place.”

Third, ChatGPT is designed for user satisfaction. It is not going to offer the messy, often maddening, but endearing and valuable experience of genuine interaction. With AI, there will be no misunderstandings to work through that build social skills, no uncomfortable silences that offer necessary challenges, and no facial expressions that evoke the unexpected. Much like porn, AI systems offer a one-sided relationship, absent of any demands or expectations. I worry that if people practice this form of engagement too much, they may struggle with reciprocal, imperfect human connections.

Human interaction has an energy, soul, and presence that no computer program can touch. Just recently, a client was struggling to share her sadness because of a lifetime of messaging that taught her distress was nothing more than self-indulgence and a lack of discipline. I responded, “I think emotions are the color of life! They can be spontaneous, informative, unexpected, and sometimes for no apparent reason whatsoever. Isn’t that delightful?” She looked back at me with tearful eyes and saw that I meant it to my core. She smiled back at me, “This is so much lighter… even… playful…. REALLLY?” There is no substitute for these moments of heart meeting heart!

Please downright avoid ChatGPT for the following:

  1. During a state of crisis, when we are most prone to catastrophic thinking and harmful tendencies (towards self or others). Reach out to a human being, whether a loved one, a provider, or a hotline. (See my links listing for resources)
  2. As the primary source of support. Again, treat this like a tool rather than a relationship.
  3. As a substitute for a relationship, like a best friend or a partner.
  4. For mental health diagnosis. As the ACA states, “AI lacks the ability to holistically consider a client’s complex personal history, cultural context, and varied symptoms and factors among others (Kulkarni & Singh, 2023).” Stick to licensed experts who not only have these skills but can also share their findings with compassion and context.

AI tools like ChatGPT have much to offer: a non-judgmental place to put down thoughts without measuring words, to explore our inner landscape, and to gather insights. These are dazzling tools, but only tools, not companions or guardians of our well-being. Let’s remember to use wisdom in these new technological frontiers while holding onto the importance and richness of human connection.

____________________________________

Resources
