Circular conversations: the hidden cost of talking to AI about your mental health
It is increasingly common for clients to tell us that they have been talking to generative AI, such as ChatGPT, about their problems. Often they have spent months having circular conversations, only to find that their anxiety is worse.
In this article, I will explore the benefits and drawbacks of talking to AI about your difficulties: why it might be tempting, but also why it may be maintaining unhelpful patterns.
Why talk to AI?
When we talk about AI, we are generally referring to generative AI. This means chatbots such as ChatGPT, Gemini, Claude and Copilot. There are many other uses of AI, but in this article, we are talking specifically about what are known as large language models (LLMs) such as these.
Generative AI has become a household technology in recent years. In many instances, it has replaced searching for things on search engines such as Google. There are several reasons why it is tempting to use the technology as a therapist.
First, it is accessible at any time. You can use AI 24/7, when a human therapist might not be available. This might be especially appealing if you are experiencing panic attacks or bouts of OCD that come and go.
Second, it is also much cheaper. Therapy can be expensive or involve sitting on long waiting lists for NHS treatment. Generative AI is available for free or for a small subscription fee.
Third, it may be easier to open up to a non-human. Many of us carry a lot of shame about our mental health problems. This is especially true for individuals with intrusive thoughts around harming others or moral OCD.
What are the drawbacks?
It may seem obvious, but one of the major drawbacks is that there is no human on the other end of the line. What might be less obvious is why this has such a significant impact.
Multiple research papers have suggested that the therapeutic relationship is more important than any specific techniques (Norcross & Wampold, 2019; Baier, Kline, & Feeny, 2020). The magic in therapy is the relationship between you and your therapist. This does not exist with AI.
Validation loops
As a therapist, I am constantly weighing up whether this is the right moment to support someone or to challenge them appropriately.
Let's take health anxiety as an example. Many people with health anxiety constantly seek reassurance that they are not ill. This makes them feel better in the short term, but maintains their anxiety in the long term.
In the therapy room, I gently help my clients to change the way they engage with their problematic thoughts. AI won't do this. It will get you to engage with these thoughts and offer you short-term validation. This may feel good at the time, but it also maintains these unhelpful patterns.
Clients sometimes come to me after months of talking to AI, telling me about all of the rabbit holes they have gone down, and how it only seems to have made them worse. Unfortunately, this is not surprising, because AI will often tell you what you want to hear, not what you need to hear.
Chatting vs therapy
I often warn my clients at the start of therapy that it is not a fun process. Healing emotional pain typically requires exposing yourself to distress. We do this collaboratively, when you feel ready, but we do it.
Good therapy is not venting to a friend, and it is not always having a comfortable conversation. AI allows you to talk about your problems. But it will not facilitate you having those difficult conversations. Nor will it help you process what you are saying.
Without this processing, nothing moves forward. You may find yourself having the same conversation repeatedly because there is no real engagement in the emotional content of your experiences.
AI will help you think about your difficulties at length. But typically, one of the things that maintains anxiety is that there is too much thinking and not enough emotional processing going on. We use overthinking to avoid our difficult feelings. Therapy helps you approach these difficult feelings carefully. AI will simply collude with you in the overthinking.
What about privacy?
Sometimes, people may struggle to open up to their therapist because they are worried about privacy. They may think their thoughts are so dangerous that the therapist would break confidentiality. It may feel easier to talk about these issues with a computer.
In reality, speaking to a professional therapist offers stronger safeguards than talking to an AI.
AI companies often train their models on the conversations people have with them. While some offer incognito or temporary modes, these conversations are typically still retained for up to 30 days. It is also possible that either your account or their systems could be compromised. We have already seen instances where bugs in ChatGPT have leaked users' conversation histories.
When talking to a therapist, you benefit from strong privacy protection. We are explicit about when we will and will not break confidentiality. And we have to answer to our professional bodies if we break confidentiality inappropriately.
While talking to a computer may seem more private, therapists offer robust protection as part of their profession, and AI companies do not.
The dangers of incorrect narratives
Before clients ever sit down with a therapist, many have already spent months developing a story about themselves and their difficulties. When that narrative has been shaped by AI, it can create a significant obstacle to effective therapy.
Generative AI has no way of knowing which psychological model is appropriate for your difficulties. It will draw on whatever frameworks it has encountered in its training data, which may include evidence-based approaches, but also a great deal of pseudoscience, pop psychology, and outdated thinking. The result is that clients sometimes arrive in therapy having been given an explanation for their problems that is not only unhelpful but actively wrong.
This matters because unlearning a narrative is harder than building one from scratch. If you have spent months being told, and coming to believe, that your anxiety is the result of a particular cause or mechanism, it takes time and effort to revisit that.
There is also the question of what AI tells clients to expect from therapy itself. Clients sometimes use AI to research whether their treatment is likely to work, and AI can fuel doubt and pessimism in ways that directly undermine the process. This is particularly damaging because research consistently shows that a client's belief that therapy will help them is itself one of the ingredients that makes it work (Constantino et al., 2011).
Summary
Turning to AI for your mental health can seem very tempting. If you are struggling, it is completely understandable that you would reach out for any support available.
However, it is also important to consider the drawbacks. AI typically tells you what you want to hear, which may feel good in the short term, but often maintains anxiety and OCD symptoms in the long term. This often results in what we would call anti-therapy.
If you want to make a meaningful change in your life, speak to a professional therapist. The human connection, clinical judgement, and strong ethical guidelines will make all the difference.
References
Baier, A. L., Kline, A. C., & Feeny, N. C. (2020). Therapeutic alliance as a mediator of change: A systematic review and evaluation of research. Clinical Psychology Review, 82, 101921.
Constantino, M. J., Arnkoff, D. B., Glass, C. R., Ametrano, R. M., & Smith, J. Z. (2011). Expectations. Journal of Clinical Psychology, 67(2), 184–192. https://doi.org/10.1002/jclp.20754
Norcross, J. C., & Wampold, B. E. (2019). A new therapy for each patient: Evidence-based relationships and responsiveness. Journal of Clinical Psychology, 75(11), 1932–1940. https://pubmed.ncbi.nlm.nih.gov/30334258/
