In the era of artificial intelligence, ChatGPT and other large language models (LLMs) have quickly become more than just tools for homework and essay writing. They have also become a space for emotional support, capable of offering mental health advice: almost akin to a therapist, they can suggest steps for working through personal problems in place of a licensed professional.
At first glance, using ChatGPT as your therapist does seem appealing. If you are logged in to the AI tool, it keeps the questions you’ve asked in its memory and can build on prior chats with you, tailoring its answers to fit your profile. It’s also completely free and highly accessible, available to anyone with the click of a mouse. It offers a judgment-free zone from the comfort of your home, removing the need for Zoom calls, phone calls, or any commute. You just have to type in your situation, and you’re in therapy, no chaise lounge required.
Despite the ease of using ChatGPT, there are caveats to this therapy setup. Unlike therapists, ChatGPT was not explicitly trained to provide clinical care. It does not have the feelings, intuition, or overall human experience a real therapist would have. ChatGPT is simply an active listener: a program that processes the information it is given and draws on a composite of training data and web resources to generate a response that temporarily appeals to a person’s emotions rather than addressing their psychological needs. In the long run, it lacks the license and credibility to provide treatment plans, and it cannot fully understand the depth of the human experience. This fundamental lack of humanity and appropriate emotional resonance is what most distinguishes ChatGPT from human therapy.
Grief, love, cultural identity, body language, and the tone of one’s voice, all of which are difficult for a language model to perceive or convey, say a lot about an individual and their mental state. A trained therapist draws not just on years of education and clinical experience, but also on their own human experience, trauma, and happiness, allowing them to build authentic empathy and foster a real connection with their patients. That connection, in turn, is what makes a therapy session successful. Through mutual vulnerability, you build a bond of trust, something AI tools, which sycophantically spit back what you tell them and offer superficial insights, cannot replicate.
Beyond their inability to bond deeply with a patient, language models also suffer from a lack of privacy. The risk of data leakage is always present with online tools such as ChatGPT, as chats can be reviewed by developers and are commonly used to improve AI models. Sharing confidential information with ChatGPT carries serious implications: while therapists are bound by HIPAA and can have their licenses revoked for revealing details of sessions, AI platforms operate under broader and vaguer privacy terms, creating a risky gray area for users.
At a time when data privacy is more fraught than ever, we should be careful about feeding confidential details of our mental health and vulnerabilities into ChatGPT. While it is tempting to vent to ChatGPT because of its accessibility and convenience, it is important to understand its drawbacks and the distinction between seeking digital companionship and receiving clinical guidance.
ChatGPT might work for quick inquiries and short-term coping strategies, albeit at the cost of confidentiality. As a reliable, long-term source of expert help, however, it falls significantly short. Fortunately, alternative resources are available. For students, the University offers free therapy through the University Counseling Center (UCC). And no matter the complexity of the situation, AI is not a comparable substitute for friends, family, and loved ones. Authentic human connection will always be stronger than the superficial imitation of empathy and companionship that AI provides.
