AI in Mental Health

*(Please see bottom of article for free Mental Health Hotline Numbers available to help you, if you are in need)*

AI, AI, AI…it's all anyone in the tech world can seem to talk about. But it's not just affecting technology these days. It is now creeping its way into the Mental Health profession in some very real ways.

An AI Chatbot is an interactive program that hosts a two-way text or voice conversation. Yuck, I hate using technical jargon. To put it simply, an AI is a computer program that you can “chat” with, and it pretends to be a person.

Think of it like an assistant or intern. Using these AIs would allow people to receive services like Cognitive Behavioral Therapy or journaling therapy from the comfort of their own homes or offices.

AIs like Woebot or Tess teach therapeutic skills that people can apply on their own, similar to the skills therapists would teach their patients.

AIs also allow therapists to automate things like early screening, improve patient access to treatments, and offer a better quality of work life for clinicians. I mean, they can even analyze the word choices of patients, catch things a therapist might miss, point them out, and suggest a course of action. They are getting more intuitive every day.

Most clinicians agree that there is a shortage of human therapists and counselors. Plus, since they’re human, they aren’t available 24/7. Access to quality clinicians can also be a challenge for patients in remote locations.

It seems like all the benefits of AI to the mental health world are really stacking up. I mean, studies are starting to show that some patients are simply more willing to interact with and open up to an AI. And since AIs are becoming conversational, they can put a patient in a frame of mind similar to the one they engage when having an actual conversation with a therapist. Patients feel less stigma in asking for help, knowing there’s no human at the other end.

Supporters of chatbot therapy say the approach may also be the only realistic and affordable way to address a gaping worldwide need for more mental health care, at a time when there are simply not enough professionals to help all the people who could benefit.

But there are just as many who are against the further integration of AI into a ‘very human place’. They know that AIs often make mistakes of their own, as they are not fully capable of understanding the complicated machine that is the human mind.

Observation of interactions between human patients and the AIs also shows a weakness in the machines. Oftentimes, careful word choices will allow the human to ‘deceive’ the machines in ways to which their human counterparts would not fall victim.

Regardless of where you stand on the line, it is important to note that current Mental Health AIs are not meant for crisis intervention. If you need help, ask for it. You can call one of several hotlines found with a simple Bing or Google search.

“Every Month is Mental Health Month”

Suicide Prevention: Dial 988

National Drug Helpline: Call 844-289-0879

*Most states also have a site like mh.”your state”.gov that provides a list of local hotline numbers for your use.*