Artificial intelligence (AI) is everywhere. From creative tools to customer service, it seems like there’s an AI app for everything, including mental health support. Increasingly, people who are struggling are turning to chatbots for that support.
Across Canada, wait times for mental health care remain long, and services can be difficult to access. For anyone navigating a crisis in this landscape, it may feel easier to type into a chatbot. AI is non-judgmental, available at any hour, and affordable. It also decreases barriers for those who do not feel ready to talk to a person (Psychology Today).
In fact, when used correctly, AI can:
- Share general information and coping strategies, such as breathing and grounding exercises
- Direct people to important resources such as crisis lines
- Listen and provide comfort
However, AI cannot provide the human perspective and informed experience needed for crisis support, and it can sometimes give unsafe advice.
Limits of AI as Therapy
Researchers are beginning to uncover the risks of relying on AI for mental health. A Stanford study found that therapy chatbots may give shallow or even harmful responses, sometimes missing signs of crisis altogether. The American Psychological Association warns that these tools may not reliably recognize suicidal thoughts or psychosis.
Unlike a trained crisis responder or counsellor, chatbots are unable to understand the nuance behind these important conversations. For example, there are reports online of chatbots allegedly worsening states of psychosis by incorrectly validating unsafe thought patterns (Psychology Today) or unintentionally encouraging people to follow through with suicidal ideation (CBC News).
These risks matter because they touch on the very reasons people reach out in the first place. AI, if used incorrectly, can:
- Fail to notice warning signs of suicide or psychosis
- Provide inappropriate advice
- Validate unsafe thoughts and actions
Why Human Support Matters
When someone reaches out in crisis, they need more than quick answers. They need to feel heard, understood, and supported in a way that fits their situation. That is what Distress Centre Calgary provides. Our services are free, available 24/7, and every call, chat, and text is answered by a trained responder who is here because they want to help.
Our team combines compassion with specialized training, which means they can:
- Assess safety and recognize warning signs
- Adapt support to fit each person’s unique situation
- Listen with empathy and without judgment
- Connect people to local resources and tools that can help beyond the crisis
Researchers in Alberta echo the importance of this kind of response. The Centre for Suicide Prevention highlights how informed, skilled supports reduce suicide risk.
In 2024, more than 164,000 people reached out to our crisis line responders through phone, text, chat, and 211. Each one was met by a responder who offered care, patience, and practical support. This blend of empathy, training, and trusted connection is something AI cannot replicate.
Immediate Human Support Is Always Here
AI is a powerful technology, but only people can provide real human support, warmth, and empathy.
If you are struggling or worried about someone you know, support is available 24/7.
- Call our 24-Hour Crisis Line at 403-266-HELP (4357)
- Text us at 403-266-4357
- Chat with us online at distresscentre.com
You can also call or text 988 to connect with Canada’s Suicide Crisis Helpline and speak with a trained responder.
If you are a youth, you can contact ConnecTeen by calling 403-264-TEEN (8336), texting 587-333-2724, or chatting online at calgaryconnecteen.com.
For information and referrals to community, social, health and government services, call or text 211, or visit ab.211.ca.
You are not alone. Reaching out for help is a powerful step, and support is always here.