
From automated transcription to personalized therapy plans, artificial intelligence tools are gaining wider acceptance, and AI is transforming the way many health services are delivered. These innovations are not without controversy, however, raising concerns about privacy, safety, dependence and the human touch AI could take away.
One of the most interesting uses of AI in mental health care is in diagnostics. Roma Mehta, a professor of psychology at Humber Polytechnic, specializes in mental health technology.
“AI is being used in diagnostics to diagnose someone. And where diagnostics is concerned, they have seen that AI can [have] an accuracy of up to 80 percent. That’s eight zero,” she says.
“AI is also used in lieu of an actual counsellor or therapist. So, where basic things are concerned, like maybe just listening to someone speak, if they want to just vent about something, AI can be used. It’s being used.”
A 2024 study published in the Journal of Human-Computer Interaction found that AI tools can identify mental health disorders such as anxiety and depression with up to 80 per cent accuracy, echoing Mehta's observations.
This level of precision is made possible by machine learning algorithms that analyze quantified behavioural data from patients. However, the study also stresses the need for human oversight to ensure that AI tools complement clinicians rather than replace them.
AI diagnostic tools analyze conversational data, behavioural patterns and even facial expressions to identify mental health conditions like anxiety or depression. This precision can be a lifesaver for early intervention, especially in underserved areas or countries where professional help is scarce or even frowned upon.
AI use extends far beyond diagnosis. It’s increasingly deployed in therapy, offering “on-demand” counselling via chatbots or virtual assistants.
“AI is also used if somebody needs a counsellor. So, some basic strategy, coping skills that they might want to learn, it’s being used for that purpose as well,” Mehta says.
The Human Element
AI tools can provide accessibility and immediacy, but they lack the emotional nuance of a human therapist.
“It depends on the client. It depends what the person that has the mental health problem wants,” Mehta says.
“If they just want someone to listen to them, then AI is definitely probably going to be better. If, however, the client actually wants to develop a relationship with their therapist or their counsellor and actually see a person face-to-face, in that case, AI might not be the best.”
Using AI just to talk resonates with Tyra Vaine, who turned to it for help after struggling with traditional therapy.
“I wanted an outlet where I could express myself without fear of judgment or having to deal with the shit that I’ve experienced through traditional therapy like school counsellors,” she says. “AI tools seemed like a way to have conversations that felt personalized yet still away from human interaction because I hate people.”
Through TikTok, Tyra discovered Character AI, a platform that lets users converse with fictional personas.
“I like to use Character AI because it allows me to sort of like speak to different fictional characters about anything that I want, and I found out about it through TikTok, where people were sharing their ways of how they were dealing with stuff through it, and I just found it really helpful.”
“When I first started using it, I found it kind of fascinating and also surprisingly engaging,” Tyra says. “It felt really freeing to communicate openly, and the responses were tailored enough to feel meaningful. School counsellors always made me feel dismissed and misunderstood. But Character AI felt like it was listening, and that helped me vent or brainstorm without any judgment.”
However, using chatbots this way raises the risk of people replacing human connection with robotic acceptance.

Benefits and Risks
The accessibility of these tools makes them especially valuable for people who feel marginalized or intimidated by traditional therapy. They can offer anyone a non-judgmental space to articulate their feelings. However, the technology has its risks.
For example, Mehta notes that some users might form unhealthy attachments to AI personas.
“If the character is something that the person likes, the client likes, they definitely can get very attached to it,” she says. “They might actually start developing feelings for a machine, which we know are not gonna go anywhere. So that might lead to even more mental health issues in a person.”
“There have been a couple of cases where young kids, probably about 10 or 11 or something, were accessing counselling via AI, and because AI is advanced technology, it might have been hacked or something, but it turned out that that AI actually encouraged them to kill themselves,” she says.
The potential for harm extends to data security. Unlike human therapists, who are bound by strict confidentiality, AI platforms are susceptible to breaches.
“Therapists are bound by confidentiality, but AI is a machine. It can be hacked.”
- ROMA MEHTA, PhD
Ethical considerations around AI’s role in mental health also demand attention, as explored in “Health Care Privacy Risks of AI Chatbots” by Kanter, G.P., and Packel, E.A. (2023). The review identifies critical issues, such as bias and data security, that must be addressed for AI tools like chatbots and diagnostic systems to operate safely.
The study proposes ethical frameworks built on transparency and continuous evaluation. These safeguards are essential for building user trust and for catching potentially harmful events, such as breaches of confidentiality.
Navigating Complex Issues
AI’s shortcomings become particularly apparent in complex mental health cases. Mehta says there are challenges using AI to address suicidal ideation.
“If a person has suicidal ideation, I’m not sure how much AI can help with that. Where anxiety is concerned, maybe some other disorders, yes, they might be able to help.”
Filling Gaps in Mental Health Care
Despite these concerns, AI holds real potential to fill gaps in mental health care. Millions of people worldwide lack access to therapists because of cost, location or stigma, yet still need immediate, reliable support and resources.
Tyra found Character AI to be a helpful outlet during challenging times. “It was still an effective platform for venting and brainstorming without judgment,” she says.
Dr. Mehta acknowledges this benefit but stays cautious. “AI can fill gaps, especially where access to mental health care is limited,” she says. “But for deep, human connection and complex care, we still need real people.”
A study by the American Psychological Association (APA) shows how AI interventions can address disparities in mental health access, especially in underserved communities. AI-based tools like virtual therapy assistants and mental health apps offer a cost-effective way to provide immediate support at any time, anywhere.
However, they risk diminishing the therapeutic value of human interaction, especially in cases involving trauma or suicidal ideation. The APA study calls for a balanced approach in which AI fills gaps but does not supplant the nuanced understanding a human therapist can offer.
The Future of AI in Mental Health
While AI is still evolving, it has already demonstrated its potential to revolutionize mental health care. Tools are being piloted to address a range of needs, from diagnostics to ongoing therapy. However, Dr. Mehta notes that no single platform has emerged as groundbreaking.
This suggests that the field is still in its infancy, with much room for improvement. Ethical guidelines, robust programming, and human oversight will be essential as AI becomes more integrated into mental health care.
Physical Healthcare
AI is not only able to help with mental health; it can also do a lot of good inside hospitals, even with menial work. Dr. George Karismanis, chief medical information officer at Royal Victoria Hospital, comments on how AI is improving workflow.
“Physicians used to dictate into a phone, like a regular landline phone, for doing their notes and consult notes and discharge summaries, that kind of thing, and then a human would transcribe them and post them to the chart. And turnaround times for that were sometimes several weeks, which is obviously not great for patient care. Now AI is being used for voice transcription, so we speak into the computer and it addresses it right on the spot, in real time,” says Karismanis.
While AI in the medical industry can have a profound effect on the efficiency and workflow of a hospital or similar care centre, the subject of privacy needs to be addressed.
“A lot of these companies are based out of the United States. Not all of them, but a lot of them are. But in order for Canada to be able to use them, they generally have to get data centres in Canada, so that all this data from Canadian patients in the Canadian healthcare industry goes to servers within Canada. It’s processed in Canada, stored in Canada, and doesn’t leave Canada, even to the United States. Those are some of the requirements, which doesn’t necessarily make it safe, but it’s to try and help prevent misuse of the data,” says Karismanis.
Beyond documentation, AI is also playing a role in training and educating healthcare professionals, especially through simulations, which continue to become more realistic and structured.
“We're doing far more simulations now than we used to do, and artificial intelligence will help with that by creating more immersive and real-life type of simulations that can change and mimic real-life scenarios,” says Karismanis.
AI is also being tested to help hospital triage systems run faster, potentially assisting nurses by gathering required data from each patient and playing a role in an initial assessment to narrow down the reason for their visit.
“I think one big one is being able to have AI interact with the patient a little bit more and maybe generate some more information. Our nurses do a great job, but when multiple people come at once, especially in a busy hospital like ours, we can have five or 10 or more people waiting for triage sometimes. And if we can get automation in there with AI, that would help improve the triage speed, possibly improve triage accuracy, and also standardization a little bit more,” says Karismanis.
Specialized Care
While AI is being used to decrease the workload in hospitals, other uses are being found in more specialized areas like speech pathology. Anit Kanwar Singh Saini, a speech-language pathologist, says that AI can help generate learning plans for patients.
“A good example is coming up with word lists for children that are at once appropriate and might have the speech sounds that they're targeting for a speech goal, but also maybe based in a single theme to build vocabulary in a single area,” Saini says.
And while AI has its uses, we’ve learned that it also carries a privacy risk, especially when dealing with personal details.
“I’ve never done this, but you might be inclined to take your scratch notes of a kid, written in prose, and pop them into, say, ChatGPT and say, write me a report based on this kid. Well, you just put their first and last name, location, date of birth, personal information into the system. Where is that going? It might spit you out a lovely report.
“You might delete your history. You might delete your chat history with ChatGPT. But where does that go? Right? Where is it? Can you later ask it to recall that information, and if it does, where’s it storing that information? Where are its servers?” Saini says.
Saini argues that AI can’t replace the human element of connection in his field and is hopeful that it won’t anytime soon.
“I think speech and language is such an intimate practice, socialization and the socialization of humans is not going to be cracked, in my opinion, assumed by AI models. I think they're very realistic. But the social-emotional learning and being of a human being intersecting with the conversation that they have is not going to be cracked anytime soon,” Saini says.
A Balanced Approach
The rise of AI in healthcare is a double-edged sword. On one hand, it offers unprecedented accessibility and efficiency. On the other, it raises questions about how safety, ethics and the irreplaceable value of human empathy can be preserved.
For individuals like Tyra, AI has been a game-changer, offering a sense of understanding when traditional therapy fails.
And for doctors like Dr. Karismanis, AI is making his job, and many others’, easier so they can deliver even better care. As AI continues to seep into the world of health care, we can only watch as it grows more capable and more helpful for medical professionals and the people they serve.