April 26, 2026


Security behind AI created to help students with mental health issues

With students now able to get mental health support by speaking with an AI chatbot, what happens if the problems become too big for the AI to handle?

Plus, with this kind of sensitive information, what does the company need to do to make sure it's protecting students' data?

The AI used by Utah-based ElizaChat doesn't pull information from the open internet to advise the students who log on. Instead, it draws on a dedicated database filled with reports from trained counselors and psychologists.
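The article doesn't describe how that lookup works under the hood, but the general pattern, answering only from a vetted corpus instead of the open web, can be sketched in a few lines of Python. Everything below (the corpus, the word-overlap scoring, the function names) is a hypothetical illustration, not ElizaChat's actual design.

```python
# Minimal sketch of grounding a chatbot in a curated corpus rather than
# the open internet. All names and the crude scoring method are
# hypothetical; the article does not describe ElizaChat's retrieval design.
from dataclasses import dataclass

@dataclass
class Document:
    title: str
    text: str  # guidance written by trained counselors and psychologists

CORPUS = [
    Document("Exam stress", "Break studying into short blocks and schedule real breaks."),
    Document("Sleep habits", "Keep a consistent bedtime and limit screens before sleep."),
]

def score(query: str, doc: Document) -> int:
    """Crude relevance: count query words that appear in the document."""
    words = set(query.lower().split())
    haystack = (doc.title + " " + doc.text).lower()
    return sum(1 for w in words if w in haystack)

def answer(query: str) -> str:
    """Respond only from the vetted corpus -- never improvise from the web."""
    best = max(CORPUS, key=lambda d: score(query, d))
    if score(query, best) == 0:
        return "I don't have vetted guidance on that. Let's loop in a counselor."
    return f"Based on counselor guidance ({best.title}): {best.text}"

print(answer("I'm stressed about my exam"))
```

The important design choice in a sketch like this is the fallback: when nothing in the vetted material matches, the bot declines and points toward a human instead of improvising.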

But what does the AI do when it encounters a student who needs more help than it can provide?

The makers of ElizaChat want to make one thing clear: it is not meant to replace therapists. It's designed to help students with smaller problems.

Company CEO David Barney acknowledged the AI has limitations.

“If someone has a major depressive disorder, Eliza is not adequately trained, yet, to handle those types of issues. So, we can help with, maybe, tier one mental health type of issues,” Barney said.

So what happens when a student says they're thinking of taking their own life, or that they're being abused? What does the AI do with that information?

Barney said, “If Johnny from some high school in Utah is talking to the service and there’s an immediate suicide risk, we know Johnny attends that high school, we know who the psychologists and counselors are for that school and we can send emergency messaging to them saying, ‘Johnny is in need of help, right now.’”
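That answer implies a fairly simple routing step: the school is already known from the login, so an escalation only needs a lookup from school to counselor contacts plus an urgent message. Here is a hedged sketch of such a flow in Python; the keyword-based risk check, the directory, and send_alert are all stand-ins, since the article doesn't disclose how ElizaChat actually detects risk or delivers alerts.

```python
# Hypothetical sketch of the escalation flow Barney describes: derive the
# school from the student's login email, look up that school's counselors,
# and send them an urgent alert. Every name and value here is an assumption.
SCHOOL_COUNSELORS = {
    "examplehigh.schools.utah.gov": ["counselor@examplehigh.schools.utah.gov"],
}

RISK_PHRASES = ("kill myself", "end my life", "want to die")

def detect_immediate_risk(message: str) -> bool:
    """Toy stand-in for a real clinical risk classifier."""
    text = message.lower()
    return any(phrase in text for phrase in RISK_PHRASES)

def send_alert(recipient: str, body: str) -> None:
    """Placeholder for the service's real emergency messaging channel."""
    print(f"ALERT -> {recipient}: {body}")

def escalate(student_email: str, message: str) -> None:
    if not detect_immediate_risk(message):
        return
    school_domain = student_email.split("@", 1)[1]  # school known from the login
    for contact in SCHOOL_COUNSELORS.get(school_domain, []):
        send_alert(contact, f"Student {student_email} is in need of help, right now.")

escalate("johnny@examplehigh.schools.utah.gov", "I want to end my life")
```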

That brings up another question. How do they know who “Johnny” is?

“The way it works is you would log into Eliza with your school email. Now, the contents of the conversation are anonymous. Eliza is being built HIPAA compliant,” Barney said.
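One common way to reconcile “we know who Johnny is” with “the contents of the conversation are anonymous” is to key transcripts to a pseudonymous ID derived from the login and keep the identity mapping in a separate, locked-down store. The sketch below illustrates that general pattern only; it is an assumption about the design, and real HIPAA compliance involves far more (access controls, audit logging, encryption at rest, business associate agreements).

```python
# Sketch of pseudonymous storage: chat contents are keyed to a keyed hash
# of the school email and kept apart from the table that maps IDs back to
# identities. An illustration of a common pattern, not ElizaChat's design.
import hashlib
import hmac
import os

SALT = os.environ.get("PSEUDONYM_SALT", "dev-only-salt").encode()

def pseudonym(email: str) -> str:
    """Stable pseudonymous ID derived from the school email."""
    return hmac.new(SALT, email.lower().encode(), hashlib.sha256).hexdigest()[:16]

identity_store = {}  # pseudonym -> email; locked down, used only for escalation
transcripts = {}     # pseudonym -> messages; holds no direct identifiers

def log_message(email: str, message: str) -> None:
    pid = pseudonym(email)
    identity_store[pid] = email
    transcripts.setdefault(pid, []).append(message)

log_message("johnny@examplehigh.schools.utah.gov", "I've been feeling down lately.")
print(transcripts)  # contents live under the pseudonym, not the email
```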

Attorney Stephen Hester specializes in healthcare law, and he said an AI service like this absolutely can be HIPAA compliant. However, it also has to comply with FERPA, the Family Educational Rights and Privacy Act, which is an extra layer of protection designed to safeguard student information.

Hester said, “It contains very, very strict confidentiality provisions for student information. For example, I have kids in college and I can’t know anything about their class schedules, their finances, or anything, without their consent.”

In counseling, there are times when information has to be released without a patient's consent. If a student tells their counselor they're being abused, the counselor is required to report it to the authorities.

Does ElizaChat have to follow the same rules? Hester said the law isn’t as clear as people might expect.

Hester said, “You need to be very specific if you, for example, want to engage in providing healthcare services to children, any kind of healthcare service. You need to be aware of mandatory reporting requirements specific to kids.”

Officials with ElizaChat said that even though the law doesn't require them to report cases of abuse or suicidal thoughts, the company will act as if it were a mandated reporter.

On top of all that, they have to make sure those chats stay private.

Pete Ashdown, founder of the internet service provider XMission, said, “Anything that's online has security questions.”

According to Ashdown, any company offering mental health services needs to make sure its encryption is sound, its firewalls are strong, and its software updates don't open new holes. Hackers have ways of finding fresh security flaws after an update ships and exploiting them before they're closed.

“There is always a window of opportunity there, and what hackers use is they have a script that goes through all of the possible holes in somebody’s system to see what works and what doesn’t,” he said.
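In its simplest form, the script Ashdown describes is just an automated probe. The toy Python scanner below shows the idea, walking a range of ports on a host to see which services answer; it's illustrative only and should never be pointed at systems you don't own or have authorization to test.

```python
# Toy illustration of automated probing: try a range of TCP ports on a
# host and report which ones accept a connection. For authorized testing
# of your own systems only.
import socket

def open_ports(host: str, ports: range, timeout: float = 0.5) -> list[int]:
    """Return the ports on `host` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means connection succeeded
                found.append(port)
    return found

print(open_ports("127.0.0.1", range(1, 1025)))
```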
