AI chatbots could help reduce mental health stigma

AI chatbots may help reduce mental health stigma, particularly for people hesitant to seek traditional face-to-face support.

New Edith Cowan University (ECU) research suggests artificial intelligence chatbots like ChatGPT may help reduce mental health stigma, particularly for people hesitant to seek traditional face-to-face support.

The study, As Effective as You Perceive It: The Relationship Between ChatGPT's Perceived Effectiveness and Mental Health Stigma, led by ECU Master of Clinical Psychology student Scott Hannah, with supervision from Professor Joanne Dickson, is one of the first to examine how using ChatGPT for mental health concerns relates to stigma.

Researchers surveyed 73 people who had used ChatGPT for personal mental health support, examining how their use of the tool and its perceived effectiveness related to stigma.

"The findings suggest that believing the tool is effective plays an important role in reducing concerns about external judgment," Mr Hannah said.

Understanding stigma

Stigma is a major barrier to seeking mental health help. It can worsen symptoms and discourage people from accessing support.

The study focused on two forms of stigma:

  • Anticipated stigma — fear of being judged or discriminated against
  • Self-stigma — internalising negative stereotypes, which reduces confidence and help-seeking

Key findings

People who felt ChatGPT was effective were:

  • more likely to use it
  • more likely to report reduced anticipated stigma, meaning less fear of being judged

AI mental health use on the rise

As AI tools become more common, people are using chatbots for private, anonymous conversations about their mental health concerns.

"From a sample of almost 400 participants in this study, almost 20 per cent were engaging or had already engaged with ChatGPT for mental health purposes, and almost 30 per cent were open to the idea if faced with a mental health difficulty," Mr Hannah said.

"These results suggest that, despite not being designed for therapeutic use, AI tools such as ChatGPT are becoming more widely used for mental health purposes."

It may be easier to open up to AI, but be wary

Mr Hannah said anonymous digital tools may offer early support to those reluctant to seek help.

"Many people still worry about being judged for struggling with their mental health," he said.

"When people feel ChatGPT is helpful, it may ease some of that fear and encourage them to open up.

"However, there are important ethical considerations, as ChatGPT was not designed for therapeutic purposes, and recent research has shown that its responses can sometimes be inappropriate or inaccurate. Therefore, we encourage users to engage with AI-based mental health tools critically and responsibly."

Professor Dickson said AI may provide an accessible bridge for people facing stigma-related barriers.

"AI isn't a replacement for professional care, but perceptions of support can help reduce stigma," she said.

More research needed

Professor Dickson said more work is required to understand how AI can safely complement mental health services.

"As AI grows, it's crucial we understand its impact so we can guide best practice," Professor Dickson said.

The study As Effective as You Perceive It: The Relationship Between ChatGPT’s Perceived Effectiveness and Mental Health Stigma was published in the journal Behavioural Sciences.

Media contacts

For all queries from journalists, official statements from the University or to speak to one of our subject matter experts, please contact our Corporate Relations team.

Telephone: +61 8 6304 2222
Email: pr@ecu.edu.au