
Mental Health in the Age of AI commission

AI tools are becoming a part of everyday life. Some people now use them to look for help with their mental health. But sometimes these tools give advice that is wrong or unsafe. We've launched a new Commission to learn how AI is being used, what’s working, and what could cause harm.

What is the Mental Health in the Age of AI commission?

The Commission brings people together to look at how AI can affect mental health, including its benefits and its challenges.
It is led by a steering group, and its activities give members of the public and people with personal experience the chance to share their views and evidence.

The Commission will:

  • Bring together people with lived experience, experts and decision‑makers
  • Look at how AI is already being used for mental health
  • Find out where it helps, and where it causes harm
  • Share what we learn in clear reports
  • Create trusted guidance for the public

Why are we doing this now?

More people are telling us that they’ve been confused, upset or harmed by advice from AI tools.

Some examples include:

  • People receiving wrong or risky information about serious conditions
  • People turning to AI because they feel they can’t talk to anyone else
  • People forming emotional or “therapy‑like” relationships with AI tools not designed to give mental health support

AI is becoming common very quickly. We want to make sure people get information that is safe, clear, and based on real evidence.

Why Mind?

Mind has supported people with mental health problems for many years. We listen to people's real experiences and focus on what is safe and right for them. We’re independent, not part of government or a tech company, which means we can ask the questions that really matter.

What happens next?

Over the next year, the Commission will share updates, reports and clear guidance to help people use AI safely. We’ll also hold events and speak with experts, communities and decision‑makers.

Getting support

AI can sometimes create things that are untrue, dangerous, illegal or scary. If you need help with something you have seen, Mind's safeguarding team can help. 

Email the safeguarding team

Our advisors are available 9am to 6pm every weekday.

I need help right now

Mind's safeguarding team is not a crisis helpline. If you need urgent help, Mind can support you.

Urgent help

Get involved

We'd love to know how you're using AI, whether it's been helpful, and whether it has ever given you bad advice.

We'll soon be asking people to share their experiences with us. Check back for more information on how to get involved.

For journalists and the media

If you are writing a story about AI and mental health, Mind is here to help. Our spokespeople can talk about the Commission and explain our concerns around AI and mental health. Visit our media centre to find the right person to talk to.

Mind's media centre

Mind's policies

Learn about more of Mind's policies and what we campaign on.

Our policy work
