A study from Stanford University concluded that people struggling with mental health concerns should not replace a human therapist with an artificial intelligence chatbot.

According to the study’s authors, the research was prompted by reports from doctors that many young people say they are just as comfortable, if not more so, discussing mental health concerns with a chatbot as with a person.

However, the study found that ChatGPT and other AI chatbots tend to give users the answers they think those users are looking for. If someone brings up suicide, even just to discuss it, the AI might respond with information about how to carry out the act rather than directing the person to help.

Scott Budman has the full report in the video above.