<p>States struggle with AI therapy app rules amid mental health needs</p>
<p>In the absence of stronger federal regulation, some states have begun regulating apps that offer AI “therapy” as more people turn to <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://apnews.com/article/ai-chatbots-selfharm-chatgpt-claude-gemini-da00880b1e1577ac332ab1752e41225b" target="_blank" rel="noopener">artificial intelligence for mental health advice</a>.</p>
<p>But the laws, all passed this year, don’t fully address the fast-changing landscape of AI software development. And app developers, policymakers and mental health advocates say the resulting patchwork of state laws isn’t enough to protect users or hold the creators of harmful technology accountable.</p>
<p>“The reality is millions of people are using these tools and they’re not going back,” said Karin Andrea Stephan, CEO and co-founder of the mental health chatbot app Earkick.</p>
<p>___</p>
<p>EDITOR’S NOTE — This story includes discussion of suicide. If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988. There is also an online chat at 988lifeline.org.</p>
<p>___</p>
<p>The state laws take different approaches. <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://idfpr.illinois.gov/content/dam/soi/en/web/idfpr/news/2025/2025-08-04-idfpr-press-release-hb1806.pdf" target="_blank" rel="noopener">Illinois</a> and <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://www.leg.state.nv.us/Session/83rd2025/Bills/AB/AB406_EN.pdf" target="_blank" rel="noopener">Nevada</a> have banned the use of AI to treat mental health. <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://le.utah.gov/~2025/bills/static/HB0452.html" target="_blank" rel="noopener">Utah</a> placed certain limits on therapy chatbots, including requiring them to protect users’ health information and to clearly disclose that the chatbot isn’t human. <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://www.palegis.us/legislation/bills/text/PDF/2025/0/SB0631/PN0635" target="_blank" rel="noopener">Pennsylvania</a>, <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://www.njleg.state.nj.us/bill-search/2024/A5603" target="_blank" rel="noopener">New Jersey</a> and <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202520260SB579" target="_blank" rel="noopener">California</a> are also considering ways to regulate AI therapy.</p>
<p>The impact on users varies. Some apps have blocked access in states with bans. Others say they’re making no changes as they wait for more legal clarity.</p>
<p>And many of the laws don’t cover <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://apnews.com/article/openai-chatgpt-california-delaware-ags-3b035de96e74c6839aa12143e2225cf9" target="_blank" rel="noopener">generic chatbots like ChatGPT</a>, which are not explicitly marketed for therapy but are used by an untold number of people for it. Those bots have attracted lawsuits in horrific instances where users <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://www.nature.com/articles/d41586-025-03020-9#:~:text=It%20seems%20that%20some%20people,that%20information%E2%80%9D%2C%20says%20Seymour." target="_blank" rel="noopener">lost their grip on reality</a> or <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://apnews.com/article/chatgpt-study-harmful-advice-teens-c569cddf28f1f33b36c692428c2191d4" target="_blank" rel="noopener">took their own lives</a> after interacting with them.</p>
<p>Vaile Wright, who oversees health care innovation at the American Psychological Association, agreed that the apps could fill a need, noting a nationwide <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://www.apa.org/monitor/2024/01/trends-pathways-access-mental-health-care" target="_blank" rel="noopener">shortage of mental health providers</a>, high costs for care and uneven access for insured patients.</p>
<p>Mental health chatbots that are rooted in science, created with expert input and monitored by humans could change the landscape, Wright said.</p>
<p>“This could be something that helps people before they get to crisis,” she said. “That’s not what’s on the commercial market currently.”</p>
<p>That’s why federal regulation and oversight are needed, she said.</p>
<p>Earlier this month, the Federal Trade Commission announced it was <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://www.ftc.gov/news-events/news/press-releases/2025/09/ftc-launches-inquiry-ai-chatbots-acting-companions?utm_source=GovDelivery" target="_blank" rel="noopener">opening inquiries into seven AI chatbot companies</a> — including the parent companies of Instagram and Facebook, Google, ChatGPT, Grok (the chatbot on X), Character.AI and Snapchat — on how they “measure, test and monitor potentially negative impacts of this technology on children and teens.” And the Food and Drug Administration is convening an advisory committee Nov. 6 to review <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://www.federalregister.gov/documents/2025/09/12/2025-17651/digital-health-advisory-committee-notice-of-meeting-establishment-of-a-public-docket-request-for" target="_blank" rel="noopener">generative AI-enabled mental health devices</a>.</p>
<p>Federal agencies could consider restrictions on how chatbots are marketed, limit addictive practices, require disclosures to users that they are not medical providers, require companies to track and report suicidal thoughts, and offer legal protections for people who report bad practices by companies, Wright said.</p>
<p>Not all apps have blocked access</p>
<p>From “companion apps” to “AI therapists” to “mental wellness” apps, AI’s use in mental health care is varied and hard to define, let alone write laws around.</p>
<p>That has led to different regulatory approaches. Some states, for example, take aim at <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://apnews.com/article/ai-chatbot-teens-congress-chatgpt-character-ce3959b6a3ea1a4997bf1ccabb4f0de2" target="_blank" rel="noopener">companion apps</a> that are <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://apnews.com/article/ai-companion-generative-teens-mental-health-9ce59a2b250f3bd0187a717ffa2ad21f" target="_blank" rel="noopener">designed just for friendship</a> but don’t wade into mental health care. The laws in Illinois and Nevada outright ban products that claim to provide mental health treatment, threatening fines up to $10,000 in Illinois and $15,000 in Nevada.</p>
<p>But even a single app can be tough to categorize.</p>
<p>Earkick’s Stephan said there is still a lot that is “very muddy” about Illinois’ law, for example, and the company has not limited access there.</p>
<p>Stephan and her team initially held off calling their chatbot, which looks like a cartoon panda, a therapist. But when users began using the word in reviews, they embraced the terminology so the app would show up in searches.</p>
<p>Last week, they backed off using therapy and medical terms again. Earkick’s website had described its chatbot as “Your empathetic AI counselor, equipped to support your mental health journey,” but now it’s a “chatbot for self care.”</p>
<p>Still, “we’re not diagnosing,” Stephan maintained.</p>
<p>Users can set up a “panic button” to call a trusted loved one if they are in crisis, and the chatbot will “nudge” users to seek out a therapist if their mental health worsens. But it was never designed to be a suicide prevention app, Stephan said, and police would not be called if someone told the bot about thoughts of self-harm.</p>
<p>Stephan said she’s happy that people are looking at AI with a critical eye, but worried about states’ ability to keep up with innovation.</p>
<p>“The speed at which everything is evolving is massive,” she said.</p>
<p>Other apps blocked access immediately. When Illinois users download the AI therapy app Ash, a message urges them to email their legislators, arguing that “misguided legislation” has banned apps like Ash “while leaving unregulated chatbots it intended to regulate free to cause harm.”</p>
<p>A spokesperson for Ash did not respond to multiple requests for an interview.</p>
<p>Mario Treto Jr., secretary of the Illinois Department of Financial and Professional Regulation, said the goal was ultimately to make sure licensed therapists were the only ones doing therapy.</p>
<p>“Therapy is more than just word exchanges,” Treto said. “It requires empathy, it requires clinical judgment, it requires ethical responsibility, none of which AI can truly replicate right now.”</p>
<p>One chatbot company is trying to fully replicate therapy</p>
<p>In March, a Dartmouth College-based team published the first known <a class="Link AnClick-LinkEnhancement" data-gtm-enhancement-style="LinkEnhancementA" href="https://ai.nejm.org/doi/full/10.1056/AIoa2400802" target="_blank" rel="noopener">randomized clinical trial</a> of a generative AI chatbot for mental health treatment.</p>
<p>The goal was to have the chatbot, called Therabot, treat people diagnosed with anxiety, depression or eating disorders. It was trained on vignettes and transcripts written by the team to illustrate an evidence-based response.</p>
<p>The study found users rated Therabot similarly to a therapist and had meaningfully lower symptoms after eight weeks compared with people who didn’t use it. Every interaction was monitored by a human who intervened if the chatbot’s response was harmful or not evidence-based.</p>
<p>Nicholas Jacobson, a clinical psychologist whose lab is leading the research, said the results showed early promise but that larger studies are needed to demonstrate whether Therabot works for large numbers of people.</p>
<p>“The space is so dramatically new that I think the field needs to proceed with much greater caution than is happening right now,” he said.</p>
<p>Many AI apps are optimized for engagement and are built to support everything users say, rather than challenging people’s thoughts the way therapists do. Many walk the line between companionship and therapy, blurring intimacy boundaries that therapists would not ethically cross.</p>
<p>Therabot’s team sought to avoid those issues.</p>
<p>The app is still in testing and not widely available. But Jacobson worries about what strict bans will mean for developers taking a careful approach. He noted that Illinois has no clear pathway for providing evidence that an app is safe and effective.</p>
<p>“They want to protect folks, but the traditional system right now is really failing folks,” he said. “So, trying to stick with the status quo is really not the thing to do.”</p>
<p>Regulators and advocates of the laws say they are open to changes. But today’s chatbots are not a solution to the mental health provider shortage, said Kyle Hillman, who lobbied for the bills in Illinois and Nevada through his affiliation with the National Association of Social Workers.</p>
<p>“Not everybody who’s feeling sad needs a therapist,” he said. But for people with real mental health issues or suicidal thoughts, “telling them, ‘I know that there’s a workforce shortage but here’s a bot’ — that is such a privileged position.”</p>
<p>___</p>
<p>The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute’s Department of Science Education and the Robert Wood Johnson Foundation. The AP is solely responsible for all content.</p>