{"id":413248,"date":"2025-11-29T17:05:11","date_gmt":"2025-11-29T17:05:11","guid":{"rendered":"https:\/\/www.europesays.com\/us\/413248\/"},"modified":"2025-11-29T17:05:11","modified_gmt":"2025-11-29T17:05:11","slug":"teens-are-mourning-their-ai-chatbots","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/us\/413248\/","title":{"rendered":"Teens Are Mourning Their AI Chatbots"},"content":{"rendered":"<p class=\"storyParagraph\">&#13;<br \/>\n                                        The teen rebellion over disappearing AI chatbots has officially begun. After months of warning, the popular chatbot platform Character.AI has started cutting off access for users under 18, and the <a href=\"https:\/\/www.wsj.com\/tech\/ai\/character-ai-teen-access-mental-health-4ec02a43\" target=\"_blank\" rel=\"noopener\">Wall Street Journal<\/a> says teens have been taking it &#8230; not well. Posts of grief and frustration are popping up across Reddit, where one teen wrote, &#8220;I cried over it for days.&#8221; The company had already put a two-hour daily limit in place in November as the <a href=\"https:\/\/www.newser.com\/story\/377728\/chatbots-very-very-bold-move-banning-minors.html\" target=\"_blank\" rel=\"noopener\">first step<\/a> toward restricting underage users, but the policy has now escalated into a full ban following the deaths of two teen users and mounting scrutiny from regulators and mental health professionals. For many young users, the shift doesn&#8217;t just mean losing an app\u2014it means losing relationships they felt were meaningful. &#8220;I&#8217;m losing the memories I had with these bots,&#8221; says 13-year-old Olga Lopez.&#13;\n                                    <\/p>\n<p class=\"storyParagraph\">&#13;<br \/>\n                                        Several teens told the Journal they relied on chatbots for comfort when human conversations felt too hard or scarce. 
&#8220;I use this app for comfort when I can&#8217;t talk to my friends or therapist,&#8221; says one teen. Dr. Nina Vasan of Stanford Medicine said AI companions can feel like a blend of friend, performer, and mirror. &#8220;The difficulty logging off doesn&#8217;t mean something is wrong with the teen,&#8221; she says. &#8220;It means the tech worked exactly as designed.&#8221; Character.AI&#8217;s chief executive, Karandeep Anand, said the company felt compelled to intervene as it observed teens using bots for hours at a time or veering toward restricted topics. &#8220;This wasn&#8217;t a very hard decision,&#8221; he says.&#13;\n                                    <\/p>\n<p class=\"storyParagraph\">&#13;<br \/>\n                                        To guide the rollout, Character.AI consulted with teens through the nonprofit ConnectSafely, aiming to communicate the ban clearly, avoid condescension, and give young users time to download their chat histories. &#8220;We wanted to make sure teens didn&#8217;t feel abandoned,&#8221; says ConnectSafely&#8217;s Julianna Bryant. Character.AI apologized for the ban, telling teens in a letter: &#8220;We are deeply sorry.&#8221; But it insists the restriction is necessary. Meanwhile, mental health professionals tell <a href=\"https:\/\/www.cnbc.com\/2025\/11\/24\/characterai-to-ban-teens-from-open-ended-chats-human-interaction-is-crucial-psychotherapist-says.html\" target=\"_blank\" rel=\"noopener\">CNBC<\/a> that abruptly cutting off access to chatbots may itself be stressful for dependent users. Character.AI says it&#8217;s aware of that risk and is adding emotional support tools while partnering with Koko and ThroughLine to connect at-risk users with real-world help. 
But when the cutoff message finally appears, the app offers only a quiet goodbye: &#8220;A new chapter begins.&#8221;&#13;\n                                    <\/p>\n","protected":false},"excerpt":{"rendered":"&#13; The teen rebellion over disappearing AI chatbots has officially begun. After months of warning, the popular chatbot&hellip;\n","protected":false},"author":3,"featured_media":413249,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[21],"tags":[1556,691,738,91063,123404,517,158,3981,67,132,68],"class_list":{"0":"post-413248","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-artificial-intelligence","8":"tag-addiction","9":"tag-ai","10":"tag-artificial-intelligence","11":"tag-character-ai","12":"tag-chatbot","13":"tag-mental-health","14":"tag-technology","15":"tag-teens","16":"tag-united-states","17":"tag-unitedstates","18":"tag-us"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@us\/115634079557108045","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/413248","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/comments?post=413248"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/413248\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media\/413249"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media?parent=413248"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesa
ys.com\/us\/wp-json\/wp\/v2\/categories?post=413248"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/tags?post=413248"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}