{"id":383030,"date":"2026-03-13T11:26:54","date_gmt":"2026-03-13T11:26:54","guid":{"rendered":"https:\/\/www.europesays.com\/ie\/383030\/"},"modified":"2026-03-13T11:26:54","modified_gmt":"2026-03-13T11:26:54","slug":"why-millions-are-turning-to-chatgpt-for-mental-health","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/ie\/383030\/","title":{"rendered":"Why Millions Are Turning to ChatGPT for Mental Health"},"content":{"rendered":"<p><strong>Summary: <\/strong>As traditional healthcare systems struggle with long waiting lists and rising costs, a massive global survey reveals a seismic shift in public trust toward Artificial Intelligence. The study, involving 31,000 adults across 35 countries, found that 41% of UK adults (and 61% globally) are now comfortable using ChatGPT as a mental health counselor.<\/p>\n<p>While AI\u2019s non-judgmental tone and 24\/7 availability offer a sense of security and companionship for many, experts warn that these tools are \u201cno substitute\u201d for professional care and raise concerns about the long-term impact on cognitive functions like memory and learning.<\/p>\n<p><strong>Key Facts<\/strong><\/p>\n<ul class=\"wp-block-list\">\n<li><strong>The \u201cCounselor\u201d Shift:<\/strong> 41% of UK respondents would use AI for counseling, likely driven by long wait times for traditional mental health services.<\/li>\n<li><strong>Universal Companion:<\/strong> Three-quarters of people globally (and over half in the UK) are willing to use AI as a friend or companion, drawn to its adaptive tone and \u201cprivate\u201d conversation feel.<\/li>\n<li><strong>Trust in Medicine:<\/strong> 45% of people globally (25% in the UK) would trust AI to act as their doctor, with higher trust levels in regions where healthcare is expensive or inaccessible.<\/li>\n<li><strong>Educational Concerns:<\/strong> 25% of UK adults would delegate teaching their children to AI. 
Researchers warn this could lead to \u201cprompt-focused\u201d learning rather than deep information retention.<\/li>\n<li><strong>Biological Risks:<\/strong> Neuroscientists expressed concern that replacing traditional learning with excessive AI reliance could physically shrink the <strong>hippocampus<\/strong>, the brain region critical for memory and spatial awareness.<\/li>\n<\/ul>\n<p><strong>Source: <\/strong>Bournemouth University<\/p>\n<p><strong>More than 4 in 10 adults in the UK are happy to use ChatGPT for their mental health support, new research suggests.\u00a0<\/strong><\/p>\n<p>The study, led by Bournemouth University, surveyed\u00a0nearly 31,000\u00a0adults in 35 countries\u00a0about their use of\u00a0Artificial\u00a0Intelligence\u00a0(AI)\u00a0large language models such as ChatGPT. The research\u00a0also\u00a0discovered that:\u00a0<\/p>\n<ul class=\"wp-block-list\">\n<li>One quarter of UK adults would be happy to delegate the role of teaching their children to AI.\u00a0<\/li>\n<li>Globally, 45% of people would trust AI models to take on the role of their doctor.\u00a0<\/li>\n<li>Three quarters of people surveyed said they\u00a0would use an AI chat tool as a companion and a friend.\u00a0\u00a0\u00a0<\/li>\n<\/ul>\n<p>The study has been published in the journal\u00a0AI and Society.\u00a0\u00a0<\/p>\n<p>Dr Ala\u00a0Yankouskaya, Senior Lecturer in Psychology at Bournemouth University,\u00a0who led the study, said:\u00a0\u201cWith the rapid development and mass availability of AI,\u00a0more\u00a0people are placing their trust in it.\u00a0We wanted to learn more about how people would trust generative AI tools, such as ChatGPT, to carry out some of the most important roles in their daily lives.\u201d\u00a0\u00a0<\/p>\n<p><strong>AI for mental health support\u00a0\u00a0<\/strong><\/p>\n<p>41% of participants from the UK, and 61% globally, said that they would be happy to use AI for counselling services. 
The researchers\u00a0suggest\u00a0that for the\u00a0UK, this\u00a0could be the result of the waiting times many people face to access the mental health services that they need.\u00a0\u00a0<\/p>\n<p>\u201cIf someone is experiencing depression, they do not want to wait months for an appointment, so instead they can turn to AI,\u201d Dr\u00a0Yankouskaya\u00a0said.<\/p>\n<p>\u201cHowever, when I tested some of the tools myself,\u00a0I found the language\u00a0used very vague and confusing because the developers are careful not to jump into providing diagnoses.\u00a0So,\u00a0it is no substitute for speaking to a health professional.\u201d\u00a0<\/p>\n<p>The researchers also noted that users were\u00a0already\u00a0familiar with NHS chatbots, which use similar AI technology, and this could be normalising their use of AI in other apps such as ChatGPT\u00a0for\u00a0their mental health care.\u00a0<\/p>\n<p><strong>AI as a teacher\u00a0<\/strong><\/p>\n<p>A quarter of people in the UK and half of everyone surveyed globally said that they would trust AI to carry out the role of\u00a0a\u00a0teacher, which the research\u00a0team found particularly concerning.\u00a0\u00a0<\/p>\n<p>\u201cIt really knocked me down when I saw how many people would be willing to delegate\u00a0the role of teaching their children to AI,\u201d\u00a0Dr\u00a0Yankouskaya\u00a0explained.<\/p>\n<p>\u201cWe still do not know the long-term effects that using these tools for education could have on children\u2019s memory and cognitive functions. 
We could be heading to the stage where we are developing children who are good at putting prompts into AI tools but not as good at taking the information in,\u201d she continued.\u00a0\u00a0<\/p>\n<p>The researchers were also concerned about the long-term physical effects on the brain if learning information in the traditional way\u00a0was\u00a0replaced by\u00a0excessive\u00a0search-engine\u00a0use,\u00a0and whether this could shrink the hippocampus, the region of the brain\u00a0that is used for spatial awareness and learning.\u00a0<\/p>\n<p><strong>AI as a doctor\u00a0<\/strong><\/p>\n<p>45% of all respondents and 25% in the UK said that they would trust AI to carry out the role of their doctor. The numbers were notably higher in countries where healthcare is more expensive and harder to access.\u00a0\u00a0<\/p>\n<p>This\u00a0was\u00a0less surprising to the researchers, who believe that\u00a0people\u00a0living in parts of the world where\u00a0access to healthcare services\u00a0is not readily available might rely on technology for quick answers.\u00a0\u00a0<\/p>\n<p>However, they were cautious about the\u00a0underlying\u00a0algorithms\u00a0used to\u00a0retain\u00a0the user\u2019s attention and keep them in a relaxed chat. This could be particularly harmful for mental health advice, where the traditional\u00a0approach would be to alert the user to specific services such as The Samaritans.\u00a0<\/p>\n<p><strong>AI as a companion\u00a0<\/strong><\/p>\n<p>The highest level of trust participants\u00a0were\u00a0willing to place in AI came in the role of friendship. 
Over three quarters of people globally and over half of people in the UK said they would talk to ChatGPT as a companion.\u00a0\u00a0<\/p>\n<p>The researchers think this is explained by a perceived sense of empathy from generative language tools because they are designed to adapt the tone of their responses to suit the user\u2019s.\u00a0<\/p>\n<p>\u201cAI tools come across as a friend who knows you well and understands you,\u201d Dr\u00a0Yankouskaya\u00a0explained.<\/p>\n<p>\u201cChatGPT can remember every chat it has had with a\u00a0user\u00a0and it feels like a private conversation between them. Nowadays people can be\u00a0very sensitive\u00a0to being judged and AI tools are designed to be non-judgemental. This means they can provide the sense of security people need,\u201d she continued.\u00a0\u00a0<\/p>\n<p>Dr\u00a0Yankouskaya\u00a0and the team concluded\u00a0that as a bigger role for AI in people\u2019s lives moves from a theoretical prospect to reality, there needs to be more awareness within societies about how generative AI tools work and their limitations.<\/p>\n<p>The lack of knowledge about the\u00a0long-term\u00a0effects on someone\u2019s memory means caution needs to be applied before these tools take over roles in\u00a0education in particular.\u00a0\u00a0<\/p>\n<p>Key Questions Answered:<strong class=\"schema-faq-question\">Q: Why would someone talk to a computer instead of a real therapist?<\/strong><\/p>\n<p class=\"schema-faq-answer\"><strong>A:<\/strong> It often comes down to access and judgment. If you\u2019re struggling with depression, waiting months for an appointment isn\u2019t an option. 
AI provides an immediate \u201clistening ear.\u201d Furthermore, people are often sensitive about being judged; AI is designed to be non-judgmental and \u201cremembers\u201d every past conversation, making it feel like a supportive, private friend.<\/p>\n<p><strong class=\"schema-faq-question\">Q: Can AI actually give good medical or mental health advice?<\/strong><\/p>\n<p class=\"schema-faq-answer\"><strong>A:<\/strong> Not quite. While it can offer general support, researchers found that the language used by AI is often vague and confusing because developers are careful not to provide clinical diagnoses. It can\u2019t replace the nuance and safety of a human professional, especially in crisis situations where specific emergency services are needed.<\/p>\n<p><strong class=\"schema-faq-question\">Q: Is using AI for everything bad for my brain?<\/strong><\/p>\n<p class=\"schema-faq-answer\"><strong>A:<\/strong> It might be. Researchers are worried about \u201ccognitive outsourcing.\u201d If we rely on AI to find every answer and teach our children, we might stop using the parts of our brain responsible for deep memory and learning. 
Over time, this lack of \u201cmental exercise\u201d could lead to a smaller hippocampus and reduced cognitive flexibility.<\/p>\n<p>Editorial Notes:<\/p>\n<ul style=\"background-color:#ffffe8\" class=\"wp-block-list has-background\">\n<li>This article was edited by a Neuroscience News editor.<\/li>\n<li>Journal paper reviewed in full.<\/li>\n<li>Additional context added by our staff.<\/li>\n<\/ul>\n<p>About this AI and psychology research news<\/p>\n<p class=\"has-background\" style=\"background-color:#ffffe8\"><strong>Author: <\/strong><a href=\"http:\/\/neurosciencenews.com\/cdn-cgi\/l\/email-protection#6516070411001625070a10170b00080a10110d4b04064b100e\" type=\"mailto\" id=\"mailto:sbates@bournemouth.ac.uk\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Steve Bates<\/a><br \/><strong>Source: <\/strong><a href=\"https:\/\/bournemouth.ac.uk\" type=\"link\" id=\"https:\/\/bournemouth.ac.uk\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Bournemouth University<\/a><br \/><strong>Contact: <\/strong>Steve Bates \u2013 Bournemouth University<br \/><strong>Image:<\/strong> The image is credited to Neuroscience News<\/p>\n<p class=\"has-background\" style=\"background-color:#ffffe8\"><strong>Original Research: <\/strong>Open access.<br \/>\u201c<a href=\"https:\/\/dx.doi.org\/10.1007\/s00146-026-02858-5\" type=\"link\" id=\"http:\/\/dx.doi.org\/10.1007\/s00146-026-02858-5\" target=\"_blank\" rel=\"noreferrer noopener nofollow\">Who lets AI take over? Cross-national variation in willingness to delegate socially important roles to artificial intelligence<\/a>\u201d by Ala Yankouskaya,\u00a0Mohamed Basel Almourad,\u00a0Magnus Liebherr,\u00a0Fahad Beyahi,\u00a0Guandong Xu\u00a0&amp;\u00a0Raian Ali. AI &amp; Society<br \/><strong>DOI:10.1007\/s00146-026-02858-5<\/strong><\/p>\n<p><strong>Abstract<\/strong><\/p>\n<p><strong>Who lets AI take over? 
Cross-national variation in willingness to delegate socially important roles to artificial intelligence<\/strong><\/p>\n<p>Delegating socially significant roles to artificial intelligence (AI) is an emerging reality, yet little is known about how publics evaluate this transfer of responsibility across contexts and countries.<\/p>\n<p>This study applied a structural model to a large cross-national dataset (30,994 individuals in 35 countries) to test how cognitive appraisals, affective dispositions, and contextual factors jointly shape willingness to delegate socially important roles of companionship, mental health advisor, doctor and teacher to children to AI.<\/p>\n<p>The results revealed a robust hierarchy of delegation preferences, with companionship most frequently entrusted to AI, followed by mental-health advisor, teacher, and doctor. Cognitive appraisals emerged as the strongest predictors: trust in online information was consistently the most powerful driver across all roles, while optimism and life satisfaction made smaller but reliable contributions.<\/p>\n<p>Affective dispositions played narrower, domain-specific roles, with anxiety shaping delegation in teaching and mental health, and loneliness linked only weakly to companionship. Women were less willing than men to delegate across all roles, with the gender gap largest in medicine and education, and strikingly invariant across cognitive and affective predictors.<\/p>\n<p>Beyond these, national baselines diverged by nearly 30 percentage points even after adjusting for these predictors, demonstrating the independent influence of country context.<\/p>\n<p>Our findings show that willingness to delegate socially important roles to AI follows a robust hierarchy and reflects the combined influence of cognitive appraisals, affective dispositions, and contextual factors. 
A key implication is that delegation roles to AI must be understood as both a personal and a societal orientation, requiring attention to the interplay between these layers.<\/p>\n","protected":false},"excerpt":{"rendered":"Summary: As traditional healthcare systems struggle with long waiting lists and rising costs, a massive global survey reveals&hellip;\n","protected":false},"author":2,"featured_media":383031,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[275],"tags":[291,289,175670,297,7042,18,135,475,474,48949,19,17,167,1281,4226],"class_list":{"0":"post-383030","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-healthcare","8":"tag-ai","9":"tag-artificial-intelligence","10":"tag-bournemouth-university","11":"tag-chatgpt","12":"tag-cognitive-psychology","13":"tag-eire","14":"tag-health","15":"tag-health-care","16":"tag-healthcare","17":"tag-hippocampus","18":"tag-ie","19":"tag-ireland","20":"tag-mental-health","21":"tag-neuroscience","22":"tag-psychology"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@ie\/116221630563870192","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/posts\/383030","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/comments?post=383030"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/posts\/383030\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/media\/383031"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/ie\/w
p-json\/wp\/v2\/media?parent=383030"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/categories?post=383030"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/ie\/wp-json\/wp\/v2\/tags?post=383030"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}