{"id":277815,"date":"2025-07-20T16:41:09","date_gmt":"2025-07-20T16:41:09","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/277815\/"},"modified":"2025-07-20T16:41:09","modified_gmt":"2025-07-20T16:41:09","slug":"uk-health-service-ai-tool-generated-a-set-of-false-diagnoses-for-a-patient","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/277815\/","title":{"rendered":"UK health service AI tool generated a set of false diagnoses for a patient"},"content":{"rendered":"<p>AI use in healthcare has the potential to save time, money, and lives. But when technology that is known to <a href=\"https:\/\/fortune.com\/2025\/06\/29\/ai-lies-schemes-threats-stress-testing-claude-openai-chatgpt\/\" target=\"_self\" aria-label=\"Go to https:\/\/fortune.com\/2025\/06\/29\/ai-lies-schemes-threats-stress-testing-claude-openai-chatgpt\/\" class=\"sc-19cc8fd2-0 iHosVH\" rel=\"noopener\">occasionally lie<\/a> is introduced into patient care, it also raises serious risks.<\/p>\n<p>One London-based patient recently experienced just how serious those risks can be after receiving a letter inviting him to a diabetic eye screening\u2014a standard <a href=\"https:\/\/www.nhs.uk\/tests-and-treatments\/diabetic-eye-screening\/\" target=\"_blank\" rel=\"noopener\" aria-label=\"Go to https:\/\/www.nhs.uk\/tests-and-treatments\/diabetic-eye-screening\/\" class=\"sc-19cc8fd2-0 iHosVH\">annual check-up <\/a>for people with diabetes in the UK. The problem: He had never been diagnosed with diabetes or shown any signs of the condition.<\/p>\n<p>After opening the appointment letter late one evening, the patient, a healthy man in his mid-20\u2019s, told Fortune he had briefly worried that he had been unknowingly diagnosed with the condition, before concluding the letter must just be an admin error. 
The next day, at a pre-scheduled routine blood test, a nurse questioned the diagnosis and, when the patient confirmed he wasn\u2019t diabetic, the pair reviewed his medical history.<\/p>\n<p>\u201cHe showed me the notes on the system, and they were AI-generated summaries. It was at that point I realized something weird was going on,\u201d the patient, who asked for anonymity to discuss private health information, told Fortune.<\/p>\n<p>After requesting and reviewing his medical records in full, the patient noticed the entry that had introduced the diabetes diagnosis was listed as a summary that had been \u201cgenerated by Annie AI.\u201d The record appeared around the same time he had attended the hospital for a severe case of tonsillitis. However, the record in question made no mention of tonsillitis. Instead, it said he had presented with chest pain and shortness of breath, attributed to a \u201clikely angina due to coronary artery disease.\u201d In reality, he had none of those symptoms.<\/p>\n<p>The records, which were reviewed by Fortune, also noted the patient had been diagnosed with Type 2 diabetes late last year and was currently on a series of medications. It also included dosage and administration details for the drugs. However, none of these details were accurate, according to the patient and several other medical records reviewed by Fortune.<\/p>\n<p>\u2018Health Hospital\u2019 in \u2018Health City\u2019<\/p>\n<p>Even stranger, the record attributed the address of the medical document it appeared to be processing to a fictitious \u201cHealth Hospital\u201d located on \u201c456 Care Road\u201d in \u201cHealth City.\u201d The address also included an invented postcode.<\/p>\n<p>A representative for the NHS, Dr. 
Matthew Noble, told Fortune the GP practice responsible for the oversight employs a \u201climited use of supervised AI\u201d and the error was a \u201cone-off case of human error.\u201d He said that a medical summariser had initially spotted the mistake in the patient\u2019s record but had been distracted and \u201cinadvertently saved the original version rather than the updated version [they] had been working on.\u201d<\/p>\n<p>However, the fictitious AI-generated record appears to have had downstream consequences, with the patient\u2019s invitation to attend a diabetic eye screening appointment presumably based on the erroneous summary.\u00a0<\/p>\n<p>While most AI tools used in healthcare are subject to strict human oversight, another NHS worker told Fortune that the leap from the original symptoms\u2014tonsillitis\u2014to what was returned\u2014likely angina due to coronary artery disease\u2014raised alarm bells.<\/p>\n<p>\u201cThese human error mistakes are fairly inevitable if you have an AI system producing completely inaccurate summaries,\u201d the NHS employee said. \u201cMany elderly or less literate patients may not even know there was an issue.\u201d<\/p>\n<p>The company behind the technology, Anima Health, did not respond to Fortune\u2019s questions about the issue. However, Dr. Noble said, \u201cAnima is an NHS-approved document management system that assists practice staff in processing incoming documents and actioning any necessary tasks.\u201d<\/p>\n<p>\u201cNo documents are ever processed by AI, Anima only suggests codes and a summary to a human reviewer in order to improve safety and efficiency. Each and every document requires review by a human before being actioned and filed,\u201d he added.<\/p>\n<p>AI\u2019s uneasy rollout in the health sector<\/p>\n<p>The incident is somewhat emblematic of the growing pains around AI\u2019s rollout in healthcare. 
As hospitals and GP practices race to adopt automation tools that promise to ease workloads and reduce costs, they\u2019re also grappling with the challenge of integrating still-maturing technology into high-stakes environments.\u00a0<\/p>\n<p>The pressure to innovate and potentially save lives with the technology is high, but so is the need for rigorous oversight, especially as tools once seen as \u201cassistive\u201d begin influencing real patient care.<\/p>\n<p>The company behind the tech, Anima Health, promises healthcare professionals can \u201csave hours per day through automation.\u201d The company offers services including automatically generating \u201cthe patient communications, clinical notes, admin requests, and paperwork that doctors deal with daily.\u201d<\/p>\n<p>Anima\u2019s AI tool, Annie, is registered with the UK\u2019s Medicines and Healthcare products Regulatory Agency (MHRA) as a Class I medical device. This means it is regarded as low-risk, placing it in the same category as <a href=\"https:\/\/innovation.nhs.uk\/innovation-guides\/regulation\/general-medical-device-and-active-implantable-medical-devices\/\" target=\"_blank\" rel=\"noopener\" aria-label=\"Go to https:\/\/innovation.nhs.uk\/innovation-guides\/regulation\/general-medical-device-and-active-implantable-medical-devices\/\" class=\"sc-19cc8fd2-0 iHosVH\">examination lights or bandages<\/a>, and intended to assist clinicians rather than automate medical decisions. <\/p>\n<p>AI tools in this category require outputs to be reviewed by a clinician before action is taken or items are entered into the patient record. However, in the case of the misdiagnosed patient, the practice appeared to have failed to address the factual errors before they were added to the patient\u2019s records.<\/p>\n<p>The incident comes amid increased scrutiny within the UK\u2019s health service of the use and categorization of AI technology. 
Last month, bosses for the health service warned GPs and hospitals that some current uses of AI software could breach data protection rules and put patients at risk.<\/p>\n<p>In an email first reported<a href=\"https:\/\/news.sky.com\/story\/doctors-are-using-unapproved-ai-software-to-record-patient-meetings-investigation-reveals-13387765\" target=\"_blank\" rel=\"noopener\" aria-label=\"Go to https:\/\/news.sky.com\/story\/doctors-are-using-unapproved-ai-software-to-record-patient-meetings-investigation-reveals-13387765\" class=\"sc-19cc8fd2-0 iHosVH\"> by Sky News<\/a> and confirmed by Fortune, NHS England warned that unapproved AI software that breached minimum standards could put patients at risk of harm. The letter specifically addressed the use of Ambient Voice Technology, or \u201cAVT,\u201d by some doctors.<\/p>\n<p>The main issue with AI transcribing or summarizing information is the manipulation of the original text, Brendan Delaney, professor of Medical Informatics and Decision Making at Imperial College London and a part-time General Practitioner, told Fortune.<\/p>\n<p>\u201cRather than just simply passively recording, it gives it a medical device purpose,\u201d Delaney said. The recent guidance issued by the NHS, however, has meant that some companies and practices are playing regulatory catch-up.\u00a0<\/p>\n<p>\u201cMost of the devices that were in common use now have a Class One [categorization],\u201d Delaney said. \u201cI know at least one, but probably many others are now scrambling to try and start their Class 2a [registration], because they ought to have that.\u201d<\/p>\n<p>Whether a device should be defined as a Class 2a medical device essentially depends on its intended purpose and the level of clinical risk. Under U.K. 
medical device rules, if the tool\u2019s output is relied upon to inform care decisions, it could require reclassification as a Class 2a medical device, a category subject to stricter regulatory controls.<\/p>\n<p>Anima Health, along with other UK-based health tech companies, is <a href=\"https:\/\/www.animahealth.com\/blog\/anima-full-compliance-with-nhs-england-avt-requirements\" target=\"_blank\" rel=\"noopener\" aria-label=\"Go to https:\/\/www.animahealth.com\/blog\/anima-full-compliance-with-nhs-england-avt-requirements\" class=\"sc-19cc8fd2-0 iHosVH\">currently pursuing Class 2a registration.<\/a><\/p>\n<p>The U.K.\u2019s AI for health push<\/p>\n<p>The U.K. government is embracing the possibilities of AI in healthcare, hoping it can boost the country\u2019s strained national health system.<\/p>\n<p>In a recent<a href=\"https:\/\/www.gov.uk\/government\/publications\/10-year-health-plan-for-england-fit-for-the-future\" target=\"_blank\" rel=\"noopener\" aria-label=\"Go to https:\/\/www.gov.uk\/government\/publications\/10-year-health-plan-for-england-fit-for-the-future\" class=\"sc-19cc8fd2-0 iHosVH\"> \u201c10-Year Health Plan,\u201d<\/a> the British government said it aims to make the NHS the most AI-enabled care system in the world, using the tech to reduce admin burden, support preventive care, and empower patients through technology.<\/p>\n<p>But rolling out this technology in a way that meets current rules within the organization is complex. 
Even the U.K.\u2019s health minister appeared to suggest earlier this year that some doctors may be pushing the limits when it comes to integrating AI technology in patient care.<\/p>\n<p>\u201cI\u2019ve heard anecdotally down the pub, genuinely down the pub, that some clinicians are getting ahead of the game and are already using ambient AI to kind of record notes and things, even where their practice or their trust haven\u2019t yet caught up with them,\u201d Wes Streeting said, in comments reported by Sky News.<\/p>\n<p>\u201cNow, lots of issues there\u2014not encouraging it\u2014but it does tell me that contrary to this, \u2018Oh, people don\u2019t want to change, staff are very happy and they are really resistant to change\u2019, it\u2019s the opposite. People are crying out for this stuff,\u201d he added.<\/p>\n<p>AI tech certainly has huge potential to dramatically improve speed, accuracy, and access to care, especially in areas like diagnostics, medical recordkeeping, and reaching patients in under-resourced or remote settings. However, walking the line between the tech\u2019s potential and risks is difficult in sectors like healthcare, which handle sensitive data and where errors can cause significant harm.<\/p>\n<p>Reflecting on his experience, the patient told Fortune: \u201cIn general, I think we should be using AI tools to support the NHS. It has massive potential to save money and time. However, LLMs are still really experimental, so they should be used with stringent oversight. I would hate this to be used as an excuse to not pursue innovation but instead should be used to highlight where caution and oversight are needed.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"AI use in healthcare has the potential to save time, money, and lives. 
But when technology that is&hellip;\n","protected":false},"author":2,"featured_media":277816,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3,4],"tags":[748,393,4884,105,211,1144,712,53,16,15,1764],"class_list":{"0":"post-277815","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-uk","8":"category-united-kingdom","9":"tag-britain","10":"tag-england","11":"tag-great-britain","12":"tag-health","13":"tag-nhs","14":"tag-northern-ireland","15":"tag-scotland","16":"tag-technology","17":"tag-uk","18":"tag-united-kingdom","19":"tag-wales"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/114886559925522849","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/277815","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=277815"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/277815\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/277816"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=277815"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=277815"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=277815"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}