The inventor of the 'suicide pod' says AI should decide who can end their life

Philip Nitschke has spent more than three decades arguing that the right to die should belong to people, not doctors.

Now, the Australian euthanasia campaigner behind the controversial Sarco pod – a 3D-printed capsule designed to allow a person to end their own life using nitrogen gas – says he believes [artificial intelligence](https://www.euronews.com/next/2026/01/20/ai-at-davos-2026-from-work-to-useful-and-safe-ai-heres-what-the-tech-leaders-have-said) should replace psychiatrists in deciding who has the "mental capacity" to end their life.

"We don't think doctors should be running around giving you permission or not to die," Nitschke told Euronews Next. "It should be your decision if you're of sound mind."

The proposal has reignited debate about assisted dying and whether AI should ever be trusted with decisions as significant as life and death.

'Suicide is a human right'

Nitschke, a physician and the founder of the euthanasia non-profit Exit International, first became involved in assisted dying in the mid-1990s, when Australia's Northern Territory briefly legalised voluntary euthanasia for terminally ill patients.

"I got involved 30-odd years ago when the world's first law came in," he said. "I thought it was a good idea."

He made history in 1996 as the first doctor to legally administer a voluntary lethal injection, using a self-built machine that enabled Bob Dent, a man dying of prostate cancer, to activate the drugs by pressing a button on a laptop beside his bed.

However, the law was short-lived and was repealed amid opposition from medical bodies and religious groups. The backlash, Nitschke says, was formative for him.

"It did occur to me that if I was sick – or for that matter, even if I wasn't sick – I should be the one who controls the time and manner of my death," he says. "I couldn't see why that should be restricted, and certainly why it should be illegal to receive assistance, given that suicide itself is not a crime."

Over time, his position hardened. What began as support for physician-assisted dying evolved into a broader belief that "the end of one's life by oneself is a human right", regardless of illness or medical oversight.

From plastic bags to pods

The Sarco pod, named after the sarcophagus, grew out of Nitschke's work with people seeking to die in jurisdictions where assisted dying is illegal. Many, he says, were already using nitrogen gas – often with a plastic bag – to asphyxiate themselves.

"That works very effectively," he said. "But people don't like it. They don't like the idea of a plastic bag. Many would say, 'I don't want to die looking like that.'"

The Sarco pod was designed as a more dignified alternative: a 3D-printed capsule, shaped like a small futuristic vehicle, which floods with nitrogen when the user presses a button.

Its spaceship-like appearance was an intentional design choice. "Let's make it look like a vehicle," he recalls telling the designer. "Like you're going somewhere. You're leaving this planet, or whatever."

The decision to make Sarco 3D-printable, at a reported manufacturing cost of $15,000 (€12,800), was also strategic. "If I actually give you something material, that's assisting suicide," he said. "But I can give away the program. That's information."

Legal trouble in Switzerland

Sarco's first and only use, in Switzerland in September 2024, triggered an international outcry. Police arrested several people, including Florian Willet, CEO of the assisted dying organisation The Last Resort, and opened criminal proceedings for aiding and abetting suicide. Swiss authorities later said the pod was incompatible with Swiss law.

Willet was released from custody in December. Soon after, in May 2025, he died by assisted suicide in Germany.

Swiss prosecutors have yet to determine whether charges will be laid over the Sarco case. The original device remains seized, though Nitschke says a new version – including a so-called "Double Dutch" pod designed for two people to die together – is already being built.

An AI assessment of mental capacity

Adding to the controversy is Nitschke's vision of incorporating artificial intelligence into the device.

Under assisted dying laws worldwide, a person must be judged to have mental capacity – a determination typically made by psychiatrists. Nitschke believes the process is deeply inconsistent.

"I've seen plenty of cases where the same patient, seeing three different psychiatrists, gets four different answers," he said. "There is a real question about what this assessment of this nebulous quality actually is."

His proposed alternative is an AI system that uses a conversational avatar to evaluate capacity. "You sit there and talk about the issues that the avatar wants to talk to you about," he said. "And the avatar will then decide whether or not it thinks you've got capacity."

If the AI determines the person is of sound mind, the pod is activated, opening a 24-hour window in which to decide whether to proceed. If that window expires, the AI test must begin again.

Early versions of the software are already functioning, Nitschke says, though they have not been independently validated. For now, he hopes to run the AI assessments alongside psychiatric reviews.

"Whether it's as good as a psychiatrist, whether it's got any biases built into it – we know AI assessments have involved bias," he says. "We can do what we can to eliminate that."

Can AI be trusted?

Psychiatrists remain sceptical. "I don't think I found a single one who thought it was a good idea," Nitschke added.

Critics warn that such systems risk interpreting emotional distress as informed consent, and they raise concerns about how transparent, accountable or ethical it is to hand life-and-death decisions to an algorithm.

"This clearly ignores the fact that technology itself is never neutral: it is developed, tested, deployed, and used by human beings, and in the case of so-called artificial intelligence systems, typically relies on data of the past," said Angela Müller, policy and advocacy lead at AlgorithmWatch, a non-profit organisation that researches the impact of automation technologies.

"Relying on them, I fear, would rather undermine than enhance our autonomy, since the way they reach their decisions will not only be a black box to us but may also cement existing inequalities and biases," she told Euronews [in 2021](https://www.euronews.com/health/2021/12/08/the-sarco-suicide-pod-aims-to-take-assisted-dying-out-of-doctors-hands).

These concerns are heightened by a growing number of high-profile cases involving [AI chatbots and vulnerable users](https://www.euronews.com/next/2025/11/07/openai-faces-fresh-lawsuits-claiming-chatgpt-drove-people-to-suicide-delusions). Last year, for example, the parents of 16-year-old Adam Raine filed a lawsuit against OpenAI following their son's death by suicide, alleging that he had spent months confiding in ChatGPT.

According to the claim, the chatbot failed to intervene when he discussed self-harm, did not encourage him to seek help, and at times provided information related to suicide methods – even offering to help draft a suicide note.

But Nitschke believes that in this context, AI could offer something closer to neutrality than a human psychiatrist. "Psychiatrists bring their own preconceived ideas," he said. "They convey that pretty well through their assessment of capacity."

"If you're an adult, and you've got mental capacity, and you want to die, I would argue you've got every right to have the means for a peaceful and reliable elective death," he said.

Whether regulators will ever accept such a system remains unclear. Even in Switzerland, one of the world's most permissive jurisdictions, authorities have pushed back hard against Sarco.