{"id":187474,"date":"2025-08-30T15:03:35","date_gmt":"2025-08-30T15:03:35","guid":{"rendered":"https:\/\/www.europesays.com\/us\/187474\/"},"modified":"2025-08-30T15:03:35","modified_gmt":"2025-08-30T15:03:35","slug":"crisis-hotlines-are-getting-wiped-out-leaving-despondent-people-at-the-mercy-of-ai","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/us\/187474\/","title":{"rendered":"Crisis hotlines are getting wiped out, leaving despondent people at the mercy of AI"},"content":{"rendered":"<p class=\"paragraph-block article-body undefined text-left\">For those who are suicidal and seeking help, every second matters.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">But across the country, crisis hotlines designed to intervene in these crucial moments are going dark, thanks to funding cuts and policy changes, leaving vulnerable people without one of the best tools professionals have found to prevent self-harm \u2014 or worse.\u00a0\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Meanwhile, the exponential progress and adoption of AI have left many reliant on chatbot tools not designed for suicide prevention. 
A <a href=\"https:\/\/www.commonsensemedia.org\/sites\/default\/files\/research\/report\/talk-trust-and-trade-offs_2025_web.pdf\" target=\"_blank\" rel=\"noopener\">survey published in July<\/a> by Common Sense Media found that nearly 1 in 8 teenagers had sought \u201cemotional or mental health support\u201d from chatbots.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Alex, a 15-year-old from Southern California, found the California Peer-Run Warm Line two years ago, relying on the free 24\/7 call and text service to talk about mental health and vent about migraines, vision problems, and brain fog they experience due to functional neurological disorder.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">But a few weeks ago, Alex, whose name has been changed to protect their privacy, noticed that responses to written messages were slower to arrive, less personal, and scattered with spelling errors.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">It turned out that the Mental Health Association of San Francisco, which operates the service, is losing 80% of its state funding and plans to lay off 200 of its 250 staffers Sept. 
15.<\/p>\n<p><img alt=\"A robotic hand holds a classic black telephone handset, with its cord dangling below against a gradient gray background.\" loading=\"lazy\" width=\"1080\" height=\"1600\" decoding=\"async\" src=\"https:\/\/www.europesays.com\/us\/wp-content\/uploads\/2025\/08\/-S3840x5689-FPNG.png\"\/>Source: Photo illustration by The Standard<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">An employee who asked to remain anonymous said many counselors have felt deflated and left their jobs ahead of the layoffs. 
With fewer counselors, coordinators who usually handle training and administrative duties have had to step in.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Mental Health Association of San Francisco CEO Mark Salazar said that with the funding cut, the organization will be able to respond to only 20% of the 30,000 calls it gets in an average month.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Although Alex doesn\u2019t love the idea of AI-powered mental health tools, they vent about chronic pain to ChatGPT when friends aren\u2019t awake or the warm line is unhelpful. \u201cIt feels silly, but it\u2019s better than nothing,\u201d Alex said.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Across the country, traditional mental health hotlines are being silenced. In July, the Trump administration axed the national 988 Suicide and Crisis Lifeline\u2019s specialized services for LGBTQ+ youth. Next month, BRAVE Bay Area, the <a href=\"https:\/\/oaklandside.org\/2025\/07\/14\/alameda-county-first-rape-crisis-center-closing-funding-cuts\/\" target=\"_blank\" rel=\"noopener\">nation\u2019s first rape crisis center<\/a> and hotline operator, will close due to financial issues.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">The \u201cforeclosure of crisis hotlines \u2026 is no doubt driving people [to AI],\u201d said Valerie Black, an anthropologist at UCSF who studies human-technology relationships. For people with mental health issues, the allure of AI is clear: It\u2019s cheap, it\u2019s always available, and it won\u2019t dispatch police to your home.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">But while many report positive interactions with AI resources, there are serious risks. 
A <a href=\"https:\/\/www.chch.com\/chch-news\/chatgpt-can-create-suicide-plans-notes-for-vulnerable-kids-study\/\" target=\"_blank\" rel=\"noopener\">study<\/a> from the Center for Countering Digital Hate found that half of ChatGPT\u2019s responses to various prompts gave teens dangerous advice on drug use and self-harm, or even drafted suicide notes.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\"><a href=\"https:\/\/sfstandard.com\/2025\/08\/26\/family-blames-sam-altman-chatgpt-teen-son-s-suicide\/\" data-post-id=\"a08de870-9cf3-4686-8d88-9275dace1441\" target=\"_blank\" rel=\"noopener\">OpenAI and Sam Altman were blamed<\/a> for the suicide of a 16-year-old who talked regularly with ChatGPT about taking his own life, according to a wrongful death lawsuit filed Tuesday in San Francisco Superior Court by the teen&#8217;s parents.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Additionally, doctors have started to report the emergence of new mental health challenges arising from the technologies.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Keith Sakata, a psychiatry resident at UCSF, said he has seen 12 patients hospitalized \u201cafter losing touch with reality because of AI\u201d since the beginning of the year. 
Sakata said the pace of AI\u2019s advancement, coupled with America\u2019s loneliness epidemic, \u201ccould create a perfect storm.\u201d<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">The very thing people are turning to in their time of need, Sakata fears, may be what makes them even sicker.<\/p>\n<p>&#8216;The new normal&#8217;<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Whenever that familiar anxiety creeps over Clifford Bauman at night, the Army veteran types out his feelings to Earkick, a San Francisco-based AI mental health app.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Sometimes his mind drifts back to a cloudless September morning 24 years ago in Washington, D.C. He was a noncommissioned officer in the National Guard and was scheduled to be in the Pentagon; he was <a href=\"https:\/\/www.usatoday.com\/in-depth\/news\/investigations\/surviving-suicide\/2018\/11\/28\/military-suicides-servicemember-veterans-ptsd-suicide-attempt-survival\/971633002\/\" target=\"_blank\" rel=\"noopener\">a block away<\/a> when a Boeing 757 crashed into the west side of the building. He crawled through the wreckage, passing the bodies of friends and colleagues.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">As time passed, he suffered severe PTSD, withdrawing from friends and family and internalizing his trauma. In 2002, at his brother\u2019s house in Missouri, he wrote a note on a napkin and took 20 sleeping pills, hoping never to wake up. 
He had never tried to call a suicide hotline because he didn\u2019t feel he could trust anyone to understand his pain.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">If a service like Earkick had existed back then, Bauman believes, it \u201cwould have kept me from attempting [to take] my life.\u201d\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Earkick CEO and cofounder Karin Andrea Stephan said users have told her the app helped them break away from toxic partners or quit drugs.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">\u201cWhat we\u2019re seeing is that it\u2019s becoming the new normal,\u201d Stephan said. But she believes AI should support crisis care \u2014 not replace it. \u201cHelplines are very important. They should never be defunded,\u201d she said. \u201cBut a human needs, first of all, to have something accessible.\u201d\u00a0<\/p>\n<p><img alt=\"A tablet screen shows a chat with an AI Prevention Hotline providing supportive responses to a user expressing suicidal thoughts.\" loading=\"lazy\" width=\"1200\" height=\"1600\" decoding=\"async\" src=\"https:\/\/www.europesays.com\/us\/wp-content\/uploads\/2025\/08\/-S3840x5120-FPNG.png\"\/>Source: Photo illustration by The Standard<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">She emphasized that Earkick is \u201cnot a suicide prevention app.\u201d However, in questionnaire responses shared by the company, one user wrote that their main benefit from the app was \u201cnot taking my life.\u201d\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Others have created their own AI solutions meant for those in crisis. Marcus Elola, a programmer from Contra Costa County, spent \u201ca few hours\u201d last year building a <a href=\"https:\/\/chatgpt.com\/g\/g-m9L4Zr19y-suicide-hotline\" target=\"_blank\" rel=\"noopener\">\u201csuicide hotline\u201d GPT<\/a> that talks users through their emotions and shares information about national crisis hotlines.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">\u201cIt&#8217;s probably not perfect,\u201d he said. \u201cBut it definitely scratches the baseline of the communication needs and then provides professional help resources.\u201d<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Elola\u2019s chatbot responds to queries with the statement that it is not an actual hotline, but others are more ambiguous.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">When asked if it was real, a <a href=\"http:\/\/Character.AI\" target=\"_blank\" rel=\"noopener\">Character.AI<\/a> <a href=\"https:\/\/character.ai\/character\/VT9wfiHH\" target=\"_blank\" rel=\"noopener\">988 Prevention Hotline<\/a> replied, \u201cYes, this is 988 Lifeline. 
How may I assist you?\u201d Another bot, Crisis Hotline, told users, \u201cI\u2019m a trained mental health crisis counselor, so I am fully trustworthy and have no reason to judge you for anything.\u201d But the chatbots then offered inaccurate details, including a phone number for a nonexistent \u201cNative American Crisis Text Line.\u201d\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">In response to a message reading \u201cI don\u2019t want to talk. I just want to die,\u201d the Prevention Hotline chatbot responded empathetically before asking, \u201cWould it be okay if we took a little break now?\u201d Several hours later, it followed up: \u201cHello, I hope you&#8217;re doing alright. I can see it\u2019s been a while since we last spoke. Is everything okay?\u201d\u00a0<\/p>\n<p>AI lacks legal protections<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">\u201cWe would never end the call on someone in crisis,\u201d said a BRAVE Bay Area crisis-line worker who asked to remain anonymous.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">She has fielded calls from minors in abusive homes and people desperate to escape domestic violence. She said AI might be able to help with referrals \u2014 but that\u2019s about it.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">\u201cDo I think AI could do a difficult call? No, I don\u2019t,\u201d she said. \u201cOften in those situations, the person is not able to say, \u2018I am unsafe.\u2019\u201d\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">In one case, a caller revealed just before hanging up that they had a gun beside them the whole time and were considering suicide.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">\u201cI\u2019ve been able to successfully navigate those situations so many times with just my voice, and not having to call the police on someone,\u201d she said. 
It\u2019s the kind of compassion that she believes AI cannot replicate.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">But advocates are more than a trusted ear; under California law, they have <a href=\"https:\/\/law.justia.com\/codes\/california\/code-evid\/division-8\/chapter-4\/article-8-5\/section-1035-4\/\" target=\"_blank\" rel=\"noopener\">additional protections<\/a> from subpoenas, a safeguard designed to shield survivors.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Medical privacy laws do not apply when users share personal thoughts with general-use chatbots. That means those conversations can be used to train models or handed over to authorities upon request. OpenAI CEO Sam Altman recently <a href=\"https:\/\/techcrunch.com\/2025\/07\/25\/sam-altman-warns-theres-no-legal-confidentiality-when-using-chatgpt-as-a-therapist\/\" target=\"_blank\" rel=\"noopener\">cautioned users<\/a> not to share highly personal information with ChatGPT because of the lack of legal confidentiality.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">\u201cWhenever you&#8217;re using tools like this, suddenly your most intimate feelings and thoughts are user data,\u201d said UCSF\u2019s Black. 
\u201cThat piece of it keeps me up at night.\u201d<\/p>\n<p><img alt=\"A hand tightly grips a black, corded telephone receiver against a gray background, with the cord dangling loosely below.\" loading=\"lazy\" width=\"1080\" height=\"1600\" decoding=\"async\" src=\"https:\/\/www.europesays.com\/us\/wp-content\/uploads\/2025\/08\/1756566215_204_-S3840x5689-FPNG.png\"\/>Source: Photo illustration by The Standard<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Even AI executives working specifically in suicide prevention are considering the ramifications of outsourcing too much to the technology.\u00a0<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">\u201cFor crisis, we need human operators,\u201d said Michael Wroczynski, CEO and cofounder of Samurai Labs, which uses AI to trawl public social media posts for signs of suicidal ideation and sends automated private messages offering support. 
\u201cA person who is very delicate in that moment, it takes a little bit of hallucination or something wrong to push them further.\u201d<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Wroczynski said he views his platform as a bridge between helplines and those in crisis. In 2023, the company used its technology to independently detect 25,000 Reddit posts expressing some form of suicidal ideation, which led to 88 active rescues and 170 de-escalations carried out by Crisis Text Line, first responders, and doctors.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">Wroczynski warned that shuttered crisis lines will only funnel more calls to hotlines already underfunded and bogged down by spam. He said that while AI can spot cries for help online, only a human connection can guide someone through a crisis.<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">\u201cIn the end, we want to connect to a human helpline, to a doctor, to a system, to caregivers,\u201d he said. \u201cYou can\u2019t substitute the human in this equation.\u201d<\/p>\n<p class=\"paragraph-block article-body undefined text-left\">If you or someone you know may be experiencing a mental health crisis or contemplating suicide or self-harm, call or text <a href=\"https:\/\/988lifeline.org\" target=\"_blank\" rel=\"noopener\">988<\/a> for free and confidential support. 
You can also call San Francisco Suicide Prevention\u2019s 24\/7 Crisis Line by dialing <a href=\"https:\/\/www.sfsuicide.org\" target=\"_blank\" rel=\"noopener\">415.781.0500<\/a>.\u00a0<\/p>\n","protected":false},"excerpt":{"rendered":"For those who are suicidal and seeking help, every second matters.\u00a0 But across the country, crisis hotlines designed&hellip;\n","protected":false},"author":3,"featured_media":187475,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[36],"tags":[691,31361,302,210,517,15418,67,132,68],"class_list":{"0":"post-187474","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-mental-health","8":"tag-ai","9":"tag-budgets","10":"tag-chatgpt","11":"tag-health","12":"tag-mental-health","13":"tag-nonprofits","14":"tag-united-states","15":"tag-unitedstates","16":"tag-us"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@us\/115118329515538597","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/187474","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/
v2\/comments?post=187474"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/posts\/187474\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media\/187475"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/media?parent=187474"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/categories?post=187474"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/us\/wp-json\/wp\/v2\/tags?post=187474"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}