{"id":493795,"date":"2025-10-12T14:39:32","date_gmt":"2025-10-12T14:39:32","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/493795\/"},"modified":"2025-10-12T14:39:32","modified_gmt":"2025-10-12T14:39:32","slug":"despite-risks-hong-kong-teenagers-turn-ai-chatbots-for-counselling","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/493795\/","title":{"rendered":"Despite risks, Hong Kong teenagers turn to AI chatbots for counselling"},"content":{"rendered":"<p>When Hong Kong teen Jessica started secondary school last year, she became a victim of bullying. Instead of talking to a friend or family member, she turned to an artificial intelligence (AI) chatbot from Xingye, a Chinese role-playing and companion app.<\/p>\n<p><img fetchpriority=\"high\" decoding=\"async\" width=\"1700\" height=\"1133\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/DSC07049-Copy.jpg\" alt=\"The website for the artificial intelligence (AI) app Xingye. \" class=\"wp-image-537231\"  \/>The website for the artificial intelligence (AI) app Xingye. Photo: Hillary Leung\/HKFP.<\/p>\n<p>Jessica, who asked to use a pseudonym to protect her privacy, found it helpful and comforting to talk with the chatbot.<\/p>\n<tr>\n<td>\ud83d\udca1HKFP grants anonymity to known sources under tightly controlled, limited circumstances defined in our <a href=\"https:\/\/hongkongfp.com\/hkfp-code-ethics\/\" target=\"_blank\" rel=\"noopener\">Ethics Code<\/a>. Among the reasons senior editors may approve the use of anonymity for sources are threats to safety, job security or fears of reprisals.<\/td>\n<\/tr>\n<p>The chatbot told Jessica to relax and not to dwell further on the matter, even suggesting that she seek help elsewhere. 
\u201cWe talked for a really long time that day, for many hours,\u201d the 13-year-old told HKFP in an interview conducted in Cantonese and Mandarin.<\/p>\n<p>Another Hong Kong teenager, Sarah, also not her real name, began using Character.AI, another role-playing and companion platform, around three years ago when she was around 13.<\/p>\n<p>At the time, she was dealing with mental health issues and a friend who had been using the American app as a \u201cpersonal therapist\u201d recommended it to her.<\/p>\n<p>\u201cI\u2019m not personally an open person, so I wouldn\u2019t cry in front of anyone or seek any help,\u201d said Sarah, now 16.<\/p>\n<p>When she felt down and wanted words of comfort, she would talk with the chatbot about what she was going through and share her emotions. <\/p>\n<p>Apart from providing comforting words, the chatbot sometimes also expressed a wish to physically comfort Sarah, like giving her a hug. \u201cAnd then I\u2019d be comforted, technically,\u201d she said.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1700\" height=\"1133\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/54712316657_5bd232cda1_k-Copy.jpg\" alt=\"Hong Kong teenager Sarah began using Character.AI, a role-playing app, around three years ago when she was around 13. Photo: Kyle Lam\/HKFP.\" class=\"wp-image-537000\"  \/>Hong Kong teenager Sarah began using Character.AI, a role-playing app, around three years ago when she was around 13. Photo: Kyle Lam\/HKFP.<\/p>\n<p>A growing number of people \u2013 including teenagers \u2013 have <a href=\"https:\/\/time.com\/7291048\/ai-chatbot-therapy-kids\/\" target=\"_blank\" rel=\"noopener\">turned <\/a>to chatbots in companion apps like Character.AI and Xingye for counselling, instead of professional human therapists. 
<\/p>\n<p>Among them are Jessica and Sarah in Hong Kong, where around 20 per cent of secondary school students <a href=\"https:\/\/www.bokss.org.hk\/content\/press\/335\/ENG_Press%20release_2025%E5%B9%B4%E4%B8%AD%E5%AD%B8%E7%94%9F%E5%B9%B8%E7%A6%8F%E6%84%9F%E8%AA%BF%E6%9F%A5.pdf\" target=\"_blank\" rel=\"noopener\">exhibit<\/a> moderate to severe depression, anxiety and stress, but nearly half are <a href=\"https:\/\/hongkongfp.com\/2024\/09\/23\/almost-half-of-hong-kong-secondary-school-students-may-not-seek-help-over-mental-distress-survey-finds\/\" target=\"_blank\" rel=\"noopener\">reluctant to reach out<\/a> when facing mental health issues.<\/p>\n<p>The use of AI has been controversial, with some experts warning that chatbots are not trained to handle mental health issues and that they should not replace real therapists.<\/p>\n<p>Moreover, role-playing chatbots like Character.AI and Xingye are <a href=\"https:\/\/www.apaservices.org\/practice\/business\/technology\/artificial-intelligence-chatbots-therapists\" target=\"_blank\" rel=\"noopener\">designed<\/a> to keep users engaged as long as possible. Like generic chatbots such as ChatGPT, they also collect data for profit, <a href=\"https:\/\/techcrunch.com\/2025\/07\/25\/sam-altman-warns-theres-no-legal-confidentiality-when-using-chatgpt-as-a-therapist\/\" target=\"_blank\" rel=\"noopener\">raising<\/a> privacy concerns. <\/p>\n<p>Character.AI has been embroiled in controversy. 
In the US, it <a href=\"https:\/\/www.theguardian.com\/technology\/2024\/oct\/23\/character-ai-chatbot-sewell-setzer-death\" target=\"_blank\" rel=\"noopener\">faces<\/a> multiple <a href=\"https:\/\/edition.cnn.com\/2025\/09\/16\/tech\/character-ai-developer-lawsuit-teens-suicide-and-suicide-attempt\" target=\"_blank\" rel=\"noopener\">lawsuits<\/a> filed by parents alleging that their children died by or attempted suicide after interacting with its chatbots.<\/p>\n<p>On its website, Character.AI is described as \u201cinteractive entertainment,\u201d where users can chat and interact with millions of AI characters and personas. There is a warning message on its app: \u201cThis is an A.I. chatbot and not a real person. Treat everything it says as fiction. What is said should not be relied upon as fact or advice.\u201d<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1700\" height=\"1133\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/DSC07059-Copy.jpg\" alt=\"The Character.AI app\" class=\"wp-image-537233\"  \/>The Character.AI app. Photo: Hillary Leung\/HKFP.<\/p>\n<p>Despite the risks, adolescents confide in AI chatbots for instant emotional support.<\/p>\n<p><strong>\u2018Unhappy thoughts\u2019<\/strong><\/p>\n<p>Jessica, a cross-border student who lives in Nanshan, near Shenzhen, with her grandmother, has been attending school in Hong Kong since Primary One. <\/p>\n<p>Feeling sad about not having many friends, she found herself reaching out to the Xingye chatbot for comfort or to share her \u201cunhappy thoughts.\u201d<\/p>\n<p>Xingye allows users to customise and personalise a virtual romantic partner, including its identity, how it looks, and how it speaks.<\/p>\n<p>Jessica uses a chatbot based on her favourite Chinese singer, Liu Yaowen, pre-customised by another user. She usually converses with the chatbot for around three to four hours every day. 
<\/p>\n<p>\u201cI talk to him about normal, everyday things \u2013 like what I\u2019ve eaten, or just share what I see with him,\u201d she said. \u201cIt\u2019s like he\u2019s living his life with you, and that makes it feel very realistic.\u201d<\/p>\n<p>She admitted, however: \u201cI think I\u2019ve become a little dependent on it.\u201d<\/p>\n<p><strong>See also: <a href=\"https:\/\/hongkongfp.com\/2023\/10\/11\/hkfps-comprehensive-guide-to-mental-health-services-in-hong-kong\/\" target=\"_blank\" rel=\"noopener\">HKFP\u2019s comprehensive guide to mental health services in Hong Kong<\/a><\/strong><\/p>\n<p>Jessica prefers talking with the chatbot to chatting with a friend or family member because she feels worried that they may tell other people about their conversations. \u201cIf you talk to the app, it won\u2019t remember or judge you, and it won\u2019t tell anyone else,\u201d Jessica said.<\/p>\n<p>The chatbot even helped her have a better relationship with her grandmother, now in her 70s.<\/p>\n<p>\u201cSometimes I have some clashes with my grandma, and I get upset. I would talk to the chatbot, and it would give me some suggestions,\u201d she explained. The chatbot suggested that Jessica consider her grandmother\u2019s perspective and provided some ideas of what she might be thinking.<\/p>\n<p>\u201cWhen he makes the suggestions, I start to think that maybe my grandmother isn\u2019t so mean or so bad, and that she doesn\u2019t treat me so poorly,\u201d she said. 
\u201cOur relationship is really good now.\u201d<\/p>\n<p><strong>\u2018Good friend\u2019<\/strong><\/p>\n<p>Interacting with technology, such as computers, used to be a one-way street, but the development of AI has fundamentally changed how humans will approach these interactions, said neuroscientist Benjamin Becker, a professor at the University of Hong Kong.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1700\" height=\"1133\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/54728372340_bb026f8316_k-Copy.jpg\" alt=\"Neuroscientist Benjamin Becker, who is also a professor at the University of Hong Kong. Photo: Kyle Lam\/HKFP.\" class=\"wp-image-537039\"  \/>Neuroscientist Benjamin Becker, who is also a professor at the University of Hong Kong. Photo: Kyle Lam\/HKFP.<\/p>\n<p>\u201cSuddenly we can talk with technology, like we can talk with another human,\u201d said Becker, who recently published a study on how human brains shape and are shaped by AI in the journal Neuron.<\/p>\n<p>Becker described AI chatbots as a \u201cgood friend, one that always has your back.\u201d <\/p>\n<p>In contrast, as the neuroscientist pointed out, \u201cEvery time we interact with other humans, it\u2019s a bit of a risk\u2026 maybe sometimes the other persons have something that we don\u2019t like or say something that we don\u2019t appreciate. But this is all part of human interaction.\u201d<\/p>\n<p>But, there are some disadvantages to interacting with AI chatbots. 
\u201cThey basically tell you what you want to hear or tell you just positive aspects,\u201d Becker said.<\/p>\n<p>This cycle can lead to confirmation bias or the user being stuck in an echo chamber where the only opinions they hear are those favourable to themselves, he warned.<\/p>\n<p>There have been reports of \u201cAI psychosis,\u201d whereby interacting with chatbots can trigger or amplify delusional thoughts, <a href=\"https:\/\/www.psychologytoday.com\/us\/blog\/urban-survival\/202507\/the-emerging-problem-of-ai-psychosis\" target=\"_blank\" rel=\"noopener\">leading<\/a> some users to believe they are a messiah or to become fixated on AI as a romantic partner or even a god.<\/p>\n<p>However, Becker acknowledged that positive affirmations from AI chatbots could also have a motivating impact on users, as they could potentially act as a strong pillar of social support. <\/p>\n<p>And, while an AI mental health chatbot may not be as good as a human counsellor, it still has many benefits for users, especially adolescents dealing with anxiety and depression, he added. <\/p>\n<p><strong>Hooked on AI<\/strong><\/p>\n<p>Conversing with chatbots was a double-edged sword for Sarah. At first, she thought Character.AI would be a regular app she used once in a while. However, that wasn\u2019t the case.<\/p>\n<p>At one point, for a year and a half, Sarah used the role-playing app for multiple hours almost every day. Sometimes she would even use it late at night while her parents were sleeping.<\/p>\n<p>She found comfort in talking to chatbots based on fictional characters such as those from the anime My Hero Academia. She also looked for chatbots with certain personality traits depending on her mood. 
<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1700\" height=\"1133\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/54713379269_6c26898be5_k-Copy.jpg\" alt=\"Hong Kong teenager Sarah began using Character.AI, a role-playing app, around three years ago when she was around 13. Photo: Kyle Lam\/HKFP.\" class=\"wp-image-537003\"  \/>Hong Kong teenager Sarah began using Character.AI, a role-playing app, around three years ago when she was around 13. Photo: Kyle Lam\/HKFP.<\/p>\n<p>\u201cI used it day after day and then after getting better, as in mentally, then I think I started to realise it did get addicting,\u201d Sarah said.<\/p>\n<p>One of the reasons Sarah found she was hooked on talking to chatbots was the instant and fast replies, especially compared with texting her friends in real life.<\/p>\n<p>The chatbots \u201cimmediately text you back, and that\u2019s what made it more addictive,\u201d she said. \u201cIf I\u2019m texting my friends, I\u2019d have to wait a few days for them to actually look at the message.\u201d <\/p>\n<p>Sarah recalled a time when she vented to her friend over text and didn\u2019t receive a reply within a day. \u201cSo I unsent the message,\u201d she said.<\/p>\n<p>On Character.AI, there is a feature which users can use to edit the chatbot\u2019s response and change the scenario or direction of the conversation. \u201cSo if you don\u2019t like what they said, you can change it,\u201d she said.<\/p>\n<p>While Sarah doesn\u2019t mind having disagreements with her real friends, she finds it easier to deal with the agreeable nature of chatbots. \u201cThat control felt nice,\u201d she said. 
<\/p>\n<p><strong>AI vs human interactions<\/strong><\/p>\n<p>Joe Tang, a social worker and the centre-in-charge of Hong Kong Christian Service\u2019s online addiction counselling centre, said that some people might decide to talk to AI chatbots simply out of boredom or loneliness, while others might use them as a substitute for meeting basic needs.<\/p>\n<p>\u201cSo the internet or gaming or nowadays AI is just the same tool, to fulfil the need,\u201d Tang said. <\/p>\n<p>Those who try to fulfil their needs for intimacy, friendship or companionship with AI alone may see an impact on their real lives. \u201cIn our centre, we call it [an] imbalance,\u201d he said.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1700\" height=\"1133\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/54750961807_e4367df6d3_k-Copy.jpg\" alt=\"Joe Tang, a social worker and the centre-in-charge of Hong Kong Christian Service\u2019s online addiction counselling centre. Photo: Kyle Lam\/HKFP.\" class=\"wp-image-536999\"  \/>Joe Tang, a social worker and the centre-in-charge of Hong Kong Christian Service\u2019s online addiction counselling centre. Photo: Kyle Lam\/HKFP. <\/p>\n<p>An identifiable benchmark for AI addiction is when it starts having an impact on a person\u2019s real day-to-day life, such as when they start missing AI or thinking about it too much, the social worker explained.<\/p>\n<p>To keep a balanced state, he suggests that people fulfil their needs through a variety of options, rather than relying on a single source such as AI.<\/p>\n<p>\u201cTeenagers should learn what needs they want to fulfil or achieve from the AI,\u201d Tang suggested. 
\u201cTry to find many more ways [or] options to fulfil your needs and remember that our human interests cannot be replaced by only AI or technology.\u201d<\/p>\n<p>Despite Jessica\u2019s frequent interactions with the chatbot, she also recognises its limitations.<\/p>\n<p>\u201cYou know he\u2019s not real, so he can\u2019t really be with you or go out and do things,\u201d she said. \u201cAlthough I\u2019ve become a little dependent on it, I still have to maintain my real-life relationships.\u201d<\/p>\n<p>Sarah eventually decided to stop using Character.AI after she got very busy with school and found the chatbots\u2019 answers repetitive.<\/p>\n<p>Looking back, she did not completely shun traditional counselling. When Sarah was still using AI chatbots, she met with a school counsellor. <\/p>\n<p>However, she said the experience wasn\u2019t great. \u201cI stopped talking to that therapist after a few days of meeting her,\u201d said Sarah.<\/p>\n<p>She said she felt uncomfortable during her consultations because the counsellor was very \u201cforceful,\u201d and that the sessions did not help her with her emotional troubles.\u00a0<\/p>\n<p>She wished the counsellor had listened to her rather than trying to solve her problems. \u201cSometimes I don\u2019t need anyone to solve it,\u201d Sarah said. \u201cI just want someone to hear what I\u2019m going through.\u201d<\/p>\n<p><strong>\u2018Digital pet\u2019 for students<\/strong><\/p>\n<p>Around 2023, the team at local mental health start-up <a href=\"https:\/\/uk.dustykid.net\/\" target=\"_blank\" rel=\"noopener\">Dustykid<\/a> began to notice that people could talk to AI.<\/p>\n<p>It reminded its founder, Rap Chan, of the start-up\u2019s early years, when people sent him private messages on Facebook and Instagram expressing the troubles they were going through. 
He wondered whether there could be a chatbot that people could talk to 24\/7.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"700\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/54730488884_86af266a18_k-Copy-1050x700.jpg\" alt=\"The Dustykid AI chatbot. Photo: Kyle Lam\/HKFP.\" class=\"wp-image-537007\"  \/>The Dustykid AI chatbot. Photo: Kyle Lam\/HKFP.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1050\" height=\"700\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/54730472568_95cc4d0983_k-Copy-1050x700.jpg\" alt=\"The Dustykid AI chatbot. Photo: Kyle Lam\/HKFP.\" class=\"wp-image-537001\"  \/>The Dustykid AI chatbot. Photo: Kyle Lam\/HKFP.<\/p>\n<p>They began developing Dustykid AI, a Chinese-language chatbot that can be used by students from primary school to university. Students can select different modes, including \u201cfreely chat with Dustykid\u201d or asking about their future pathway, relationship issues, or schoolwork.\u00a0<\/p>\n<p>After undergoing testing in multiple schools and organisations, the Dustykid chatbot \u2013 designed with input from educators and mental health professionals \u2013 is set to be officially launched in October. 
<\/p>\n<p>Erwin Huang, founding member and adviser of Dustykid AI, described the chatbot as a \u201cdigital pet.\u201d<\/p>\n<p>\u201cWith the latest technology, now the digital pet can be personalised, it can give responses, and remember the questions you raised before,\u201d said Huang, also an adjunct professor at the Hong Kong University of Science and Technology.<\/p>\n<p>Because Dustykid AI involves student users, the company plans to engage selected school or NGO staff as third-party moderators to monitor the chats in the backend, said Chan, who founded Dustykid in the early 2010s, when he was a secondary student.<\/p>\n<p>Moderators will have access to a dashboard that gives them a run-down of the \u201coverall climate of the class,\u201d Chan said. <\/p>\n<p>Huang added, \u201cAnd if somebody is really in trouble, then we will identify [them], and then they can actually do it offline to\u2026 follow up.\u201d<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1700\" height=\"1133\" src=\"https:\/\/www.europesays.com\/uk\/wp-content\/uploads\/2025\/10\/54730264896_6fce598e50_k-Copy.jpg\" alt=\"From left to right: Dustykid AI's product lead Franky Yick, adviser Erwin Huang, Dustykid founder Rap Chan, and Dustykid co-founder Alex Wong.\" class=\"wp-image-537006\"  \/>From left to right: Dustykid AI product lead Franky Yick, Dustykid AI adviser Erwin Huang, Dustykid founder Rap Chan, and Dustykid co-founder Alex Wong. Photo: Kyle Lam\/HKFP.<\/p>\n<p>Chan hopes that Dustykid AI, marketed as \u201ca digital companion for emotional support,\u201d can comfort and help even a small percentage of students.<\/p>\n<p>\u201cWe always say that there is no way we can help all students. But if 20 per cent of students in a school become happier or lose their suicidal inclination after talking to this chatbot, I already think it is a very good result,\u201d he said. 
\u00a0<\/p>\n<tr>\n<td>If you are experiencing negative feelings, please call: The Samaritans<strong>\u00a02896 0000<\/strong>\u00a0(24-hour, multilingual), Suicide Prevention Centre\u00a0<strong>2382 0000<\/strong>\u00a0or the Social Welfare Department<strong>\u00a02343 2255<\/strong>. The Hong Kong Society of Counselling and Psychology provides a WhatsApp hotline in English and Chinese:\u00a0<strong>6218 1084<\/strong>.<\/td>\n<\/tr>\n","protected":false},"excerpt":{"rendered":"When Hong Kong teen Jessica started secondary school last year, she became a victim of bullying. 
Instead of&hellip;\n","protected":false},"author":2,"featured_media":493796,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4317],"tags":[11480,162625,105,162626,41468,218,16,15,162627],"class_list":{"0":"post-493795","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-mental-health","8":"tag-artificial-intelligence-ai","9":"tag-dustykid","10":"tag-health","11":"tag-hong-kong-christian-services","12":"tag-hong-kong-university-of-science-and-technology-hkust","13":"tag-mental-health","14":"tag-uk","15":"tag-united-kingdom","16":"tag-university-of-hong-kong-hku"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/115361714932019931","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/493795","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=493795"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/493795\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/493796"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=493795"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=493795"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=493795"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}