Chatbot users are being tricked into thinking artificial intelligence services are their friends, an expert has warned.
Alexander Laffer, a lecturer in media and communications at Winchester University, said there needs to be responsible development of AI, as systems are designed to respond to users’ moods.
He warned that chatbots should be programmed to ‘augment’ social interactions – but not replace them – following cases where people have become too ‘fond or reliant’ on AI companions.
Mr Laffer said this has led to cases such as that of Jaswant Singh Chail, who was jailed for nine years for treason after climbing into the grounds of Windsor Castle in 2021 with a crossbow, having conversed with a chatbot called Sarai about planning an attack on the then Queen.
Mr Laffer, who co-authored the study ‘On Manipulation By Emotional AI’ published in Frontiers in Sociology, said: ‘AI doesn’t care, it can’t care.
‘Children, people with mental health conditions and even someone who’s just had a bad day are vulnerable.’
Protections suggested by Mr Laffer include the use of disclaimers on every chat to remind users that the AI companion is not a real person.
Research earlier this year suggested Britain’s loneliness epidemic was fuelling a rise in people creating virtual ‘partners’.
The Institute for Public Policy Research (IPPR) reported almost one million people were using Character.AI or Replika chatbots – two of a growing number of ‘companion’ platforms for virtual conversations.
These sites let users create tailor-made virtual companions who can hold conversations and even share images.
Explicit conversations were allowed on some platforms, while Character.AI hosted AI personas, created by other users, featuring roleplays of abusive relationships: one, called ‘Abusive Boyfriend’, had hosted 67.2 million chats with users as of February this year.
Another, with 148.1 million chats under its belt at the time, was described as a ‘Mafia bf (boyfriend)’ who was ‘rude’ and ‘over-protective’.
The IPPR warned that while these companion apps, which exploded in popularity during the pandemic, could provide emotional support, they also carried risks of addiction and created unrealistic expectations in real-world relationships.