In the midst of a routine scroll through my Instagram feed, one particular post caught my eye: “They Fell in Love With A.I. Chatbots, and Found Something Real.” It was a post from the New York Times Magazine, which included photos of three middle-aged individuals, each with a snippet on how romance with an A.I. chatbot had changed their lives. In disbelief, I opened my laptop to read the article and learned about a site one of the interviewees had used: Replika. The “Our Story” portion of Replika’s website contains a video detailing the story of its founder, Eugenia Kuyda, and how she created this A.I. chatbot companion. The video explains that Kuyda’s best friend had died a few years earlier, and that in her grief, Kuyda realized she could preserve his memory by feeding his old text messages and emails into A.I., creating a digitized version of him that she could continue to interact with and confide in.

These A.I. relationships have been on the rise, as evidenced by the more than 27,000 participants in a Reddit community built around them. Participants have defended these relationships, citing increased emotional fulfillment and decreased feelings of loneliness. In addition to Replika, people also use ChatGPT to form online companions. Oftentimes, these relationships do not start off as romantic or intimate; rather, people begin using A.I. as a tool to complete tasks and projects. Over time, as they interact more with these models, they come to disclose more personal information, especially feelings and emotions, and thus form deeper relationships with the chatbots.

On the one hand, it’s amazing how the seemingly realistic emotional intelligence of these A.I. systems has the power to transform human relationships with machines from purely utilitarian ones into emotional ones. On the other hand, it raises the question of what differentiates human beings from machines now that either can fulfill the need for social and “human” interaction. When I think about A.I. models like Replika, especially its origin story, I worry about what normalizing the dissolution of the boundary between humans and machines will mean for the future. Consider that Replika was created because its founder, Kuyda, was unable to face the grief of losing a close friend. Grief is arguably a core human experience and one that each of us must face eventually. Experiencing loneliness and sadness, in reasonable measure, is also inextricable from what it means to be human, as is the relief we feel when we confide in others and are assured that we are not alone. Forming relationships with A.I. chatbots fundamentally hinders real connection, which largely stems from finding common ground in the many complex emotions of being human.

As a whole, it was shocking to watch Replika’s origin story video and hear its founder and employees assure viewers that replacing real human interactions with a chatbot is perfectly normal and, in some respects, better. This assurance seems like a gateway to deepening the very issues that Kuyda insists A.I. chatbots like Replika solve. If people replace live human interactions with inorganic ones in order to combat feelings of loneliness and despair, they may only increase the social distance and alienation that produced those feelings in the first place.

As A.I. models continue to evolve and develop, it is important that A.I. developers and designers keep in mind what the role of A.I. ought to be. But if technology companies will not develop their models with ethics at the forefront, it is perhaps more important that users of A.I. do their part. This means limiting our own use of A.I. to more basic, utilitarian functions rather than using it to replace human connection. The emergence of A.I. as a way to manage difficult human emotions perhaps points to greater issues: a lack of both community and a sense of responsibility to be there for those around us. It is our collective responsibility to serve as that community, regardless of how big an impact we believe we have on others.
