In our busy societies, holidays can be rare oases: sources of plentiful free time that is so elusive the rest of the year. Time, that is to say, available for what we truly want to do and not just what we have to do.
How someone stewards their free time is as good a window into a human soul as anything. And what most people seek when not constrained by duties is companionship, whether they realise this or not (in both senses of that word). Companionship — not just company — is what Christmas and other holidays should replenish us with. But if the talk of a loneliness epidemic is anything to go by, we are becoming strikingly bad at achieving it.
The emotional struggle for companionship — a struggle of the heart, which we increasingly seem to be losing — could well be rooted in a struggle of the mind, an ever weaker ability to grasp what companionship even means. If that’s the case, one need not look far for a culprit.
Those who remember the world before Facebook may also remember how the social network subtly usurped the meaning of the word “friend”. The arrival of “friending” infected what the word used to mean, and not for the better. And when you don’t have a word for the real thing you want, no wonder it becomes harder to find.
Lexical degradation was quickly matched by the erosion of actual social ties. There is good reason to blame social media in particular, and screen-based digital contact in general, for undermining social competence: “friending” crowding out friendship, “connections” isolating all of us even more. And now technology has come for “companionship” itself.
In 2025, “AI companions” went mainstream (including one called, yes, “Friend”) as the answer to our apparently unfulfilled desire for companionship. None other than Mark Zuckerberg (him again) identified the gap between people’s reported number of actual and desired friends as the new business case for AI. The same industry whose previous products deprived us of companionship is proposing a new one to fill the void. As an antidote, try to take seriously what friendship, connection and companionship really require.
There are innocuous enough uses of AI “companions”: role-playing, keeping alive memories of lost loved ones, even therapy and advice. In other words: games, ghosts and God — these are just contemporary versions of humanity’s eternal displacement activities.
They do not, however, bring companionship. And one reason is the “companions’” very usefulness. Michel de Montaigne had it right: companionship’s only value is itself. Seek out company because of its usefulness to you, and it is not companionship you will be getting. When asked what makes a true friendship, Montaigne said: “because it was he; because it was me”. To conceive of friends as something to seek an optimal number of is to ignore what a friend is.
Can one even say “it was he” (or she) about an AI companion? The allure is the opposite: of being so closely attuned to its users that it “gets” them in a way nobody else can (no one can match an always-listening surveillance device). A mirror, then, not an other. The ideal, perhaps, is something like a daemon in Philip Pullman’s literary fantasy world: a separately embodied part of a human’s soul. But daemons are not companions: mirroring a soul brings nothing that wasn’t already there. “Because it was me; because it was me” doesn’t work.
If AI companions do bring something of their own, it is because they are engineered to do so by their makers. They have been designed, after all, to be useful and to please. Why would anyone pay for them otherwise?
The point of AI companions is to take away bad feelings — of alienation, inadequacy, doubt and all the things that make us fear actual company. They promise the pleasures of society without the risk of being with others. AI “friends” are to companionship what pornography is to sexual intimacy. Both are solipsism masquerading as interaction. That, surely, is the root of AI companions’ creepiness.
It is also why they must fail. Companions without internal lives of their own (“happy slaves”) and friendship without friction cannot address solipsism, only flatter us into believing our self-absorption is something else. There are paradoxes we cannot engineer our way out of, and this is one: solipsism can only end with your own recognition and embrace of your aloneness. Being appreciated by those who were not purpose-made to appreciate you is a start.
So this holiday season, see people, but aim at nothing. Be kind to yourself, and be kind to others. And try to leave your phone at home.