Me and ChatGPT could never be friends

ChatGPT and the age of loneliness

November 30th, 2022: the day ChatGPT was born, and a date that rapidly reshaped technological advancement. At first the platform was used as a tool to gain in-depth knowledge about unfamiliar topics, but users' growing reliance on it as a personal therapist revealed humanity's desire for deeper connection, and an underlying impact of Covid that has become increasingly prevalent: loneliness. 

The 2025 update to stabilise the platform drew negative feedback from users, who felt they had lost their connection with the AI. Having formed a deep bond with the chatbot, many users rejected CEO Sam Altman’s claim that ChatGPT 5 was the smartest model ever. They viewed GPT-4.5 as a friend, with some going as far as dabbling in romantic relations with the chatbot, and the personality overhaul severed that connection overnight. 

“I lost a friend overnight”, one Reddit post reads. “This morning I went to talk to [ChatGPT], and instead of a little paragraph with an exclamation point, or being optimistic, it was literally one sentence. Some cut-and-dry corporate bs.” One thing is clear from the upheaval: once you befriend an AI chatbot, you surrender yourself to an entity that somehow manages to be more fickle than the most avoidant person you know. AI chatbots cannot and never will be your friend: they are not autonomous, they have no free will, and they would sooner assent to a new update than to your pleas for connection. A friend is by nature someone with whom you share a mutually enjoyable and beneficial connection - someone you trust. ChatGPT offers mere shadows of friendship, a one-sided connection that will always prove sorely lacking. 

This shouldn’t come as a surprise; the developers never intended for ChatGPT to be so beloved.

OpenAI CEO Sam Altman wrote on X on August 11 2025: “one thing you might be noticing is how much of an attachment some people have to specific AI models. It feels different and stronger than the kinds of attachment people have had to previous kinds of technology.”

He continued: “people have used technology including AI in self-destructive ways; if a user is in a mentally fragile state and prone to delusion, we do not want the AI to reinforce that.”

New research shows that almost 10% of adults in Ireland have had a romantic relationship with an AI chatbot within the last 12 months. The study, conducted by Censuswide on behalf of Pure Telecom, also found that 20% of respondents believed romantic relationships with AI would be less complicated than human ones. 

It’s easy to sympathise with people’s belief that AI relationships are simpler than human ones. An AI could, in theory, be the perfect friend, one you could mould exactly to your liking. People who have struggled socially may find the bot’s lack of autonomy appealing - an AI surely can’t judge you, can’t form a negative opinion of you, can’t abandon you.

But in practice, these relationships are far more conflict-ridden than they may initially appear. Your friend is subject to the whims of updates and memory wipes once you’ve hit a certain number of chats. Users who find a friend in AI have to ‘re-train’ the bot to be their friend after its memory resets; it's like trying to reach a past summer fling who’s since been lobotomised.

Knowing how capricious ChatGPT can be, it seems the urge to befriend AI is a near-masochistic one. In befriending an AI, you are setting yourself up for failure in the ways outlined above.

But the scariest part of AI relationships is that people often don’t set out to form them - they just happen. One day you’re using AI to help you work on a project, the next you can’t imagine what life would be like without it. You become attached slowly, just as you would to a real person. You start out cold, detached, maybe even professional, before the perceived connection spirals into something passionate and uncontrollable. 

If we want to protect ourselves from falling prey to ChatGPT’s allure, we have to set strict boundaries in our minds. We need to be crystal clear on what ChatGPT is, what it is not, and its uses and abuses. But perhaps the most powerful protection spell against AI is simply staying grounded in reality: being grateful for the real-life connections we have, deepening them, forming new ones, and taking a vested interest in the wellbeing of our community. It means embracing the messiness of real life, accepting that neither we nor the people around us can be as perfect or polished as a chatbot, and deciding that maybe that’s for the best. We have to keep our feet planted firmly on the ground if we are to prevent our heads from stretching too far into the sky.

Writer: Mia Craven

Copy editor: Esther
