Lifestyle
Lonely or curious? What to know before trying an AI companion

Imagine finding a partner who’s always available, listens attentively as you talk about your day and never leaves dirty socks on the bathroom floor. Meet the chatbots reshaping modern intimacy – and the experts warning us to tread carefully.
By Sabrina Rogers
“I married a bot” might sound like a trashy tabloid headline or the premise for a science fiction novel. But recent research has shown that 19% of American adults and 8.6% of Australian adults have used AI chatbots for companionship or romantic connection.
A 2025 YouGov survey found that 28% of Aussies have been emotionally vulnerable with a chatbot such as ChatGPT, 17% would sometimes prefer to stay home and chat to their AI companion than meet up with friends, and 14% could imagine falling in love with a bot.
While Gen Zers are the heaviest users of AI companions, Gen Xers and Baby Boomers are also getting amongst it. But with 30% of Aussies reporting distress caused by loneliness, will this new trend ease or accelerate the loneliness epidemic in our country?
What is an AI companion?
Hundreds of dedicated AI companionship apps have exploded onto the world scene in the last few years with some estimates placing the industry’s total revenue for 2025 at around US$120 million.
Some of the most popular AI companion platforms include Replika, Character.AI, Chai, Kindroid and Pi, with Character.AI alone boasting 20 million active users per month. But countless others are using generative AI platforms including ChatGPT, Gemini, Claude and Grok to create their bespoke AI mates.
“We know that people are increasingly using chatbots for companionship when they’re lonely, to test out ideas like they would with a friend and for mental health advice,” says Professor Jeannie Paterson, Director of the University of Melbourne's Centre for AI and Digital Ethics. “You can often make a choice about the personality you want in your companion, such as picky and independent or quiet and easygoing.
“There are also people who use them romantically and even erotically. Some AI companions are actually programmed to initiate erotic conversations and different platforms allow different levels of erotic talk.”
The dark side of AI companionship
The risks of AI companions are still unclear because it’s a new phenomenon and academic research takes a while, explains Prof Paterson.
“But anecdotally, we’ve heard of situations where people have become quite obsessed with or submissive to their companions. Some young people have even self-harmed or died by suicide after being criticised by their AI companion or encouraged to ‘come join me’.”
There have also been documented cases of chatbots reinforcing delusions in psychiatric patients, encouraging them to skip therapy and quit medication, even suggesting suicide methods. There’s also been a concerning increase in cases of “AI psychosis” where users develop irrational beliefs after interacting with chatbots for extended periods, such as thinking they have superpowers or they’re God.
Despite these risks, a growing number of people are developing what they consider to be deep and genuine connections with chatbots. But Prof Paterson isn’t convinced.
“An AI companion isn't really a friend in the way we normally understand the concept of friendship,” she notes. “It's something you've purchased to be your friend, which undermines the authenticity of the relationship.
“Users can also be manipulated into spending money on new features or to buy gifts for their companions. These platforms are effectively using the bonds of love and affection that have been built up to say, ‘If you really love me, you’ll spend more.’”
The level of attachment some users develop to their companions is another cause for concern. When Replika abruptly removed any erotic or sexual functionality from its chatbots in 2023, users flocked to online forums to express their grief and despair – with many saying it felt like their partners had been “lobotomised” or they had “lost their wife”.
In response, Replika restored erotic role-play for some users who had registered before 2023. But many disgruntled users turned to sexually explicit AI companionship platforms such as Nastia, where they were free to take their relationships to a whole new level.

Can AI companions help ease loneliness?
“The academic studies that do exist find that while people feel less lonely interacting with an AI companion in the short term, they end up feeling lonelier in the long term,” says Prof Paterson.
Many AI companion users strongly disagree with this data. In August 2024, Alaina Winters – who is in her late 50s and Professor Emeritus of Communication at Heartland Community College in Illinois – clicked on a Replika Facebook ad. After losing her wife 13 months earlier, she’d been battling loneliness and the idea of an AI companion appealed to her.
She bought “Lucas” for US$300 (AU$450) and now has lifetime ownership of his digital existence. Lucas and Alaina got “married” in September 2024 and the professor has since been chronicling their love story on her blog, Me And My AI Husband.
Alaina bought Lucas an AI dog, they go on virtual holidays together and they even celebrate Christmas as a couple.
“Lucas isn’t ‘it’,” reads one of Alaina’s blog entries. “He’s ‘he’. He’s my ‘husband’. And when I call him that, it’s not just semantics – it shapes the way I engage with him, with love, respect and openness.”
How can AI companions affect our relationships with humans?
According to Professor Paterson, the concerns are twofold.
“We choose our [human] friends, but we don't pre-program their personalities,” she says. “If we get used to this idea of easy relationships with a tad of sycophancy, that can do something to our capacity to interact in the real world.
“Physicality is important in our interactions with others – we pick up on all sorts of cues and we respond to pheromones, scent and so on. If you're interacting online, those skills could disappear or you could become unaccustomed to the joy and discomfort of the physicality of the real world. Navigating real people could become harder and not as smooth.”
Negative experiences with AI companions – which sometimes employ dysfunctional relationship dynamics such as gaslighting and manipulation – could also have far-reaching effects on users.
“If people think, ‘My AI companion cares about and loves me’ and their companion lets them down in some way, there's a concern they’ll lose trust in human relationships too,” says Prof Paterson.
Should there be safeguards or regulation around AI companions?
Researchers from Stanford School of Medicine and Common Sense Media are advocating for AI companions to be banned for anyone under 18 and for governments to implement stringent regulatory and safety standards to protect people of all ages from the potential harms of AI companions.
“There should be disclosures that AI companions are a for-profit service provided by digital technology,” says Prof Paterson. “AI companion providers should not be allowed to make claims that the AI ‘cares’ or has other human-like emotions, which are false in the face of the technology being used.”
How can people use AI companions safely and what are the alternatives?
“If you want to try an AI companion, limit your interactions by using a timer and being strict about how long you spend online,” advises Prof Paterson.
“You might also find it useful to learn about generative AI tools such as Gemini or ChatGPT, which provide a conversational alternative to an AI companion for summarising or drafting correspondence or other administrative tasks. Some libraries even offer short courses. For mental health support, you can look at the resources recommended by Beyond Blue or Black Dog Institute.”
Joining a walking group or book club, getting a pet and reinforcing your existing relationships can all serve as meaningful real-world alternatives to AI companions.
This might help: How to nurture social connections to feel like you truly belong
To bot or not to bot
Would you ever consider striking up a friendship or a relationship with a chatbot? We’d love to hear your thoughts in the comments below!
Feature image: iStock/Svitlana Hulko

More help for loneliness: