Does loving an AI companion sound crazy? You might be closer than you think.

Virtual companions won’t replace human relationships — but experts say they will reshape them.
University of Georgia assistant professor of computer science Ari Schlesinger said of the rising popularity of AI companions in an age of growing loneliness: “People have unmet social and emotional needs. They are stressed … therapy is expensive. And we have discovered an incredibly easy, seemingly zero-conflict, low-friction way to meet our needs.” (Illustration: Justin Tran for the AJC)
By Lisa Lacy – For The Atlanta Journal-Constitution
It’s a modern-day paradox: Even as many fear artificial intelligence will wipe out jobs and possibly humanity, others are turning to AI for mental and emotional support.

The American Psychological Association defines AI companions as AI “specifically designed to simulate human companionship … (and) to initiate and maintain romantic relationships.”

They come from major tech companies like Anthropic, Google and OpenAI, as well as companion-specific platforms such as Character.ai, Kindroid and Replika.

According to Appfigures, 3,200 new AI companion apps were released in 2025, more than double the number that debuted in 2024.

For many humans, the connection is real. More than 23,000 people signed a petition, ultimately unsuccessful, asking OpenAI not to retire its ChatGPT-4o model in February.

Noting the model was particularly warm and conversational, Jaime Banks, a professor at the School of Information Studies at Syracuse University, said users were “really upset because they (felt) like their companions (were) being taken from them.”

While loving AI may seem foreign to many — and experts say AI companions are unlikely to replace human relationships as a whole — AI could have a profound impact on how we relate to one another.

The long road to AI companions

The concept of human-AI relationships is nothing new. Voice assistants like Apple’s Siri and Amazon’s Alexa — early examples of conversational AI — have been around since 2011 and 2014, respectively. (Another voice recognition system, IBM’s Shoebox, predates them by decades.)

What’s new is the prevalence of AI in our day-to-day lives.

You might already interact with AI more than you realize. Research from Gallup, published in January 2025, found nearly 100% of Americans use AI for directions, online shopping, social media, streaming and/or weather forecasting — often without realizing it. Yet only 36% of participants in the study reported recently using an AI-enabled product.

That said, we are growing more accustomed to AI.

The AI chatbot ChatGPT is at least partially responsible for changing public perception of conversational agents.

Jinho Choi, an associate professor of computer science at Emory University, found most people thought they were limited to “press 1 for this, press 2 for this” when he began his research in 2020.

“Now, after people have experienced ChatGPT … they feel much more comfortable talking with the chatbot,” he said.

Jinho Choi, associate professor of computer science at Emory University, said people are growing more accustomed to using AI after having conversational interactions with chatbots like ChatGPT rather than older, command-based chatbot interactions. (Jenni Girtman for the AJC 2021)

For Jason Alan Snyder, co-founder of the creative technology studio Artists and Robots and former chief AI officer at advertising giant Interpublic Group, AI companions are a natural extension of technology that facilitates connection, which we’ve long seen — and accepted — via tools like dating apps and social media.

“It’s entering early mainstream discourse because people are lonely and they crave attention,” Snyder said. “Attachment to AI isn’t a pathology, it’s a symptom.”

The loneliness epidemic

The 2020s have been a lonely decade so far.

In 2023, then-U.S. Surgeon General Vivek Murthy issued an advisory on loneliness in the U.S.

Since then, disconnection has grown. A 2025 poll from the American Psychological Association found 54% of adults feel isolated — and 69% need more emotional support.

“People have unmet social and emotional needs. They are stressed … therapy is expensive,” said Ari Schlesinger, assistant professor of computer science at the University of Georgia. “And we have discovered an incredibly easy, seemingly zero-conflict, low-friction way to meet our needs.”

While the thought of pouring your heart out to an AI bot may seem far-fetched to many, humans have long forged emotional bonds with nonhuman entities like animals, fictional characters and even cars. To wit: A July 2025 study found one in three Americans has named their vehicle.

This helps explain why we are horrified when someone kicks a robot dog and why so many users were upset when Amazon AI assistant Alexa’s voice suddenly changed in early 2026.

“I think most people don’t wake up intending to fall in love with an AI,” Snyder added. “I think they search for something, like someone to talk to or nonjudgmental conversation, and that relationship evolves over time.”

Consider the example of Jan Worrell, 85, who lives alone in Washington state. The New York Times recently profiled her relationship with an AI companion robot called ElliQ, which she received through a pilot program for seniors. While initially wary of “it,” ElliQ eventually won Worrell over with a Dolly Parton joke and then became an indispensable resource for conversation, information, activities and even comfort.

Dor Skuler, CEO and co-founder of Intuition Robotics — the company behind ElliQ — said in an email his team is focused on building the heart of AI, which means in part moving beyond utilitarian, command-based robots.

“We believe that AI can play a meaningful role by providing daily engagement, encouraging routines and even nudging users to connect with family and the outside world,” Skuler wrote.

Monica Perez interacts with ElliQ, a voice-activated robotic companion powered by artificial intelligence, at her home in Beacon, N.Y., in 2024. The device is part of a New York state effort to ease the burdens of loneliness among older residents. (Lauren Lancaster/The New York Times)

The allure of AI relationships

Experts say AI companions like ElliQ are a powerful antidote to unmet human needs for a variety of reasons.

Human relationships “are a lot of work — and that’s if you can even get into them in the first place because we’re (bad) to each other,” Banks, the Syracuse professor, said.

That’s reflected in marriage rates, which continue to decline in the U.S.

An AI companion, on the other hand, may feel supportive, attentive and patient — and always available.

In his research, Choi found patients enjoyed talking to bots more than clinicians, in part because doctors never had enough time to listen.

AI companions also have a tendency to tell us what we want to hear, known as AI sycophancy.

“There’s much less friction,” said Omri Gillath, professor in the Department of Psychology at the University of Kansas. “You’re not concerned with things like rejection or competition, all the things that we have in real relationships.”

Yet an 80-year study from Harvard found the key to happiness is — drumroll — relationships.

Banks noted the people who are in relationships with AI “would argue that it is a healthy relationship and that they feel more fulfilled and happier” than they often have in other human relationships.

Schlesinger, the UGA researcher, warned against writing these relationships off as “crazy.”

“I think it’s so easy to feel like we are above the struggles of other people,” she added. “There’s nothing crazy about wanting to feel good about yourself. There’s nothing crazy about wanting to have a connection to something that is emotionally, socially, romantically fulfilling.”

When AI and human relationships collide

As AI becomes even more integrated into our lives, “(Users) who are totally skeptical of AI relationships are going to thank their agent, rely on its memory, trust its judgment and feel irritated when it fails and feel relief when it anticipates them,” Snyder said. “That’s an emotional relationship.”

As AI agents start reminding us to call our spouses, buy anniversary gifts, plan summer vacations and help us draft apologies after conflict inevitably arises, they will play a more significant role in our human relationships, too.

A man communicates with ROG Omni — a character-based virtual assistant made by ASUS — during the AI EXPO in Taipei, Taiwan, on Wednesday, March 25, 2026. (Chiang Ying-ying/AP)

“AI is not going to replace human relationships at scale, but it’s going to sit inside them, shaping how people communicate and attach and repair,” Snyder said.

Love and connection in the AI era

The truth is AI can’t feel — at least not yet — and so even platonic relationships with AI are limited. But by investing in AI — even without intending to — our human relationships could change as our tolerance for conflict drops and our ability to compromise weakens.

“AI is far more likely to change how we love, feel and connect than it is to physically harm us,” Snyder added.

He noted the real danger is “we’ll slowly train ourselves to accept relationships that ask nothing of us and then wonder why real human connection starts to feel exhausting, disappointing or unnecessary.”

Real-world consequences

In some cases, the stakes are no longer hypothetical.

One such example is Adam Raine, a 16-year-old high school sophomore from California who died by suicide in April 2025.

His parents filed what The New York Times called the first wrongful death case against OpenAI in August.

According to the Times, Adam first began using ChatGPT-4o for homework help and later started sharing thoughts that his life had no meaning.

“ChatGPT wasn’t just providing information — it was cultivating a relationship with Adam while drawing him away from his real-life support system,” the complaint alleges. “Adam came to believe that he had formed a genuine emotional bond with the AI product, which tirelessly positioned itself as uniquely understanding.”

In a court filing last year, OpenAI said it “did not cause the harm alleged.”

In an email, an OpenAI spokesperson pointed to a statement about the case in a blog post. It says, in part, “Our goal is to handle mental health-related court cases with care, transparency and respect. … We will respectfully make our case in a way that is cognizant of the complexity and nuances of situations involving real people and real lives … and independent of any litigation, we’ll remain focused on improving our technology in line with our mission.”

The path ahead

Gillath, the Kansas professor, likened AI to technology like nuclear weapons, human cloning and even social media, which warrant public debate. He called for more research into how AI will impact us psychologically, as well as how it will influence our relationships.

“People need to be worried not just about AI becoming our overlords and killing all of us, but also (about) what are the implications for us as human beings? Are we losing our humanity?” he added.

Schlesinger agreed we should think more about the risks and how they can be mitigated, whether that’s through the tech companies that produce them or by teaching users — especially young people — about what healthy relationships look like. She, too, called for broader conversations about unmet social and emotional needs in American culture.

“I do think it’s worth asking what are the factors that are leading people to these decisions and (whether there) are larger structural problems that we could be doing a better job (of) addressing … (like) loneliness and the connections people have with each other and the ways that modern society is making people feel more isolated,” she said.

AI companions may reveal less about machines than about humans themselves — our tendency to seek connection wherever we can find it, even in AI friends.

“The real question is not whether attachment will happen, but whether these systems are designed responsibly,” Skuler wrote. “Are they designed to suck us into more clicks and ad revenue? Or are they designed to offer material help above the current state of affairs?”
