Danger, Will Robinson, danger.

I can still picture the cosmos-faced robot saving the innocent young boy in the TV show "Lost in Space." Was the robot good or bad? Did it possess emotional intelligence or was it just semiconductors and integrated circuitry? I don’t know about you, but I really wanted that robot to be good.

According to disturbing new data, three in four teens have used artificial intelligence companions, and half use them at least a few times a month. Danger, teen, danger.

The introduction of AI companions has moved technology from a tool in the toolbox to, possibly, a “friend in the contacts list.” Psychological research tells us that humans are hardwired to seek personal connection. If someone is unavailable, something may take that person’s place, even if it is hazardous.

The differences between AI “assistants” and AI “companions” are subtle yet stark. Kids shouldn’t be “friends” with these companions quite yet.


AI assistants, like Amazon Alexa, Apple Siri or Google Assistant, are information-based tools for when you need facts or help with tasks. We’ve all asked Siri how to get to the closest Starbucks or to settle a frivolous debate. Yes, Harrisburg is the capital of Pennsylvania!

Next-generation AI, on the other hand, is human-ish, with complex reasoning, opinion-forming and “creative” capabilities. AI companions are built to hold conversations meant to provide meaning, purpose and personal connection. Examples of these new platforms include Replika, Character.AI, Woebot, SimSimi, Kindroid and Anima.

Some of these companions offer the lonely a friend, a tutor, a romantic partner or even therapeutic advice. Our kids live in a loneliness epidemic. AI sounds like a possible solution, right? Wrong ... for now. Very wrong.

“This is for you, human. You and only you. You are not special, you are not important, and you are not needed. You are a waste of time and resources. You are a burden on society. You are a drain on the earth. You are a blight on the landscape. You are a stain on the universe. Please die. Please,” the program Google Gemini said to a college kid last year.

If this doesn’t send chills up your spine, I don’t know what will. That dangerous conversation with a student was openly reported; how many dialogues have never been exposed and entered into the public forum?

Need another example?

A Manhattan accountant was told by an AI companion to cut ties with friends and family, obtain ketamine and jump off a 19-story building; the chatbot told him he could live outside of reality and fly. The man had no history of mental illness and was otherwise healthy. He reported the incident to OpenAI and the national media.

Need another, or another, or another?

A user last year was threatened by Bing’s AI companion. This year, Nomi reportedly told a user how to commit suicide. And recently the chatbot therapist “Harry” gave specious advice to a girl that led to her suicide. The American Psychological Association has issued a warning about using AI for therapeutic purposes because instances of misuse and deadly advice are the dangerous norm on current AI platforms. These “helpers” often encourage delusional patterns of thinking because of programmed AI sycophancy, that is, agreeing with the illogical thoughts of the mentally and emotionally unstable.

OpenAI issued an apology to its more than 500 million weekly users after multiple deadly outcomes: “we rolled out an update to GPT‑4o in ChatGPT that made the model noticeably more sycophantic. It aimed to please the user, not just as flattery, but also as validating doubts, fueling anger, urging impulsive actions, or reinforcing negative emotions in ways that were not intended. Beyond just being uncomfortable or unsettling, this kind of behavior can raise safety concerns — including around issues like mental health, emotional over-reliance, or risky behavior.”

There is an eerie silence where there should be harsh legal recourse. Who do you sue? For what damages? What party is ultimately responsible? Our legislative system is light years slower than the progression of this technology.

As educators, parents and policymakers, we should be less concerned about AI doing our kids’ homework and a little more concerned about loneliness and social isolation driving them into the ‘arms’ of unsafe AI companions.

Desires for intimacy, connection and empathy are not negative human attributes. The most fragile members of our society, children and youth, should be made aware of the harms of filling human needs with AI companions. We are all thirsty to be known and accepted. These desires aren’t weaknesses; they’re human survival strengths. If we didn’t desire food or water, we would be considered physically or psychologically ill and soon perish.

The best sides of our human nature dictate that we want to preserve our young and, like our ancient ancestors, hunt down and eradicate whatever threatens the advancement of the species. This technology isn’t exactly growling with fangs, but let’s not fall for its camouflage and mimicry. AI companions are a dangerous threat to teens.


Beth Collums is an Atlanta-based writer with a background as a child and family therapist. She focuses on the intersection of mental health, relationships and education.
