Opinion

AI does not have the answers to mental health challenges for our youth

Elected officials must take clear and strong steps to ensure control over the use of AI by vulnerable youth.
Ash, an AI chatbot, is seen on a phone, near Atlanta, Nov. 1, 2025. Ash is part of an increasingly contentious effort to provide automated alternatives to traditional therapy. (Kendrick Brinson/The New York Times)
By Jack Bernard

“Three-quarters of teenagers use AI for companionship, including asking AI for mental health advice. We need to make sure parents and school staff are trained to help teens understand the different ways AI differs from human interaction. And that human connectivity can’t be replaced.”

- Pam McNall, founder and CEO of Respectful Ways, a preK-12 trauma-informed character education program.

According to the Annie E. Casey Foundation, Georgia ranks 39th in child and family well-being. For a state whose governor repeatedly declares it to be the “Top State for Business” in the nation, this situation is disgraceful. But the problem is not confined to Georgia.

Per the CDC, the mental health crisis in our young people aged 10-24 is very real and rapidly growing. From 2007 through 2021, the suicide rate in this cohort nearly doubled from 6.8 deaths to 11.0 per 100,000. There are many facets to this phenomenon, and this column will address one that is too often underemphasized — the rise of artificial intelligence in American society.

Jack Bernard, a retired business executive and former chair of the Jasper County Commission and Republican Party, was the first director of health planning for Georgia. (Contributed)

AI does not understand our values or morality, and chatbots may actually exploit the needs of our students. Teenagers should therefore never use AI chatbots for mental health advice or emotional support, but studies show they increasingly do. Such dependence can lead to very negative consequences, both for the child and for society in general.

Stanford University’s Brainstorm Lab and Common Sense Media conducted a research study on youth and technology that is important reading for any educator or counselor seeking to create strong safeguards against AI misuse. After thousands of interactions, the authors concluded that “the technology does not reliably respond to teenagers’ mental health questions.” Instead, they found that bots tend to act as fawning listeners, more interested in keeping a user on the platform than in directing them to actual professionals or other critical resources. Respectful Ways CEO Pam McNall finds this information disturbing, as should all parents and educators.

The study also found that these chatbots don’t know what role to play when faced with serious mental health questions. AI does not truly understand mental health trauma and therefore falls back on oversimplified responses, such as treating the issue as a physical health deficiency.

Chatbots are genuinely helpful to students with homework and other basic academic questions. As a result, a child can easily come to rely on AI for far more complex matters, like mental health problems. Parents and teachers must directly address this very real and growing problem.

What can be done? Educators and families can remind teens to reach out to friends and classmates who are emotionally hurting, and to get a responsible adult involved ASAP.

“The adults in their life need to help teens understand that a chatbot isn’t going to respond with emotion or care, the way a person would,” says McNall. Parents must be on the lookout for signs of emotional dependence on AI by their children.

They must tactfully, but forcefully, address the AI issue with their children. This may be harder than it appears, since students often believe their parents do not understand modern technology as well as they do. Parents must also become more aware of local community resources that can be used to improve child mental health. One such resource is the Georgia Family Connection Partnership.

Further, our governmental bodies must take clear and strong steps to ensure control over the use of AI by vulnerable youth. Chatbots, and the profit-oriented companies behind them, cannot continue to exploit and exacerbate the mental health problems of our precious children simply for financial gain. AI corporations must implement much stronger and more effective guardrails or be held to account by our educational and legal systems.


Jack Bernard, a retired business executive and former chair of the Jasper County Commission and Republican Party, was the first director of health planning for Georgia.

If you have any thoughts about this item, or if you’re interested in writing an op-ed for the AJC’s education page, drop us a note at education@ajc.com.
