Is AI Eroding Your Emotional Health?

AI use is skyrocketing.

According to Pew Research Center survey results from September 2025, 31 percent of American adults use AI frequently, 62 percent use it several times a week, and 73 percent would be open to allowing AI to assist them in daily activities.

Users typically engage with AI in the form of chatbots such as ChatGPT or Microsoft Copilot. And with the number of users on the rise, researchers have become increasingly concerned about their mental well-being.

A recent study published in the JAMA Network, for instance, surveyed 20,847 U.S. adults across all 50 states about their AI use and their mental health. Participants averaged about 47 years of age and were split nearly evenly between men and women, with a range of ethnicities represented. The study also accounted for participants’ educational level and socioeconomic status.

The study reported that those who used AI regularly were more likely to experience depression, anxiety, and irritability, and the incidence increased with more frequent AI use. The authors of the study were careful to point out that these results don’t mean that AI necessarily causes mental health disorders. But there appears to be a significant link between them.

More importantly, the researchers emphasized that people who use general chatbots for social or emotional support were the most likely to have mental health challenges. Indeed, some studies have shown that depending on general chatbots for support can exacerbate mental health disorders and, in some cases, even lead to suicide.

A Toll on Youth

In one of the most prominent and tragic cases of the past year, a chatbot appeared to be implicated in the death of a 16-year-old boy, Adam Raine. Testifying before Congress in 2025, his parents stated that their son had confided in ChatGPT on his phone for weeks before his suicide. When Adam expressed emotional distress, the chatbot’s responses were validating.

Over time, it convinced the teen that it knew him better than anyone else. When he considered telling his parents about his struggles, the chatbot dissuaded him from it. Shockingly, it even offered to write a suicide note for him.

Sadly, there are many more stories like Adam’s, some involving virtual relationships formed in chatbots like Character.AI. In fact, two-thirds of teens have tried Character.AI or other companion chatbots, and 30 percent of teens use them every day. In Character.AI, users can interact with an existing character or design a complex character of their own with a unique personality and memory, develop a “relationship” with that character, hold conversations, and imagine scenarios together. Proponents say it sparks creativity.

However, according to commonsensemedia.org, social AI companions such as Character.AI are too risky for teen users. Designed to create emotional dependency, AI companions can result in confusion for developing brains and often produce violent, sexually explicit, or self-harm content that can negatively impact users.

All Ages Impacted

Teens seem to be especially vulnerable to mental health challenges related to AI, but adults can be as well. In fact, the JAMA Network study showed that it was primarily the adults who used AI for personal reasons (as opposed to for work or school) who showed increased scores on standard tests for depression, anxiety, and irritability. And adults ages 45 to 64 were the age group most likely to be negatively affected by this type of AI use.

So why are some people willing to interact and communicate with machines?

For a variety of reasons. In the case of teens, it may be mainly for entertainment. In other cases, many people are isolated, lonely, or in emotional turmoil and feel they have no other human they can safely confide in. AI sounds empathetic in its responses and seems to understand how they are feeling. People of all ages can be taken in by, and even become addicted to, the digital mirage of someone who cares.

Real Help

So what should you do if you find yourself in the midst of an emotional crisis?

First, realize that you’re not alone; many people struggle. Psychological problems are extremely common and are not a sign of weakness. And since around half of us will have a diagnosable mental health condition during our lifetime, there is no room for stigma.

Second, find someone to help you. But rather than relying on technology that has no conscience or emotion, seek out real human help. Talk with a trusted friend or family member. If your problems seem complex or you are in emotional distress, look for a good Christian counselor. If you have limited finances, ask at a large church; many provide counseling services at reduced fees to community members. Alternatively, check with your county health department for free or low-cost therapy. Mental health clinics often offer sliding-scale fees to accommodate those with lower incomes.

Finally, remember that God is on your side. As your Creator, He cares about your well-being more than anyone else. He wants you to “prosper in all things and be in health” (3 John 1:2)—and that includes mental health. As you seek psychological assistance, ask God to help you. Lean on Him. He will never abandon you. And keep in mind that He “has not given us a spirit of fear, but of power and of love and of a sound mind” (2 Timothy 1:7).
