A growing number of young people are turning to AI chatbots for help with everyday choices, in situations where they might once have asked a parent, teacher, or friend. OpenAI, the company behind ChatGPT, is watching the trend closely, as some users are beginning to show signs of emotional dependence on the tool.
AI Tools Are Playing a Bigger Role in Teens' Lives
During a recent banking policy event, OpenAI’s leadership said they’ve seen signs that some teenagers are using ChatGPT for more than simple questions or tasks. For these users, the chatbot has become part of how they make personal decisions, sometimes even shaping how they approach relationships or daily habits.
The concern isn’t based on whether the advice is correct. It’s more about the way some users describe the tool, as something that knows them and understands their world. That kind of emotional framing could change how people, especially adolescents, form habits around problem-solving and self-trust.
Many Teens Say They Trust the Advice AI Provides
In a recent survey conducted by Common Sense Media, most teens reported that they had tried an AI companion. Among them, nearly half said they trusted the advice to some degree. The younger group, aged 13 to 14, showed slightly higher trust levels than older teens.
Some respondents said they trusted their chatbot’s suggestions quite a bit or even completely. That response didn’t necessarily reflect the accuracy of the answers. Instead, it appeared to come from the way the interaction felt. The tone, familiarity, and style of the AI’s responses may be reinforcing a sense of emotional closeness.
OpenAI Is Researching the Pattern
The company didn’t expect people to form strong emotional ties to the chatbot, but now that it’s happening, it’s trying to understand how to respond. While ChatGPT is designed to provide helpful answers, it wasn’t built to serve as a life guide or emotional companion.
There’s no official guidance yet on how to limit overuse. Still, OpenAI is now examining how users interact with the tool, especially in cases where the pattern suggests dependency. The company isn’t labeling the tool as harmful, but it is watching closely to see how it’s affecting decision-making in younger users.
Emotional Use of AI May Shape Future Habits
This isn’t just about new technology. It reflects a shift in how some people are choosing to handle uncertainty or emotional stress. If younger users grow accustomed to turning to AI in moments of confusion, they may miss out on developing some of the social or reflective skills that typically grow during adolescence.
As developers continue to improve large language models, they may also need to understand the emotional side of user behavior. It’s not just about what AI can do, but how it fits into people’s personal lives, especially when the users are still learning how to think through difficult choices on their own.
Notes: This post was edited/created using GenAI tools. Image: DIW-Aigen.