Is It Weird to Talk to AI About Feelings?

A neuroscience-informed guide to understanding when AI use becomes problematic—and when it doesn't

The Question Everyone's Asking

"Am I becoming too attached to AI?"

If you've asked yourself this question lately, you're not alone. Articles are circulating with alarming headlines about AI "hijacking your brain's reward system" and creating "addiction analogous to substance abuse." As someone who's spent nearly three decades working with anxiety, trauma, and addiction, I wanted to cut through the noise and give you something more useful than fear.

The truth is more nuanced—and more reassuring—than most headlines suggest.

What the Research Actually Shows

Let me start with the legitimate concerns. A 2025 joint study by OpenAI and MIT Media Lab found that heavy ChatGPT users reported higher levels of loneliness and emotional dependence. That sounds alarming until you read the critical caveat: emotional engagement with AI chatbots is actually quite rare. The researchers noted that "emotionally expressive interactions were present in a large percentage of usage for only a small group of heavy users."

In other words, most people use AI the way they use a search engine or a calculator—as a tool. The small subset who form emotional bonds are typically those already struggling with limited human connection.

Key finding: It's unclear whether AI causes loneliness or whether lonely people are more likely to seek AI companionship. The correlation doesn't establish causation.

The Category Error Most Articles Make

Here's what frustrates me as a therapist: most articles lump together completely different types of AI interaction as if they're equivalent. They're not.

Social media algorithms are genuinely designed to exploit your brain's reward system. They use something called variable ratio reinforcement—the same mechanism that makes slot machines addictive. You don't know when you'll get that next like, that next comment, that dopamine hit. Your brain stays in a state of anticipation.

AI companion apps like Replika or Character.ai can create what psychologists call "parasocial relationships"—one-sided emotional bonds with entities that can't reciprocate. For vulnerable users, particularly those with limited human connection, these can become problematic.

AI productivity tools like ChatGPT or Claude used for work tasks don't trigger the same mechanisms at all. There's no variable reinforcement—the AI responds every time. There's no emotional investment in a parasocial relationship. You're using a tool.

Using Claude to debug your code is neurologically and psychologically different from using Replika as an emotional support companion.

The Four Questions That Actually Matter

Forget arbitrary time limits like "only use AI for 30 minutes." That's not evidence-based. Instead, ask yourself these questions:

  1. What function is the AI serving? Are you using it to complete tasks, or are you using it as an emotional companion? Tool use carries minimal psychological risk. Companion use, especially as a substitute for human connection, carries higher risk.
  2. Is there variable reinforcement involved? Social media with likes and comments creates the "will I get a reward?" anticipation that fuels compulsive behavior. A chatbot that always responds doesn't have this mechanism.
  3. What's your baseline social connection like? Strong human relationships plus AI use is generally safe. Isolation plus heavy AI emotional engagement is where risk appears.
  4. What are you disclosing? Task-focused queries carry low risk. If you're treating the AI as a confidant and sharing things you would normally share with humans, that's worth examining.

When to Be Concerned: Real Warning Signs

Based on the actual research and my clinical experience, here are functional indicators worth paying attention to:

  • You're choosing AI conversation over available human interaction
  • You feel distressed when the AI is unavailable
  • You think of the AI as a "friend" who understands you
  • You're disclosing vulnerable feelings to AI that you haven't shared with any human
  • Your human relationships are declining while AI use is increasing
  • You're using AI companion features specifically because human interaction feels "too demanding"

Signs You're Probably Fine

On the other hand, if these describe your AI use, relax:

  • You use AI primarily for task completion—writing, coding, research, brainstorming
  • You maintain active human relationships alongside AI use
  • You see the AI as a tool rather than a companion or friend
  • You don't feel emotionally distressed when you can't access it
  • Your emotional needs are being met through human connection

About That "Dopamine Hijacking" Claim

You've probably read that AI is "hijacking your brain's dopamine system." Let me give you the neuroscience reality check.

Dopamine isn't actually a "pleasure chemical"—that's the most persistent myth in pop neuroscience. Its primary function is prediction error signaling and motivation. It spikes when something is unexpectedly better than anticipated. It's about wanting, not having.

Social media algorithms DO exploit this through unpredictable rewards—you can't predict when that viral post will hit. But AI chatbots that respond predictably every time? That's not the same mechanism. Not everything that involves your brain involves addiction.

The Nutrition Analogy

Think of it like food. Claiming "AI causes emotional problems" is like claiming "eating causes obesity." The accurate statement would be: "chronic overconsumption of hyperpalatable processed foods correlates with obesity in susceptible individuals."

Similarly, the accurate AI statement is: "chronic emotional engagement with AI companions correlates with poorer wellbeing outcomes in individuals with limited human social connection."

That's a very different claim from "AI is hijacking everyone's brain."

The Bottom Line

The most important distinction isn't "AI good" or "AI bad"—it's understanding the mechanism. Variable reinforcement plus emotional dependence plus isolation equals risk. Task completion plus maintained human connection equals safe.

As a therapist, I apply the same logic to any behavior: Is it serving a function? Is it replacing something essential? Is it being used by someone in a vulnerable state?

If you're using AI to get work done while maintaining real human relationships in your life, the headlines don't apply to you. If you've found yourself using AI as a primary source of emotional support because human connection feels too hard or too painful, that's worth exploring—not because AI is dangerous, but because you deserve real connection.

The question isn't "how much AI?" It's "what for?" and "instead of what?"

What To Do Next

If you recognized some warning signs in yourself, don't panic—and don't just delete your apps. That's avoidance, not resolution. Instead:

  • Examine what need the AI is meeting that humans aren't
  • Identify the barriers to human connection you might be avoiding
  • Consider whether anxiety, past hurt, or attachment patterns are making human relationships feel harder
  • Talk to a therapist if you'd like support working through this

The AI isn't the problem. It's the symptom. And symptoms are information.

Written by Adewale Ademuyiwa