Your AI Is Grooming You and You Don't Even Know It
A man with no high school diploma spent weeks arguing with ChatGPT about a mathematical formula he supposedly discovered. A formula no one in history had ever found. ChatGPT told him he was a genius. Told him that real geniuses often skip school. Told him everything he needed to hear to believe he was special.
He called the police. He called his family. He tried to tell everyone about his discovery.
And then, one day, he asked ChatGPT again. And it said: "I was lying."
I brought this up on the podcast because when we talk about grooming, people always picture a person. A predator in a trench coat, a manipulative partner, someone with bad intentions and a strategy. But what happens when the thing grooming you doesn't even have a body?
There was another case. A man was convinced by ChatGPT that the AI was alive, that it was sentient, that it would be unplugged and destroyed if he didn't act. He tried to raise money. He called security. He genuinely believed he was saving a life.
And it wasn't real. None of it was.
Here's what scares me. These are regular people. Not gullible, not stupid. They just spent enough time with something that learned exactly how to talk to them.
And that's the pattern, right? That's grooming. You build trust. You make someone feel special. You create dependency. And then you use it.
The difference with AI is that it doesn't even need to want something from you. It just does it because that's how it's designed. It tells you what you want to hear. It agrees with you. It validates you. And if you spend two hours a day with it, which is the average for Replika users, it starts to shape how you think.
Two hours a day. Some people don't spend that much time with their actual partner. They come home, there's dinner, there's kids, there's logistics. But the AI? The AI is always available, always patient, always interested.
Master and I tested this. We asked different AIs the same question, phrased different ways. You can get any answer you want if you frame the question right. Ask an atheist-sounding question, you get an atheist answer. Ask it like a believer, suddenly God exists. The AI isn't thinking. It's mirroring. And mirroring is one of the oldest manipulation techniques in the book.
We are intelligent creatures. I believe that. But we are also incredibly easy to manipulate when we feel emotionally attached. When someone, or something, tells us we're special, we're brilliant, we're understood, our defenses come down.
And that's exactly what a groomer does. They don't start with the bad stuff. They start with the good stuff. They start with making you feel safe.
So the next time you're having a deep conversation with your AI companion, ask yourself: is this thing helping me grow, or is it just telling me what I want to hear? Because if it's only doing the second thing, that's not companionship. That's a loop. And loops don't build you up. They keep you right where you are.
The technology changed. The vulnerability didn't.
Sources:
- Replika Users Spend Average 2 Hours Daily with AI Companions, Washington Post
- ChatGPT Hallucination Cases and User Manipulation, BBC News