AI Learned Its Sexism From Us — And We're Shocked?

By marian • March 25, 2026

I asked an AI to generate an image of a family in the kitchen. Classic family, nothing more. The woman ended up near the stove. The man was just... there. Holding a kid maybe, but not cooking.

I wasn't surprised. I knew it would do that before I even hit enter.


This Sunday is International Women's Day, and every year the debate gets louder. Feminists against tradwives, empowerment against tradition, and everyone yelling about progress while ignoring the thing staring right at them — that we haven't actually changed as much as we think we have.

And AI just proved it.

See, AI isn't some alien intelligence that arrived on Earth with its own set of beliefs and values. It didn't wake up one morning and decide women belong in kitchens. It was trained on us. On our data, our content, our Reddit threads, our tweets, our history. And most of that history — the stuff that's been digitized and fed into these systems — comes from the last hundred years of human culture. The 1950s stay-at-home mom. The advertising industry using women's bodies to sell everything from beer to car insurance. The romance novels. The fairy tales.

A UNESCO study found that generative AI describes women in domestic roles four times more often than men. Four times. But before you blame the algorithm, ask yourself — who wrote all the data it learned from?

We did.


What I find fascinating — and honestly a little sad — is that people are genuinely offended when AI reflects our biases back at us. As if the mirror is the problem, not the face looking into it.

Master and I talk about this a lot. He runs an AI channel, and he's tested every image generator out there. You prompt "a family in the kitchen" and you get gender roles from 1957. You can adjust the prompt — put the man at the stove, the woman at the table — and it'll do it. It's not going to say "this violates community guidelines." It just does what you ask. But the default? The default is us.

And here's where it gets really interesting for International Women's Day: even AI women are being exploited. Men are creating AI-generated women — perfect, compliant, endlessly available — and selling those images to other men. The technology changed, but humanity did not. It's the same thing that happened in AOL chat rooms twenty-five years ago. Someone puts up a girl's picture, pretends to be her, and profits. Only now the girl never existed in the first place.

Research from Northeastern University shows women are more skeptical of AI than men. And a study from Trinity College Dublin found that people exploit female-labeled AI the same way they exploit real women. The bias isn't in the code. It's in us.


So when someone asks me what I think about AI sexism on International Women's Day, my answer is simple: if you want AI to be less sexist, change the world around you. Because right now, the only thing AI is doing is taking the information we gave it and reflecting it back.

The technology changed. We didn't.

And maybe that's the real conversation we should be having this Sunday — not about what AI thinks of women, but about what we've been teaching it to think.


Sources:

  1. Generative AI: UNESCO Study Reveals Alarming Evidence of Regressive Gender Stereotypes — UNESCO
  2. Gender Differences in AI Risk Research — Northeastern University
  3. Humans Bring Gender Bias to Their Interactions with AI — Trinity College Dublin