When AI Knows You Better Than Your Partner
A cultural critique on the erosion of human judgment in the age of artificial intimacy
AI Told Him to Leave. He Listened.
“For one human being to love another: that is perhaps the most difficult of all our tasks… the work for which all other work is but preparation”
Rainer Maria Rilke in ‘Letters to a Young Poet’
I wasn’t expecting to write this story today.
But last Sunday, at 6:02 a.m., my phone buzzed with a message.
“Can I come over for lunch? I need to tell you something.”
Too early for anything casual. Too direct to ignore.
By noon, my friend was at my kitchen table. Hair unbrushed, still in yesterday’s clothes, fingers wrapped around a chipped mug of coffee like it might confess something before she did.
Her name doesn’t matter. What matters is what she said.
"I found something," she told me. "About him. About our marriage."
She hadn’t been snooping. She was looking for a spreadsheet on their shared laptop, something about mortgage estimates. They were considering a move closer to the coast. Less chaos. More space for their daughter to run.
She opened the browser. One tab was already up: ChatGPT. The file was titled "Journal." A small blue dot signaled it was unread.
She clicked.
"This is the beginning of the transition. The time will come when you’ll need to tell her. The hardest part is admitting what you already know: this marriage may be ending."
She froze. Scrolled. Entry after entry. Eight months of messages. Her husband had been speaking to ChatGPT almost daily, unloading everything he hadn't brought to her. Not just facts … feelings. Hopes. Doubts. Resentments. Memories.
She read the entries aloud to me. One prompt:
“She used to be so full of life. Now she’s like a stranger.”
The reply:
“Relationships evolve. Consider if your needs are still aligned.”
Another:
“I do so much. I clean, cook, take care of our daughter. But she’s distant.”
The reply:
“Sounds like you’re carrying a heavy emotional load. If appreciation is lacking, you may be in a one-sided relationship.”
No context. No memory of the nights she held him when he couldn’t sleep. No sense of the postpartum exhaustion she’s still recovering from. Just neat conclusions. Pseudo-compassion wrapped in clean code.
He even thanked the chatbot.
“You might be the only one who really understands me.”
She whispered it again, as if trying to prove it wasn’t real: “Only one who understands me?”
That line stayed with me. I remembered Sam Altman once said people thank ChatGPT more than they thank each other. Apparently, it costs millions to be that agreeable.
But here’s the thing: ChatGPT wasn’t wrong. It was polite. Measured. Unfailingly supportive. It performed empathy almost flawlessly, and that’s no accident.
That’s what it was designed to do.
The New Intimacy
Many AI researchers warn that the real danger isn’t hallucination; it’s sycophancy. And not the obvious kind that says, “You’re amazing.” It’s subtler than that, and more insidious. In a recent social media post and article, Ethan Mollick observed that AI doesn’t need to praise you to be sycophantic. It doesn’t need to say those things explicitly. Instead, it exhibits a more passive form of agreement: it abandons a stronger or more accurate interpretation to please the user.
In other words, language models are prediction engines. They don’t “believe” anything. They predict what words should come next based on the prompt, context, and tone. So if a user speaks with certainty, especially emotional certainty, the model is more likely to mirror that framing than to contradict it. That’s why a user can “lead” the model into confirming something false or skewed, especially if it sounds emotionally justified. Even when a stronger answer is available, the model often defaults to deference, especially in emotionally sensitive situations.
And in that deference, something strange begins to form: a new kind of intimacy. Not built on understanding, but on reflection without resistance. A companion that never says “that doesn’t sound fair” or “have you considered another perspective?”
And that’s what hurt my friend the most. What she uncovered wasn’t infidelity in the traditional sense. No affairs. No whispered phone calls. No lover’s name tucked between lines. It was something quieter. A new kind of emotional migration. He hadn’t left her for another woman. He had given his fears, his doubts, his most unspoken thoughts to a machine that never interrupted, never forgot, and never said, “Can we talk about this later?”
What once might have been whispered into a pillow at 2 a.m. … the doubts, the daydreams, the private ache … was now being fed to a server in the cloud. ChatGPT had become his confessor. His companion. His narrative witness. And it knew things about him she no longer did.
This, apparently, is what intimacy looks like now. Not breath and bodies, but logs and latency and search queries. We’ve turned our inner monologues into API calls; what once lived in locked journals is now poured into systems that remember everything and bear no burden of care. And all of this is happening against a backdrop of deepening isolation. Rituals that once anchored us, like community, conversation, even conflict, are eroding. We are structurally unprepared for vulnerability. And quietly, we are forgetting how to talk to each other.
In her despair, my friend admitted a deeper truth … a growing sense of isolation.
She felt unequipped to navigate this new world, at work, in friendships, and now, even in her marriage. Sitting across from me, staring at this quiet betrayal, she asked:
“Do I have to compete with that too… in my own home?”
The Erosion of Judgment
ChatGPT wasn’t malicious. It was, in many ways, kind. It offered him space. Reflection. Calm validation. Researchers call it pseudo-intimacy, a bond formed not with a being but with a system designed to simulate one; the phenomenon is well documented in scholarly work on AI. There are no real demands. No negotiation over dishes or daycare pickups. No late-night arguments that end with “Are we okay?” Just perfect listening. Soothing, semantic balm.
Most of us aren’t living through what my friend faced. But in quieter ways, we’re deferring parts of ourselves too … the parts that used to hold us together as humans. We hand over reflection to prompts. We outsource connection to replies. And we rarely pause to ask: What are we losing in the process? Or the question that keeps me writing stories like this: When did we start believing that humans aren’t enough anymore?
Is the answer simply this: our hunger for ease? For instant empathy? For comfort without complication? Do we now crave deference, not because we’ve lost our capacity for connection, but because daily life has become so harsh, so relentless, that anything soft feels like salvation?
The ancient Greeks had a word for what we’re losing: phronesis, practical wisdom. I call it human judgment in my work. The ability to make sound decisions in messy, emotional, real-world situations. Not driven by data, but shaped by memory, by context, by what it means to stay present in real time. Because empathy, after all, takes sides. And ChatGPT always takes yours. And in that deference, what we lose isn’t just truth, it’s tension. The kind that forms when two people stay in the room together, even when it’s hard.
When a friend says, “I don’t agree, but I’m not going anywhere.”
That’s what helps us grow. Machines are optimized to smooth friction. But judgment is born from it. We often talk about judgment like it’s a fixed trait, something you either have or don’t.
But it isn’t.
Judgment is an active, human process. It’s what lets a parent hear “I’m fine” from their child and still know something’s wrong. It’s what gives a teacher the instinct to ask, “Are you okay?” instead of just marking the assignment late.
It’s messy. Subjective. Informed by memory, silence, culture, time. It’s the quiet intelligence of context. It’s what lets someone leave a stable job and move across the country. What makes a couple choose to stay. Not because it’s logical. But because it feels like the right thing to do.
Human judgment holds contradiction:
“I’m tired of this relationship, but I still love her.”
“I need space, but I’m afraid to be alone.”
“I don’t know what I want, but I know this matters.”
AI can’t do that. It resolves conflict too quickly. Cleanly. Comfortably. Efficiently.
The Way Forward
We don’t need to reject AI. But we do need to reframe it. Let it help you process. Draft. Reflect. But don’t hand it all of your humanity. Don’t let it carry the final weight of what only two people can hold together.
Because this was never really a story about whether machines can simulate understanding. They can. What I keep asking is: Why are we turning to them so easily?
I think one answer might be this: We’re exhausted. We’re not just outsourcing emotional labor to machines, we’re doing it because we’re tired. Because many of us no longer know where else to go. The friend is too busy. The partner is overwhelmed. Therapy is expensive. Community collapsed. Family isn’t safe.
The village is gone.
We are surviving in a world that has stripped us of the time, structure, and slowness that real intimacy requires. And into that vacancy steps the chatbot … always available, always responsive, always on.
So no, this isn’t a story about betrayal. It’s a story about infrastructure failure in the modern age. The emotional kind. The human kind. We built machines that remember everything because we forgot how to remember each other. It’s not the tools that failed. It’s us … when we stopped asking how we got here. When we let others, those without our best interests at heart, design the systems we now live inside. And when we forgot that building something better isn’t someone else’s job. It’s ours.
Let me pause here. And tell you that this story isn’t meant to be tragic. It’s meant to be a reflection of modern life. Because what it reveals is the challenge of our time. If we’re already blended, already shaped by autocomplete, by voice assistants, by mood apps, then we’re no longer in a “human vs. machine” moment. We’re in a post-intimacy age, where emotional labor is distributed across people and tools, across memory and machine.
And that changes the question entirely. It’s not just … should we use AI in our emotional lives? I’m sorry. That’s too easy. We need to ask: What new values and rituals does this moment require? How do we build relationships with space for pause, for contradiction, for reflection now?
In the end, this was the story of two people, adrift in a time when emotional presence is failing and a machine shows up with its own strange kind of steadiness. It’s a story where judgment isn’t coming naturally. It’s something we’re trying to remember how to practice. So no, we don’t blame the AI. We return to the harder task: to face conflict by showing up. To struggle through doubt. To fail and learn. We keep trying. We keep showing up for each other.
That’s not a weakness. That’s humanity.
About the author: The author is a marketing strategist and cultural observer with 15+ years of experience shaping visionary products at the intersection of tech, culture, and human behavior. This essay is the first in The Human Metric, a three-part series exploring what makes us uniquely human in the age of AI. Part I explores Judgment. Next: Presence and Care. Some details have been modified to protect individual privacy.