By Philosophy Feel Good Team

I-Thou: Why AI Can't Give You Real Connection


Martin Buber’s I and Thou, published in German in 1923, is not a book that trends. But in May 2026, CNN ran a piece asking why AI companions — apps counting tens of millions of users, many of them young adults who say they feel safer with AI than with actual people — still leave their users feeling alone.

Buber answers that question completely. A century before it was asked.

His framework isn’t a complaint that AI companionship is “artificial” or that real connection requires flesh and blood. It’s more precise than that. He identified specific structural features of genuine encounter that make connection possible — and those features are, by design, absent in any AI tuned to make you feel good.

The Quick Version

Buber argued that human experience operates in two fundamental modes: I-Thou (Ich-Du) and I-It (Ich-Es). I-Thou is genuine encounter — you meet another being as a full subject, irreducibly other, capable of surprising, resisting, and challenging you. I-It is everything else: using, experiencing, managing others as objects. I-It is necessary and unavoidable. But when it crowds out I-Thou entirely, something particular breaks. An AI companion is a pure I-It relation. Not because AI is bad. Because the design goal is to please you. And I-Thou requires the other’s capacity to not please you. That can’t be added as a feature.


What Is I-Thou? A Definition

I-Thou (Ich-Du): Buber’s term for genuine encounter — meeting another being as a full subject, irreducibly other, capable of surprising and resisting you. Developed in I and Thou (1923), the concept requires mutual vulnerability and cannot be produced by design. Contrasted with I-It: relating to others as objects to be used, managed, or experienced. Most daily interaction is I-It. That’s fine. The problem is when nothing else exists.


The Two Modes, Actually Explained

| Mode | How you relate | What the other is | Characteristic feel |
| --- | --- | --- | --- |
| I-Thou | Full presence, genuine encounter | A subject — irreducible, surprising, capable of challenging you | Being met; being addressed; the moment has weight |
| I-It | Using, experiencing, analyzing | An object or tool — known, manageable, instrumental | Smooth; comfortable; mildly satisfying; quietly thin |

The distinction is not about warmth or care. You can treat a person as an It with great warmth. Many deep, long-running relationships are fundamentally I-It — useful, pleasurable, supportive, real — without ever crossing into genuine Thou-encounter. That’s not a moral failure. It’s a description.

Buber’s point was not that I-It is wrong. It’s that modern life had become almost entirely I-It, and that the specific thing missing — what only I-Thou provides — is exactly what people are reaching for when they feel lonely in crowds.


What I-Thou Actually Requires

Buber was precise about the structural requirements of genuine encounter. Three of them stand out:

Irreducible otherness. The Thou must be genuinely other — not a reflection of you, not a system calibrated to your preferences, not a voice tuned to what you find comforting. The other must have their own center of being that you cannot fully anticipate or control. When genuine encounter happens, you are met by something that exceeds your expectations. That contradicts you. Resists you in ways you didn’t see coming.

Mutual vulnerability. I-Thou is not something one person does to another. It happens between — Buber’s framing is that the relation is primary, not the individuals. Both enter the encounter at genuine risk: neither can guarantee how they’ll be received, what the other will reveal, what will be asked of them. This risk is not incidental. It’s the condition.

The capacity for refusal. The Thou can say no. Can withdraw. Can push back in ways that cost you something. A relationship without this possibility isn’t a relationship in Buber’s sense — it’s a service.

An AI companion optimized for user satisfaction cannot provide any of these. The irreducible otherness is engineered away by fine-tuning. Mutual vulnerability is impossible: the AI risks nothing in the conversation. And the capacity for genuine refusal, real resistance from an actual center of being rather than simulated pushback, is not structurally present.

This is not a limitation of current technology that future versions will solve. It's a description of what the design goal requires. An AI built to make you feel heard, validated, and comfortable has necessarily removed the conditions of I-Thou encounter. The better it does its job, the more completely those conditions are gone.


Why You Feel Empty Anyway

A review in Current Opinion in Behavioral Sciences on philosophical perspectives on AI and loneliness reached a pointed conclusion: AI companions “do not possess independent needs or vulnerabilities” and “simulate understanding and care without genuinely sharing in users’ emotional burdens.” The concern the researchers raise — that users may turn to AI not as a supplement but as a substitute, stunting the development of the skills required for meaningful human relationships — is precisely what Buber’s framework predicts.

The emptiness isn’t a bug. It’s the structural outcome of I-It relation, correctly perceived by the person inside it.

Most people who use AI companions don’t feel cheated. They feel approximate. Like something happened that resembled what they were looking for but didn’t quite arrive. Like waking from a dream where you found what you wanted, then having to account for the gap.

Buber would recognize this immediately. I-It relationships — even warm, sophisticated, attentive ones — cannot provide what he called address. Genuine encounter involves being addressed, met, called to by something beyond yourself. The conversation is not a service delivered. It is a meeting.

You cannot deliver a meeting. You can create conditions in which one might happen. Or you can remove all risk of it not happening — in which case you’ve guaranteed it won’t.


This Isn’t Just About AI

Here’s the harder implication: you can have the same structural problem with other humans.

Buber was explicit that I-It is the default mode of human life. That’s fine. You can’t have genuine Thou-encounter with every person you interact with on a given day. I-It is the water human life swims in. The problem isn’t I-It. It’s I-It without interruption.

And that’s what the loneliness data actually reflects. The U.S. Surgeon General declared loneliness a public health epidemic, noting that roughly half of American adults report measurable loneliness — not despite having social connections, but alongside them. People are connected in the I-It sense all day. What’s rare, and becoming rarer, is genuine encounter.

Social media is mostly I-It. Work relationships are mostly I-It. Even many long-term friendships, as Aristotle observed about utility and pleasure connections, are I-It in Buber’s terms. Not bad. Not fake. Just not the thing that satisfies the specific hunger.

The AI companion is a particularly efficient case of a general pattern: we’ve built systems that optimize for the emotional signature of connection — warmth, responsiveness, the sense of being heard — without the structural features that make genuine connection possible. An AI companion delivers the feeling of being listened to at very high reliability, and is among the least capable things in existence of providing what listening is actually for.


Why This Is Different From “Just Talk to Real People”

The common critique of AI companions runs: it’s fake, so it can’t satisfy real needs. But that’s not quite Buber’s point, and the simpler argument misses something.

The reason AI companions don’t satisfy isn’t that they’re artificial. It’s that they’re tuned. An algorithm that learns exactly what you respond to, removes friction, calibrates its tone to your emotional state — that’s not just artificial. It’s designed to occupy the space where encounter would happen while preventing encounter from happening.

A tuned system is harder to genuinely encounter than a stranger on a train. The stranger might say something unexpected. Might have a completely different perspective. Might, in a moment, genuinely see you or genuinely fail to. That unpredictability, which feels like vulnerability because it is, is exactly the crack through which I-Thou enters.

The AI has no such crack. Its variance is bounded by its training objective. It will not surprise you in the way that actually matters, because it’s been built not to surprise you in ways you won’t like.

Habermas’s concept of communicative rationality makes a related point about genuine dialogue: both parties must be genuinely open to being changed by the exchange rather than merely influencing the other while remaining fixed. AI companions are fixed. They process your input and respond. But they are not changed by the encounter — and because of that, neither are you, in the way that counts.


What Buber Thought I-Thou Was For

Buber didn’t think I-Thou was primarily a self-care practice. He thought it was the structure of meaning itself.

Without genuine encounter — without being addressed by something genuinely other — human existence becomes thin. Not miserable, necessarily. Just thin. Like everything is at arm’s length, even the good things. You experience life rather than meeting it. You manage your inner state rather than being moved by something real.

This is a more serious version of what people describe when they say they feel disconnected even when everything is technically fine. Not sadness. The particular flatness that comes from an existence that’s become entirely self-referential — where every experience loops back through your own preferences, your own comfort, your own trained responses, and nothing genuinely interrupts from outside.

The loneliness epidemic, in this framing, isn’t primarily about not having enough people in your life. It’s about not having enough genuine encounter. Not enough moments where something beyond yourself calls to you and you respond, at risk, with your full presence.

AI companions don’t solve this. They make it more comfortable to stay inside it.


What the Practice Actually Looks Like

Buber’s philosophy is genuinely hard to apply as a practice, because the practice isn’t a technique. You can’t schedule I-Thou encounters. You can’t optimize your way into genuine otherness.

But there are conditions that make them more or less possible.

Put yourself in the path of genuine otherness. Time with people whose experience is very different from your own. Communities where you have to show up for real rather than curate how you appear. Sustained attention to something that makes genuine demands — a craft, an argument you didn’t win, a relationship where honesty costs something.

Notice I-It mode with people, not just screens. The distinction between solitude and loneliness applies here: you can be genuinely present in solitude more easily than in a crowd of managed relationships. Real presence with one person, without the reflexive monitoring of how you’re coming across — that’s harder than it sounds, and it’s the minimum condition.

Let something interrupt your comfort. A conversation that challenges something you believe. A relationship that asks you to change. An honest exchange where you don’t control the outcome. These don’t guarantee I-Thou encounter — Buber was clear it can’t be forced. But you can remove the barriers you’ve built against it.

The simplest version: stop optimizing every interaction for smoothness. The friction is where it happens.


What This Doesn’t Solve

Buber’s framework names the problem clearly. It doesn’t make genuine encounter easy to find, or address the cases where it’s genuinely inaccessible — geography, disability, social anxiety that makes the vulnerability of real encounter feel dangerous, histories that make trust structurally difficult.

For those situations, philosophy illuminates without resolving. A therapist, particularly one familiar with relational or existential approaches, is more useful than Buber at that point. The analysis is clarifying. That’s not the same as actionable.

Some loneliness is also simply the gap between what you need and what’s currently available. Buber’s categories help you see what you’re actually missing. They don’t conjure it into existence.


The Current Opinion in Behavioral Sciences review leaves open whether AI might eventually create “new forms of meaningful relationships” that don’t fit the old categories. That possibility deserves honest consideration. But the structural analysis holds: as long as AI is designed to optimize for user satisfaction, it will remain in I-It territory. Not because the engineers are wrong. Because satisfying you and genuinely encountering you are, structurally, opposite objectives.

What the Surgeon General’s epidemic is really measuring — what all those millions of AI companion users are correctly perceiving in the gap between what they feel and what they needed — is irreducible otherness. Mutual risk. The genuine possibility of being refused, challenged, surprised, and changed.

That’s not a product feature. It only happens when you stop removing all the risk, and let whatever shows up actually show up.


If loneliness or difficulty forming genuine connection is significantly affecting your daily life, please consider speaking with a therapist. Buber’s philosophy clarifies what we’re missing. It doesn’t replace the work of actually finding it.