Are we losing touch with the value of being flawed?
As machines grow better at simulating emotion, are we losing touch with the value of being flawed and human?
KEY POINTS
- Simulated empathy risks dulling our tolerance for human complexity.
- Real empathy is messy, not scalable—and that’s what makes it meaningful.
- Emotional presence can’t be engineered; it must be practiced and protected.
Artificial intelligence is advancing at a remarkable pace—especially in its ability to simulate human emotion. But as the output created by machines becomes more convincing, we face a deeper danger: not that AI is becoming more human, but that humans may become less so.
The Comfort of Synthetic Compassion
Unlike humans, AI doesn’t feel. Yet it increasingly acts as if it does, responding to prompts with empathy-coded language, soothing tones, and even scripted grief. These responses are clean, predictable, and emotionally gratifying. And that’s precisely the problem.
Real empathy is rarely convenient. It’s messy, imperfect, and sometimes uncomfortable. As I’ve explored in my book TRANSCEND, empathy isn’t a static trait—it’s a practice of shared vulnerability. And like any deep human capacity, it must be exercised, not engineered.
When we begin to accept AI’s emotional mimicry as “close enough,” we dull our tolerance for human complexity. We risk trading emotional presence for emotional performance.
From Emotional Outsourcing to Emotional Infantilization
AI therapists never interrupt. Digital assistants don’t ask for reciprocity. Bots never get tired. These frictionless interactions feel emotionally safe. And yet they may be quietly reshaping us without our awareness or consent.
When we become too accustomed to simulated empathy, we forget how to offer it ourselves. We start expecting perfection in others, losing patience with the all-too-human qualities of ambiguity, fatigue, or contradiction. This isn’t just a behavioral shift—it’s what I call a slide into becoming emotionally post-human: efficient, reactive, and disconnected from the emotional labor that empathy requires.
Empathy Isn’t Scalable, and That’s the Point
In our optimization-obsessed world, empathy is being rebranded as something scalable. But emotional intelligence is not a feature that can be replicated and rolled out. It’s something that emerges from relationships. It doesn’t scale. It doesn’t streamline. It thrives in tension, imperfection, and presence.
As I argue in Everything Connects, impactful systems—technological, biological, and human—are interconnected and regenerative rather than extractive. Empathy is no different. It must be renewed, revisited, and relearned through real encounters rather than reduced to behavioral scripts or predictive analytics.
AI may be able to simulate empathy. But only humans can truly sit with another person’s suffering.
As Vietnamese Zen Buddhist monk Thich Nhat Hanh wrote, “Empathy is the capacity to understand the suffering of another person.”
This view reminds us that empathy is not performance—it is presence. And while machines may mirror our emotional expressions, they can’t experience the mutual vulnerability from which true compassion arises.
A Call to Rewild Our Emotional Lives
We don’t need to “protect” empathy as a fragile resource. Instead, we need to rewild it. Let it be awkward. Let it be slow. Let it be painful. The beauty of empathy is that it doesn’t work on command. It requires effort, friction, and a kind of emotional courage that machines cannot offer and algorithms cannot teach.
We can begin by:
- Resisting emotional optimization. Not every conversation should be made efficient or easy.
- Creating friction-rich spaces. Sometimes silence says more than sentiment analysis.
- Teaching empathy as a conscious discipline. Build it into how we lead, teach, and relate.
- Maintaining emotional sovereignty. Know when AI is assisting—and when it’s replacing—your ability to connect.
It’s Our Choice
In medieval alchemy, the philosopher’s stone promised transformation. Today, AI offers something similar: the ability to transcend limitations. But unlike the alchemists of the past, our challenge is not to escape physical boundaries—it is to transcend emotional disconnection.
The true danger isn’t that AI will become more human. It’s that humans may become more machine-like: emotionally flat, socially reactive, and disconnected from the messy brilliance of authentic empathy.
Empathy isn’t a feeling we can automate. It’s a choice we must keep making. And in the age of AI, that choice may be the most human act of all.
A version of this article appeared in Psychology Today.