
WHY CONVERSATION AND EMPATHY ARE CRUCIAL IN THE AGE OF AI

January 5, 2026
Fredrik Scheja

Dialogue as a Lifeline: From Bomb Games to Elder Care

Late at night, in a dimly lit room, two friends hunch over a digital bomb-defusing game.
“Cut the red wire!” one shouts.
“Which red wire? There are two!” the other gasps, sweat on his forehead.

The game Keep Talking and Nobody Explodes demands exactly what its title suggests—keep talking, or it all blows up. One player sees a ticking bomb with cryptic modules but has no manual; the other holds the instructions but can’t see the bomb. Their only chance is to describe, listen actively, and think together. Misunderstandings or silence mean disaster.

This nerve-wracking exercise is a perfect metaphor for today’s IT projects: in complex development teams, each member holds a piece of the puzzle, but no one sees the full picture from the start. Persistent dialogue and coordinated listening become the lifeline that keeps projects alive when deadlines loom and risks flash red. Like in the game, teams avoid “explosions” by constantly adjusting course through conversation and feedback—and trust grows from this interplay. When someone says, “I need help, what does the manual say?” that moment of open communication transforms stress into progress.

From the game “Keep Talking and Nobody Explodes”

Beyond Gaming: The Power of Conversation

The impact of dialogue extends far beyond the gaming world. Last spring, we helped a municipality in northern Sweden introduce an AI-based conversational partner in elder care. The goal was to see if a digital “companion” could improve safety for seniors and ease the burden on staff.

The results surprised everyone: the platform became a double value engine. For the elderly, AI created a sense of presence and companionship—a warm voice available even when staff couldn’t be there, reducing loneliness and increasing everyday security. For caregivers, it became a tool to quickly gauge each person’s well-being and history; if a door didn’t open in the morning, they knew something might be wrong, and the AI could even provide clues about what happened overnight.

In both roles, it wasn’t technical perfection that mattered—it was the ongoing dialogue that built trust, calm, and real utility. Empathy was embedded in the technology: the digital colleague “knew” personal details (asking about the cat, reminding about medication) and could alert staff if something seemed off. By mimicking human conversation, AI built bridges of trust. True progress depended on our ability to engage in genuine dialogue—with both people and machines—and to navigate uncertainty together.
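
The pattern behind that double value is simple enough to sketch. Below is a minimal, hypothetical Python illustration of the two behaviours described above: a morning check-in that uses personal details (the cat, the medication) and a rule that flags the resident to staff if no interaction has been registered by a given hour. Every name, threshold, and the notify_staff hook are assumptions for illustration only, not the municipality’s actual platform.

```python
from datetime import datetime, time

# Hypothetical profile for one resident; in a real system this would
# come from the care platform's records, not a hard-coded dict.
PROFILE = {
    "name": "Astrid",
    "cat": "Misse",
    "medication": "blood pressure tablet",
    "medication_time": time(8, 0),
}

CHECK_IN_DEADLINE = time(10, 0)  # assumed threshold: no contact by 10:00 raises a flag


def morning_greeting(profile: dict) -> str:
    """Compose a personal check-in message using known details."""
    return (
        f"Good morning {profile['name']}! How is {profile['cat']} today? "
        f"Remember your {profile['medication']} at "
        f"{profile['medication_time'].strftime('%H:%M')}."
    )


def needs_staff_alert(last_interaction: datetime | None, now: datetime) -> bool:
    """Flag the resident if no interaction has been registered by the deadline."""
    if now.time() < CHECK_IN_DEADLINE:
        return False  # too early in the day to worry
    return last_interaction is None or last_interaction.date() < now.date()


def notify_staff(profile: dict, reason: str) -> None:
    """Placeholder for the real alert channel (pager, app, phone call)."""
    print(f"ALERT for {profile['name']}: {reason}")


if __name__ == "__main__":
    now = datetime(2025, 11, 3, 10, 15)
    last_seen = datetime(2025, 11, 2, 19, 40)  # last reply came yesterday evening

    print(morning_greeting(PROFILE))
    if needs_staff_alert(last_seen, now):
        notify_staff(PROFILE, "no response to the morning check-in")
```

The point of the sketch is the shape of the dialogue: the same profile that makes the greeting feel personal is what gives staff an early signal when the conversation goes quiet.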

AI-generated illustration

Threshold Spaces: Creativity in Times of Uncertainty

An early morning in northern Sweden. A home-care worker steps into a silent stairwell. Two apartment doors remain closed; no sounds, no movement. The light is dim, the air still. It’s a space between worlds—between night’s solitude and day’s care, between worry and reassurance. Here, in this empty stairwell, a liminal space emerges.

Liminal spaces are transitional zones—physical or mental—where the old has been left behind but the new has yet to take shape. The word “liminal” comes from the Latin limen, meaning threshold. We stand on the edge: no longer in the past, not yet anchored in the future. And in this strange in-between, something magical can happen.

In home care, such moments are charged with empathy, attention, and presence. These thresholds invite creativity and transformation. The gaming world has long explored liminality. In the classic Myst, players awaken among misty islands and deserted libraries—no clear instructions, no immediate threats, just an enigmatic silence. This ambiguity sparks curiosity: without fixed rules, you’re free to experiment, solve puzzles in unexpected ways, and think beyond the obvious. Liminality becomes a creative catalyst—stepping outside your comfort zone opens new mental pathways.

Workplaces experience similar phases during transitions. Imagine a development team leaving behind an old method but not yet mastering the new. It feels messy: “Are we doing this right? Who decides now?” These in-between times can feel frustrating—like waiting in an airport lounge between flights—but they also offer fertile ground for innovation.

The game “Myst”

Research and experience show that uncertainty breeds creativity—if it is managed well. When no one claims to have all the answers, bold ideas surface. When routines dissolve, experimentation becomes the norm. Organizational psychology describes liminality, at its best, as a “fertile chaos zone.” It takes courage and guidance to harness it. Wise leaders mark transitions (celebrating the end of an old system before launching the new) and encourage an exploratory mindset.

When Data and Intuition Collide – Lessons from Game Development

In today’s development world, a silent struggle is underway: the clash between data and intuition. In game development, data-driven methods have exploded—everything is measured, tested, and optimized. The analyst’s voice is now as influential as the creative director’s. This brings superpowers: spotting flaws, testing design alternatives, and choosing the objectively best option.

But there’s a flip side. Many creators report a new kind of frustration: “Are the numbers taking over our vision?” One designer described it as the soul “bleeding” when a beloved idea gets rejected because it doesn’t fit the metrics. Meanwhile, analysts grow exasperated with gut-driven colleagues who ignore clear user data.

This tension is just as visible in system and business development. KPIs and dashboards steer decisions. But what happens to craftsmanship and creativity? A developer might know a feature will delight users—yet the numbers say otherwise. Do we dare stand our ground?

AI-generated illustration

Managing this requires emotional resilience. Teams must take data seriously—without taking it personally. Like an author enduring an editor’s critique, developers must accept when data exposes flaws. But we can’t become data fundamentalists. Passion and meaning can never be fully quantified.


Finding Balance

The best game teams seat analysts and creatives side by side. Numbers become decision support—not decision makers. Behind every data point are real people with subjective experiences. When data contradicts an idea, address the frustration openly:
“What do you feel we’re missing if we only follow the numbers?”
And the creative can ask:
“Can we measure my intuition somehow?”

Once again, dialogue becomes the bridge—connecting feeling and logic.
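
One concrete way to answer that last question is to write the intuition down as a testable hypothesis next to the metric the team agrees to judge it by. The sketch below is a hypothetical illustration of that practice; every field name and threshold is an assumption, not a tool referenced in this article.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class DesignHypothesis:
    """An intuition written down so that data can speak to it later."""
    idea: str              # the gut feeling, in the designer's own words
    expected_effect: str   # what should change if the intuition is right
    metric: str            # the number the team agrees to watch
    threshold: float       # the level that would count as support
    review_date: date      # when the team revisits the conversation
    notes: list[str] = field(default_factory=list)

    def record(self, observed: float) -> str:
        """Compare the agreed metric against the agreed threshold."""
        supported = observed >= self.threshold
        verdict = "supported" if supported else "not supported (yet)"
        self.notes.append(f"{self.metric} = {observed:.2f} -> {verdict}")
        return verdict


# Example: a designer's hunch, framed so analysts and creatives share one page.
hunch = DesignHypothesis(
    idea="Players will love the slower, atmospheric intro",
    expected_effect="More players finish the first chapter",
    metric="chapter_1_completion_rate",
    threshold=0.65,
    review_date=date(2026, 3, 1),
)

print(hunch.record(observed=0.71))  # -> "supported"
```

Nothing in that structure decides anything for the team. It simply keeps the feeling and the number in the same conversation, which is what “decision support, not decision maker” looks like in practice.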


AI – A New Colleague in a Constant In-Between

We stand at the threshold of a paradigm shift: AI as a creative collaborator. Generative models appear as idea generators, coding assistants, and conversational partners. Integrating AI feels like hiring an alien colleague—brilliant in some ways, baffling in others.

Organizations now live in a permanent liminal phase—oscillating between human and machine. The boundary between what is “our” work and what is “theirs” (the AI’s) is constantly moving. It’s exciting—but mentally demanding.

Questions pile up:

  • How do you lead a team where one member is an AI?
  • How do you maintain safety and direction when AI generates unexpected ideas?
  • Can AI boost empathy during uncertain phases—as an emotional thermometer?
  • What happens to creativity when humans and machines co-create?

These questions have no fixed answers. We’re in terra incognita. AI doesn’t change what makes collaboration successful—it amplifies the need for it. Clear goals, strong communication, and trust matter more than ever.

AI acts like a magnifying glass—intensifying team dynamics. Good communication smooths integration and helps navigate friction. Poor communication breeds confusion and resistance. High trust and psychological safety give courage to challenge AI; low trust sparks frustration.

Think of AI integration as a vast liminal experiment. We stand between eras—and must shape collaboration together.


Empathy and Dialogue – Our Compass in the New Era

Ultimately, success doesn’t hinge on the perfect algorithm—it depends on our human ability to create mutual understanding, with both people and machines.

Empathy is our most vital compass. In development teams, it means sensing colleagues’ concerns, noticing when morale dips, and understanding user frustration. Empathetic communication turns stress into solidarity—and strengthens both bonds and performance.

In human–AI collaboration, empathy takes new forms. We need empathy for the user—AI executes tasks, but we design experiences. We also need empathy toward AI: understanding its limitations and “thinking style.” Treat AI like a new intern—talented but inexperienced. Give clear instructions, be patient, take responsibility.
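
What “clear instructions” can look like is concrete enough to sketch. The function below is a hypothetical illustration of briefing a language model the way you would brief a new intern: state the goal, give context, set constraints, and invite clarifying questions before the work starts. It is not tied to any particular AI product, and how the briefing is sent depends on the tool you use.

```python
def build_briefing(goal: str, context: str, constraints: list[str]) -> str:
    """Assemble an intern-style briefing for an AI assistant."""
    constraint_lines = "\n".join(f"- {c}" for c in constraints)
    return (
        f"Goal: {goal}\n\n"
        f"Context: {context}\n\n"
        f"Constraints:\n{constraint_lines}\n\n"
        "Before you start: ask me any clarifying questions, "
        "and tell me which assumptions you are making."
    )


briefing = build_briefing(
    goal="Draft release notes for the 2.4 update of our home-care app",
    context="Readers are care staff with little time; the update adds "
            "a night-time check-in summary.",
    constraints=[
        "Plain language, no internal jargon",
        "Under 150 words",
        "Flag anything you are unsure about instead of guessing",
    ],
)
print(briefing)
```

The last line of the briefing matters most: explicitly inviting questions and surfacing assumptions is what turns an instruction into a dialogue, with an AI just as with a human colleague.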

Empathy brings calm and direction when the world feels uncertain. When pace is high and change relentless, pausing to check in with each other is priceless.

The most innovative organizations of the future will unite high technology with high humanity. They’ll recognize that every AI, every data strategy, every new method ultimately exists to serve people—and must be shaped by human values.


🔥 Conclusion: Keep Talking

Innovation doesn’t ignite spontaneously—just as fire doesn’t light itself. It needs fuel, warmth, and air.

  • Fuel: Skilled people with diverse perspectives.
  • Warmth: Respect, listening, and engagement—empathy.
  • Air: Space for dissent and experimentation in safe transitional phases.

The spark? It often strikes when two minds meet from entirely different starting points—and choose to collaborate.

So: Keep talking. Keep listening, sensing, and sharing. In a world where AI and algorithms join our teams, embracing dialogue and in-between states is critical. Fill uncertainty with curious questions and shared insights. Then we’ll be ready to defuse complex problems, ignite new ideas, and guide each other through every foggy corridor.

Conversation and empathy are the bridges that carry us across the threshold into tomorrow’s world.

Reference list – for those who want to research the subject more deeply:

  1. Automating Liminality in Foresight Practice
    Journal of Futures Studies, 2025
    Explores how AI and generative “hallucination” can create liminal states of thought for innovation and futures work.
    https://jfsdigital.org/2025-2/vol-29-no-4-june-2025/automating-liminality-in-foresight-practice/
  2. Navigating the Human-AI Divide: Boundary Work in Conversational AI
    ScienceDirect, 2025
    Discusses how dialogue and reflective collaboration blur the boundaries between human and artificial intelligence in development projects.
  3. Empathy Toward AI Versus Human Experiences
    Palo Alto University, 2025
    Large-scale empirical study on how empathy is established and influenced in conversations between humans and AI under different conditions.
  4. Empathetic Conversational Agents: Utilizing Neural and Physiological Signals
    Tandfonline, 2025
    Research on multimodal, emotionally aware conversational AI and their effect on deeper engagement and emotional response.
  5. A Data-Driven Analysis of Prompting vs. Prompt Engineering
    LinkedIn Pulse, 2025
    Differentiates between prompting as intuitive dialogue and prompt engineering as command—analysis of what truly creates value.
  6. Caregiving Artificial Intelligence Chatbot for Older Adults
    JMIR, 2025
    Study of AI chatbots and their impact on safety, closeness, and information flow in eldercare—relevant for conversational support.
  7. Effects of Interaction Modalities and Emotional States on User’s Perceived Empathy with an LLM-Based Embodied Conversational Agent
    ScienceDirect, 2025
    Analyzes how advanced conversational AI can create perceived empathy and support for vulnerable user groups.
  8. Emotions in Game Data Work
    Myöhänen, T. et al., New Media & Society, 2025
    Examines how emotions and identity are affected when data and algorithms drive game development—relevant for understanding emotional data in digital innovation projects.

About the author

Consultant | Sweden
Testing is what we do to ease our curiosity while we develop the things we love. I create models that enable faster understanding of complex matters, for better judgments on our journey towards authenticity.
