
There’s a fruit bowl in the middle of the table. Two apples rest inside it. One is sweet, crisp, packed with fiber and vitamin C. The other is hand-painted ceramic, made to look real enough that your first instinct might be to bite into it. But don’t. You’ll regret it. Still, from across the room, they’re identical. Same color, same shape, same soft shine reflecting off the skin. If all you wanted was the image of a fruit bowl, that fake apple would do just fine.
Now hold onto that thought. Because it might just explain the weird, surreal state of artificial intelligence in 2025.
We’re surrounded by simulacra. Not in a sinister, sci-fi dystopia kind of way, but in the very real, kind-of-awkward, this-might-actually-work sort of way. AI-generated art that feels like it came from a real illustrator. Deepfake videos that make you do a double take. Voice assistants that joke, empathize, even flirt, depending on the prompt. Are they real? No. Are they real enough? Maybe that’s all that matters.
Jean Baudrillard, the French philosopher who built his theory of simulation around the simulacrum, would have had a field day with ChatGPT. He wrote about how, in a media-saturated world, representations start to replace reality. They don’t just reflect it, they become it. That’s what we’re seeing with AI right now. These tools don’t truly understand the things they mimic. But they mimic them well enough to pass as the real deal in a lot of situations. And if you can’t tell the difference, does it really matter?
That ceramic apple in the bowl can’t feed you. But if you’re just decorating your dining room for a Pinterest post, you’ll reach for it every time. It never rots. It looks perfect from every angle. And that’s kind of where we are with AI.
Let’s get specific. You ask an AI to write a poem about the ocean. It gives you something moody, rhythmic, sprinkled with metaphors about tides and longing. You post it. Your friends go “Wow.” Some even ask if it’s yours. Here’s the kicker — they’re reacting emotionally to something that has no emotions. The machine didn’t stand on a cliff and feel the salt air on its face. But you felt something when you read it. That’s the simulacrum in action.
And here’s the twist: for many use cases, that’s enough. You’re not asking for a soul. You’re asking for a vibe. A stand-in. A functional replica that can do the job without needing to truly be alive. That sounds cold at first. But really, it’s just efficient.
We already live in a world built on proxies. We watch actors pretend to be people in love. We eat plant-based meat designed to trick our taste buds. We follow influencers who curate their personalities like branding decks. The boundary between real and performative blurred a long time ago. AI didn’t start that fire. It’s just dancing in the flames.
Let’s go back to the apples. If you’re hungry, you want the real one. But if you’re styling a magazine shoot? The ceramic version’s your star. It never bruises. It never wilts. And let’s face it — half the apples we buy end up as compost anyway. So, in that fruit bowl metaphor, AI is the ceramic apple. Almost indistinguishable, extremely useful, and totally hollow inside.
And we seem to be okay with that.
Because maybe the hunger we’re satisfying isn’t biological anymore. It’s cognitive. Emotional. Social. We want responses that feel human. We want interfaces that don’t glitch into awkwardness when we say something weird. And if AI can deliver that — the illusion of understanding, the performance of empathy — then we’re more than willing to engage with the fake apple.
Of course, not everything can or should be a simulacrum. If I’m being diagnosed with something serious, I want a human doctor with training, judgment, and the ability to recognize when something just feels off. If I’m in crisis, I don’t want an AI therapist parroting scripts. Some spaces need the real apple. The organic, flawed, fully conscious version.
But for other stuff? Give me the shiny ceramic one. I don’t need my travel itinerary to be handcrafted by a human. I don’t care if the help desk response came from someone with a heartbeat, as long as it fixes the thing. We’re already seeing this division happen in the way companies deploy AI. The human touch is now reserved for the edge cases, the escalations, the messy stuff. Everything else? Automated. Simulated. Fake — but functional.
That’s a huge shift. And it challenges a lot of the romantic ideas we have about authenticity. Especially in creative fields. The notion that something has to be born of struggle, of feeling, of “real” human experience to matter. But we’ve been cheating on that ideal for decades. Lip-syncing in concerts. Filters in selfies. Ghostwriters behind the scenes. AI just made the cheat code more accessible.
Is that a bad thing? I don’t know. It’s complicated.
Because even if we accept the ceramic apple, even if we love the way it looks, we sometimes still crave the taste of the real one. The randomness. The imperfection. The bite that surprises you. That’s the space where humans still shine. But maybe we don’t need to shine all the time. Maybe we let the simulacra handle the lighting setup while we focus on the spotlight moments.
What we’re witnessing is not the collapse of reality, but the outsourcing of the predictable. The delegation of pattern and form to systems that can replicate it faster, cheaper, and with less attitude. And as long as we stay aware of what we’re doing, that’s probably fine.
Just don’t try to eat the fake apple.
Because that’s where the danger lies. Not in the existence of simulacra, but in forgetting that they are simulacra. In trusting them to feed us when they’re really just for show. That’s when we get into trouble. When we mistake the illusion for the truth. When we ask AI to replace something it was never built to carry.
But as long as we remember which apple is which, and why we’re picking it, we can navigate this new world just fine.
And maybe even enjoy the view from across the room, where everything looks perfect, even if only one thing is real.