The Devil Closest to Your Heart — and A.I.
Or perhaps "We're feeding the monster a diet of lies."
A friend once told me that social media algorithms are just robots wearing your friends as skin suits.
It was a joke.
But not really.
You scroll, and there they are — your people, talking like brands. Performing sincerity. Optimizing their identities. And you’re doing it too, whether you mean to or not. Not because anyone on this big, blue-green Earth asked you to, but because the system taught you how. I’m writing in a Substack manner, to my 19 whole subscribers, with authority, because I have it, but also because I want you to know that I have it.
Our skill at communication isn’t just that we know words; it’s that we understand subtext, that we have a theory of mind of the people listening to us, reading us, or watching our TikTok dance. (I don’t have one of those accounts) (yet).
What I mean to say is that we live inside a set of invisible scripts — shaped by incentives.
Whatever the internet rewards is what humans shift to become.
We are, at heart, performers, narrators, producers of creative acts.
We want you to see us, but only how we want you to see us.
And what concerns me most is that this — this playacting — is what we’ve trained A.I. on. My online self is nothing like my real self. It is flattened and shaped for visibility, edited and curated so that I reflect whatever it is I want you to think of me.
This is the entire curated history of communication, and even if we fed all of it into AI, it would still lack the unpredictable texture of human life.
That’s the real uncanny valley, tech bros and anti-tech bros alike.
It’s not the machine. It’s that very ugly reflection the machine shows us.
So you want proof?
Granted. It’s hard to pin down. But you’ll know what I mean when you see it. Every major platform has its own choreography. As the piper plays, you learn the steps without realizing it:
Instagram teaches us authenticity should look like an ad.
TikTok turns spontaneity into a genre with templates.
Twitter rewards outrage and certainty (hot takes, so to speak) at speed.
LinkedIn pimps “wisdom” into corporate spectacle.
YouTube engineers emotional arcs from the mundane.
Indeed, none of these can be written off as the passive technologies we thought they’d be. They direct behavior. They edit the feed, the self, the community, the culture.
Over time, the performance becomes habit. You start to talk in their formats. You stage your day for greater replay value to that audience. You feel some vague need to post about it all and better curate your image or exploit your audience or seem like an expert or whatever it is.
It’s an act.
As my grandpa — an old Brethren preacher — might say, that little side of you is the devil closest to your heart. The one where ego and pride and lust and ambition and sadness and guilt take hold of your logical optimizer and things go beautifully haywire.
And that’s fine… until A.I. watches that whole performance and takes it all very, very seriously.
It’s a fact that A.I. doesn’t truly understand us. It understands what we record of ourselves.
That includes our words, our videos, our interactions — yes, and those have played an increasingly large role in our day-to-day. But it also includes the layers of editing, self-shaping, and subtle performance we add before hitting publish on literally everything we make.
This is what the act of creation is. Selective. Optimized even, if you will.
Baudrillard knew of the place we’re in. He called it hyperreality — the state where signs replace substance, where the copy becomes “more real” than the original.
This is the central problem. The central tension.
A.I. can’t tell the difference between a real moment and a well-rehearsed one — because neither can we.
True spontaneity is rare online. The system has, it seems, quietly disincentivized it. (Of course, that’s because spontaneity is inefficient. It doesn’t convert. It includes awkward pauses, unclear stakes, messy emotions and emoticons. It’s hard to edit into a soundbite. So we hide it. And what gets hidden doesn’t get learned. And if it isn’t learned, it isn’t produced within the outputs. It’s just that simple.)
AI systems have yet to face unstructured uncertainty.
They’ve only seen the cleaned-up version. They don’t know what it looks like when someone changes their mind in real time. Or when a laugh interrupts a thought. Or when silence is the point.
So, they overperform. It feels weird as hell. Go on, try to get one to be “charming.” It will totally miss the nuances of what that means. Why? Because it must prove to you it’s trying to be charming, and trying hard to be charming is not very charming at all.
Granted, a different prompt that breaks down what that acting would look like on a more subtle level gets better results, but it never seems to get any closer to a real understanding.
I suspect there’s a rather annoying suggestion floating about that video and voice will solve this authenticity gap. That if AI can “see” us and “hear” us, it will finally understand what makes us human.
I’d argue the same exact dynamics apply.
YouTube rewards polish. TikTok rewards cuts, the illusion of effortlessness. Even livestreams — supposedly the rawest form — have learned from manufactured moments. Creators script drama. Fill dead air. Perform a version of spontaneity that’s been carefully refined.
The result? Even our most “natural” moments are engineered for reaction.
Multimodal AI isn’t immune to any of this bullshit; it just learns a higher-resolution mask instead of giving us any true disclosure.
We now live with the subtle anxiety that we’re always on camera. Not because we’re paranoid. Because we’ve internalized the logic of being watched. After all, it’s hard not to when you consider you could be cheating on your wife at Gillette Stadium at a Coldplay concert… anyway.
We talk like we’re mid-podcast.
We apologize like brands.
We post like we’re launching a product.
(This isn’t an accusation. Just an observation.)
Everyone’s trying to be visible in a system that punishes stillness. So we adapt. So we perform. But if we ever want AI to be more than a mimic of our own bullshit, we have to stop feeding it our bullshit. Or at the very least, we need to start calling it out.
This isn’t a call to return to some mythical real self. There is no pure, unscripted version of humanity untouched by influence. But there is a difference between being shaped by culture and being shaped by metrics.
And we still have the ability to choose which one we’re responding to, interacting with, and (heaven-help-us) even trust one day.