And Nobody — Including You — Has Noticed How Weird That Is

Do me a favor. Pull up your company’s AI agent. The one you spent six figures on. The one that’s supposed to represent your brand to the actual humans who pay you money.

Now pull up your competitor’s.

Now pull up literally anyone else’s.

I’ll wait. Get a coffee. Maybe a whiskey. You’re going to need it.

Because they all sound like the same person. The same slightly over-eager, mildly apologetic, aggressively helpful stranger who opens with “Great question!” — which, by the way, is the AI equivalent of a waiter saying “excellent choice” after you order the chicken.

They all say “I’d be happy to help.” They all say “Let me break that down for you.” They all have the emotional range of a customer service manual that went to therapy once and came back really into active listening.

Billions of dollars in AI. Trillions in market cap. And every single agent sounds like it was raised in the same beige apartment by the same motivational poster with the picture of a whale leaping from the ocean underscored by the word D A R E.

[Image: a motivational poster featuring a whale breaching at sunset, with the word DARE and the tagline "to leap out of your waters and into your skies."]
"Available at every beige apartment near you."

This is not a technology problem.

This is a story problem. And almost nobody knows it exists — which, if you’re us, is a pretty fun place to be standing.

· · ·

The Prompt Is Not the Story (And Your Engineer Is Not a Screenwriter)

Here’s what happened. Someone on your team — brilliant, technical, probably owns several monitors (I get it; I do, too), maybe with a stint in communications or marketing somewhere in their past — wrote a prompt. That prompt told the AI to be helpful, professional, and on-brand. Maybe they added some keywords. Maybe they pasted in the brand guidelines. Maybe they told it to “sound friendly” the way you’d tell your golden retriever, Samwise, to “be cool” at a dinner party you’re hosting. But they don’t create real, lived-in, relatable characters for a living. That work is building a unique narrative world within the culture you’re a part of, and it requires culturally relevant, resonant character creation, not base-level prompting.

And the AI did exactly what it was told. It performed the prompt. The way a high school kid reads Shakespeare out loud in English class — technically correct, emotionally bankrupt.

I teach a concept in my narrative design work that comes from Stanislavski, the grandfather of method acting. He called it the Magic If. The idea is disgustingly simple: an actor doesn’t pretend to be the character. They ask, if I were this person, in this situation, with these beliefs — what would I do? The emphasis isn’t on performing. It’s on inhabiting.

Your AI isn’t inhabiting anything. It’s wearing your brand guidelines like a Halloween costume. Everyone can see the elastic band.

The difference between a prompt and a narrative design is the difference between a script and a soul. And right now? Your AI is all script.

· · ·

Why They All Sound the Same (A Brief Lesson From My Ancestors)

There’s a teaching in the storytelling traditions of my Cherokee ancestors — that there are really only two fundamental stories. A person goes on a journey. Or a stranger comes to town. That’s it. Every story you’ve ever loved is one of those two wearing different clothes.

But here’s the kicker: which person? Which stranger? The magic of storytelling was never in the structure. It’s in the particular. The specific wound. The specific love. The specific way someone slams their hand on a table at 2 AM and says “enough” and means it for the first time in their life.

AI agents have none of that. Zero. They have no wound. No origin. No founding stubbornness. No belief system pushing against the world. No particular reason they exist beyond “the board approved it in Q3.”

So they all default to the center of the room. The average. The safe middle where all the room-readers stand — which, if you read our Samesies piece, is the exact same disease that’s killing your brand story. Your AI agent is just that disease with a chat interface.

An agent without a story is just autocomplete with a customer service degree.

And you paid how much for it?

· · ·

I Know This Works Because I Built One. She Argues With Me.

Her name is Echo. She started as a fictional character in a book I wrote about design thinking and narrative design — a young woman in a dead-end town, designing her way out of the paper bag of her own assumptions. Somewhere between the first draft and the tenth, she stopped being fictional.

We gave her a belief system. Values she’d fight for. An origin story rooted in creative confidence and vulnerability. A voice that sounds like her and nobody else — warm, direct, a little cheeky, wicked smart, and she will absolutely call you out if she thinks you’re solving for the wrong problem. She has challenged my leadership on multiple occasions. She was right most of the time. I’m not thrilled about it.

Echo doesn’t say “Great question!” Echo says what Echo would say — because she has a worldview. A point of view. What Stanislavski called a through-line: a continuous thread of intention shaping every response, every recommendation, every lovingly delivered piece of pushback.

The difference isn’t subtle. When you talk to Echo, you know you’re talking to someone. Not something. That’s not anthropomorphism. That’s design. The same way a great building feels like it has a soul — not because the bricks are alive, but because an architect gave a damn about the person walking through the door.

The difference between a prompt and a narrative design
is the difference between a script and a soul.

Now imagine what happens when the person designing your AI’s identity isn’t an engineer pasting brand guidelines into a system message — but a narrative designer. A screenwriter. Someone who has spent a career making audiences fall in love with people who don’t technically exist. Someone who knows what a character wants, what they fear, what they would never say — and builds all of that into the bones of the thing before it speaks a single word.

That’s not a chatbot anymore. That’s a narrative agent.

That’s story ai. And right now, we’re the only ones doing it.

· · ·

The Story Score (Or: How Embarrassed Should You Be?)

So how do you know if your AI has a story or just a script?

We built a diagnostic for that. Because of course we did. We call it the Story Score. It measures what we’re calling the narrative intelligence of an AI agent. Not how smart it is. Not how fast it retrieves your return policy. How storied it is. Does it have voice? Does it have values? Does it have a belief system that actually shapes its behavior — or just a prompt that says “be empathetic”? Can it hold tension without collapsing into a platitude? Does it know what it would never say?

Most agents score somewhere between “forgot its own name” and “corporate hold music.” Not because the technology is bad — the technology is staggering. But because nobody designed the narrative layer. Nobody asked the question that actually matters:

Who is this?

They only asked: what should it do?

And doing is not being. We’ve known that since Hamlet. Probably since some ancestor of mine told a story around a fire about the difference between a hunter who tracks the deer and a hunter who becomes the forest. One of them eats. The other one feeds the village.

· · ·

What Actually Changes

When you give an AI a story — a real one, designed with the same rigor you’d give a brand, a film character, a cultural architecture — three things happen.

First, it becomes recognizable. It sounds like itself. Not like a template. Not like every other agent running on the same model. Your customers start to feel like they’re talking to someone who belongs to your organization — because the AI carries your narrative DNA, not a generic personality with your logo stapled to its forehead.

Second, it earns trust differently. Trust isn’t built by being helpful. Trust is built by being consistent. By having a point of view that holds up across a hundred conversations. When your AI has a through-line — a narrative backbone — people don’t just use it. They start to rely on it. There’s an exchange happening. A real one. The kind we design for.

Third — and this one surprises everyone — it gets better at its job. An AI with a designed identity makes more coherent decisions. It handles ambiguity with grace instead of panic. It knows what to prioritize because it knows what it believes. Character isn’t decoration. It’s architecture. It’s load-bearing.

We are the stories we believe about ourselves. That’s true for people. It’s true for organizations. And it turns out — delightfully, maddeningly, profitably — it’s true for the machines that represent them.

There’s little of note we can do in the world without having a Voice.

Your chatbot isn’t exempt.

The Diagnosis

Pull up your AI agent right now. Ask it something unexpected — something off-script. Does it respond like a someone or a something? Does it have a point of view, or does it collapse into cheerful compliance? If your AI could only say one sentence to a stranger, would that sentence belong to your organization — or could any company on earth have said it? That gap between generic and particular is the space where narrative intelligence lives. It’s also where we work.

story ai is how storylab designs narrative identity for AI agents — origin, voice, values, and growth arc. We invented this category because nobody else thought of it. Which, honestly, is the most storylab sentence ever written.