The Algorithm and Signal Ecology
A Dialogue on What We’re Optimising For

by Anne Burlinson

 

A bar. A beach bar, to be exact. On the Island of the Gods. Two men sit next to each other, indulging in the sight of the sunset from their bar stools. The glistening water. Between them: two drinks, an ocean of assumptions, and the future of human civilisation.

The old man takes a slow sip of his wine and sets the glass down with the deliberate care of someone who’s been thinking longer than most civilisations have existed.

Tell me, he says, what is your AI optimising for?

The young man doesn’t miss a beat. Efficiency. Productivity. Solving problems at scale. Making information accessible. Augmenting human capability.

Ah, the old man nods slowly. And… this efficiency is the same as wisdom?

The young man pauses mid-sip.

The bartender standing behind them polishes a glass, staring at it intently, as if at a stain that refuses to be wiped away. She pretends not to listen.

That’s NOT the question, the young man says finally.

Oh? The older man leans towards him. You’re building tools that will reshape how eight billion humans think, decide, create, and relate to one another. And you say the question of wisdom is… not the question?

His younger counterpart corrects his posture and takes another sip of his beer. His eyes go blank for a moment as his brain gathers the information it needs for a comeback. But before he can deliver it, the old man continues.

 

Human Connection in the Age of Infinite Signals.

 

Let me tell you what I see, he says, settling back into his chair. I see a species designed for something you might call resonance: the ability to sense truth through direct experience, through face-to-face encounter, through the subtle frequencies of a voice, a gesture, the quality of silence in a room.

The young man nods with interest. Sure. Human connection. We value that.

Do you? The old man’s eyebrow arches. Because from where I sit, you’ve built a civilisation where humans swim in an ocean of signals from strangers and systems they cannot see. Every human now receives more information in a day than my entire academy processed in a year.

And you call this… progress?

We call it access, the young man counters. Democratisation. Anyone can learn anything, connect with anyone, build anything…

…or be overwhelmed by everything. The old man’s voice is gentle, not combative. Tell me. In my time, we worried about the invention of writing. We thought it would destroy memory, make people passive receivers of others’ thoughts rather than active thinkers.

And you were wrong, the young man says, a bit too quickly.

Was I? The old man swirls his wine, then looks at his conversation partner kindly. Or did we simply adapt in ways that involved trade-offs we’re still too close to see? We gained storage. We lost the art of memory. We gained reach. We lost the intimacy of oral tradition. Every tool shapes the hand that holds it.

The bartender refills the old man’s glass without being asked.

Your AI, he says carefully, gives humans access to infinite signals. But here’s what I learned in the cave: more light doesn’t mean more sight. Sometimes it just means more shadows to mistake for reality.

 

Wisdom, Speed, and the Cost of Progress.

  

The young man shifts in his seat. He’s used to being the smartest person in the room, but something about this ancient, calm persistence is unsettling.

Look, he says, I get what you’re saying. But AI isn’t replacing human judgment. It’s augmenting it. His voice rises, the vein in his neck standing out. It’s helping people make better decisions faster…

With what definition of ‘better’? the old man interrupts softly.

Silence.

Faster, certainly, he continues. More efficient, absolutely. But better toward what end? The Good? Truth? Human flourishing? Or merely… more?

The old man pauses, letting the question breathe.

Every ecology has thresholds, he continues, and now his voice carries the weight of someone who’s watched civilisations rise and fall. When a system receives more input than it can meaningfully integrate, something breaks. Not catastrophically. Not all at once. But subtly.

Like what? the young man asks, genuinely curious now.

Meaning begins to smear. Attention fractures. Trust becomes volatile. People start scanning for certainty the way a drowning person scans for solid ground. The old man counts these off not with judgment, but with the sadness of recognition. They improvise new tribes, smaller and smaller, each with its own version of truth. Not because they’re failing, but because they’re adapting to an environment that exceeds their architecture.

The young man stares at his drink. His mind wanders off…

You’re describing the internet.

I’m describing what happens, the old man says carefully, when a species designed for small-scale resonance is suddenly asked to metabolise signals at a scale it was never built for.

And now you’re about to accelerate that process by orders of magnitude.

 

The Collapse of Shared Reality.

 

In the Republic, the old man says, I wrote about the ideal city. Not because I thought it could exist, but because I wanted to understand what held a civilisation together when everything else threatened to tear it apart.

The young man looks up. And?

Shared truth, the old man says simply. Not uniformity. Gods, no! My academy was full of arguments, disagreements, and passionate debates. But we shared enough common ground that we could disagree without losing the sense that we were still in the same polis, still part of the same project of understanding.

He pauses, choosing his words carefully.

What I’m observing now, in your time and with your technologies, is the fracture of that shared ground. Not because people are more foolish or divided, but because the ecology of shared signals has collapsed.

You mean echo chambers, the young man offers. Filter bubbles. We’re working on that…

I mean something deeper. The old man’s intensity sharpens. Every stable community, whether it’s a family, a sports team, a congregation, or a workplace, maintains what I think of as a shared reality field. Not groupthink. Not enforced consensus. But enough common signals that people can disagree passionately without losing the narrative of belonging.

And we’ve lost that, the young man interjects quickly.

You’re losing it, the old man corrects. Rapidly. Your humans are improvising new micro-tribes, each with its own signal lexicon, its own truth-detectors, its own sense of what’s real. This isn’t failure. This is adaptation: messy, emotional, and profoundly human adaptation to an environment where shared signals have evaporated.

He leans forward. But adaptation to what end, my boy? What kind of polis are you building?

The young man is quiet for a long moment. When he speaks, his voice has lost some of its tech-optimist sheen.

 

Toward a Healthy Signal Ecology.

 

So, what would you do? If you were building AI?

The old man smiles. The first real smile of the conversation. I wouldn’t presume to know. But I’d ask different questions. I’d want to understand what a healthy signal ecology requires. And from watching humans try, fail, and try again across 2,400 years, three things seem essential.

He counts them off on his fingers:

First: Coherence. Not less information, but better resonance. Signals that align with lived experience rather than overwhelm it. The difference between signal and noise isn’t volume, my dear boy. It’s meaning. Your algorithms optimise for engagement. What if they optimised for coherence instead?

Second: Belonging. Humans are tribal creatures. We always have been! We need communities that act as stabilising forces, places where shared narratives can form and evolve. The question isn’t whether we’ll form tribes. The question is: what kind of tribes are your systems incentivising? Ones that expand understanding, or ones that harden certainty?

Third: Meaningful friction. Not noise, mind you. Not the toxic discourse your platforms amplify. But the kind of generative tension that helps a soul, or a species, grow. Disagreement that doesn’t destroy the connection. Questions that open doors rather than slam them shut.

The old man takes another sip, lets the silence settle.

Here’s the thing about your AI, young lad. It will accelerate whatever process you’re already in. If you’re cultivating coherence, belonging, and meaningful friction in how you build these tools, AI becomes an amplifier of human wisdom.

He pauses.

If you’re already optimising for engagement, for growth, for scale over meaning… well. I suspect you know how that story ends.

 

What Future Are We Building?

 

The bar is quieter now. Other conversations have faded into the background hum. The sun… long gone.

The young man looks at the old philosopher, really looks at him. You know what’s strange? I thought you’d be more… antagonistic. Anti-technology. A Luddite.

The old man laughs, a warm, genuine sound. My dear boy. I love technology. I wrote about it extensively. But I also understood that every tool shapes the soul of its user. Writing didn’t just change how we stored information; it changed how we thought. Your AI won’t just change what humans do. It will change what humans are.

He stands, preparing to leave, but turns back one more time.

I once said the unexamined life is not worth living. But here’s what I’m asking you, and everyone building these extraordinary tools: How does one examine a life when the very ground of examination, attention, meaning, shared truth, is flooded with more signals than any human architecture was built to process?

He puts a hand on the young man’s shoulder, not unkindly.

You’re building the future. I’m asking you to ask better questions about what future you’re building. Not faster. Not more efficient. Not more profitable.

A pause.

Better. Toward what end?

With that, the old man walks out into the night, leaving the young man alone with his drink and a question that feels far heavier than it did when he walked in.

The bartender slides the bill across and nods at the young man. On it, in handwriting that looks ancient and modern at once, a single line:

The price of every innovation is paid in the currency of what makes us human…

 

References

Plato. Apology. (ca. 399 B.C.E.)

Plato. Phaedrus. (ca. 370 B.C.E.)

Plato. The Republic. (ca. 375 B.C.E.)

 

A Note from the Author:

I’m a social scientist who studies how humans navigate technological disruption. I founded mPW3R, a leadership development platform, and serve as a Founding Member of WA AI Hub, working to build Western Australia’s sovereign AI future.

My work explores what I call the “Stone-Age Mindset,” which questions how our evolutionary wiring shapes our relationship with emerging technology. I’m interested in the questions Plato would ask if he showed up in Silicon Valley: What are we optimising for? What does the Good Life look like in an algorithmic age? How do we build technology that amplifies rather than diminishes our humanity?

These aren’t abstract philosophical questions. They’re urgent, practical concerns for anyone learning to work with AI, anyone building AI tools, anyone trying to maintain their humanity in an age of infinite signals.

If you’re wrestling with these questions, too, I’d love to continue the conversation.

Contact:

Find my work at mpw3r.com or connect with me via LinkedIn.

I’m still figuring out this brave new world myself. Plato would understand.