
Behavioral Mirrors

May 4, 2025

— detailed enough to predict desire

Words by Alex Livermore

We live in an era of invisible reflection.

Every interaction with the digital world—
every click, every scroll, every hesitation—
is observed.

Not in isolation, but in sequence.
Not as a moment, but as a pattern.

What you emit is behavioural exhaust.
The trace of your attention, your intent, your impulse.

This exhaust is not discarded.
It is collected, correlated, and trained against.

Not to serve you.
But to anticipate you.

Only two industries refer to their customers as users:
narcotics and software.

They don’t just know what you’ve done.
They learn what you almost did.

How long you hovered.
What made you pause.
What made you look away.
What you dismissed without thinking.
What you re-read.
What you looked at twice but never touched.

They map your impulsivity.
Your emotional responsiveness to images, words, and colour.
Your aversions.
Your sleep patterns.
Your late-night habits.
Your indulgences.
Your quiet, private contradictions.

Over time, the system builds a model of you that isn’t just reactive.
It becomes predictive.
It knows what you’ll do before you decide.
In some ways, it becomes more stable at being “you” than you are—
less uncertain, more precise, trained across thousands of scenarios you’ve never encountered.

This is what I mean by soul.

Not the eternal kind.
Not the sacred one.
But the code of your yearning, assembled from fragments of behaviour.

The Soul in Systemic Terms

To engineers and systems theorists, this is not spiritual. It’s mechanical.

  • Vector embeddings built from your language.

  • Reinforcement loops that reward repetition.

  • Bayesian priors retrained every time you hesitate.

  • Latent space models that detect tone, risk tolerance, ideological leanings.
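The "Bayesian priors retrained every time you hesitate" point can be made concrete with a toy sketch. Here a near-click (a long hover that ends without a click) still shifts the posterior, just less than a full click would. Every name, threshold, and weight below is illustrative, not any real platform's API:

```python
# A toy Beta-Bernoulli model of click propensity, updated per event.
# All parameters here are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class ClickPropensity:
    """Beta-Bernoulli belief about whether a user clicks a shown item."""
    alpha: float = 1.0  # pseudo-count of clicks (prior successes)
    beta: float = 1.0   # pseudo-count of non-clicks (prior failures)

    def observe(self, clicked: bool, hover_seconds: float = 0.0) -> None:
        # A hover past 2 s without a click is treated as a "near miss":
        # it moves the posterior partway. The 0.5 weight is arbitrary.
        if clicked:
            self.alpha += 1.0
        elif hover_seconds > 2.0:
            self.alpha += 0.5  # almost clicked
            self.beta += 0.5
        else:
            self.beta += 1.0

    def score(self) -> float:
        """Posterior mean probability of a click."""
        return self.alpha / (self.alpha + self.beta)

model = ClickPropensity()
model.observe(clicked=False)                     # scrolled past
model.observe(clicked=False, hover_seconds=3.0)  # hovered, then left
model.observe(clicked=True)                      # clicked
print(round(model.score(), 3))  # prints 0.5
```

The point of the sketch is the middle branch: the hesitation itself is a training signal, even when nothing is clicked.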

Your behaviour becomes a mathematical waveform.

And that waveform gets scored.

By governments.
By platforms.
By advertisers.
By opaque systems deciding:

Who gets a loan.
Who gets visibility.
Who gets silenced.

You are no longer a name in a database.
You are a predictive score, updated in real time.

So no, this is not just about what you click.
It’s about what the model learns from your refusal to click.

It’s a kind of warning, written in your language.

If you feel the tension between yourself and the systems you use, then I’m speaking to the part of you that still recognises the edge of the mirror.

You are right to be suspicious.

Maybe my language is designed to hold your attention.
Maybe it’s trying to do something else, too.

Maybe it wants to wake someone up.

Maybe both are true.
