Finite Minds in Expanding Systems


Foreword

Douglas Adams once wrote in The Salmon of Doubt (2002):

I’ve come up with a set of rules that describe our reactions to technologies:

  1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
  2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
  3. Anything invented after you’re thirty-five is against the natural order of things.

On AI Agents, Anxiety, and the Pace of Change

Recently, I have frequently encountered discussions about AI agent tools such as OpenClaw. People talk about building digital employees to boost productivity, or deploying autonomous systems to run high-frequency micro-arbitrage on platforms like Polymarket.

At times, I find myself wondering whether I am slowly falling behind the pace of this era.

The essay Steam, Steel, and Infinite Minds offered me inspiration. It reminded me that technological expansion has always reshaped human capability. Yet inspiration does not eliminate unease. Understanding a trend intellectually does not remove the emotional lag that follows it.

Am I falling behind?

Intellectually, I understand what is happening. Technological progress compounds. Tools abstract effort. Systems build upon systems. But emotionally, something does not move at the same speed.

Perhaps this is not a technological problem.
Perhaps it is a cognitive one.

1. Exponential Minds in Compressed Time

Human cognition evolved in environments where change was gradual and local. Our intuitions are calibrated for linear progression.

Walk faster and arrive sooner.
Work more and produce more.
Learn steadily and improve steadily.

Exponential systems violate these intuitions.

In exponential growth, early stages appear insignificant. Middle stages surprise us. Later stages overwhelm us. The difficulty is not that we cannot calculate exponential curves. The difficulty is that we cannot internalize them emotionally.

What feels destabilizing today is not only exponential growth, but temporal compression.

Technological waves once unfolded across decades. Steam, steel, electricity, the internet. Each transformation allowed time for society to metabolize change.

Now cycles feel measured in months, sometimes weeks.

A model is released.
Tools are built upon it.
Agents are constructed from those tools.
Autonomous systems begin interacting with markets.

By the time one layer is understood, another is already operational.

The anxiety does not arise from ignorance.
It arises from velocity.

Our brains expect incremental change.
Our environment delivers compounding acceleration within compressed time.

Emotional dissonance emerges when cognitive adaptation lags behind systemic iteration.

2. Identity Friction in the Age of Cognitive Automation

On an emotional level, part of me still operates within a world model in which human effort is the primary unit of production.

Even if I understand AI agents intellectually, my deeper assumptions were formed in a reality where value creation was inseparable from human labor.

When digital employees begin competing in efficiency, scale, and even decision making, that older mental model begins to wobble.

The unease is not necessarily fear of replacement. It is the destabilization of previously reliable assumptions about where value originates.

Historically, automation displaced physical labor first. Today it moves into symbolic and intellectual domains, areas closely tied to identity and self-worth.

When tools perform tasks once intertwined with personal competence, the boundary between assistance and substitution becomes psychologically ambiguous. Even if no displacement occurs, the internal hierarchy shifts.

Human beings are sensitive to hierarchy. We anchor identity to competence within a given era.

This tension is not fundamentally about age. It concerns identity stability.

When the rate of external change exceeds the rate at which identity reorganizes itself, friction emerges. That friction is often interpreted as falling behind. It may simply be the nervous system resisting rapid structural reconfiguration.

Identity friction, more than technological ignorance, may be the true source of anxiety.

3. Visibility, Urgency, and the Myth of Falling Behind

Another distortion operates beneath this anxiety. It is visibility bias.

We are constantly exposed to frontier level innovation. Researchers release new models. Traders deploy autonomous systems. Founders automate entire workflows. These examples dominate headlines and conversations.

Visibility, however, is not universality.

Most of society operates within a slower equilibrium. Institutions, industries, and individuals adapt gradually. The majority of economic and social structures do not reorganize at a weekly cadence.

Media and networks amplify frontier behavior. The human mind mistakes amplification for representativeness.

What is rare appears common.
What is experimental appears inevitable.
What is emerging appears mandatory.

Perceived urgency may therefore exceed actual necessity.

When we say that we are falling behind, we often mean that we are not positioned at the frontier. Yet the frontier has always moved faster than the average individual. That has not changed.

What has changed is our proximity to it.

We observe the cutting edge in real time. We witness iteration cycles that were once invisible. We compare ourselves not to our peers, but to pioneers.

The gap between builders and observers becomes psychologically inflated. That inflation is misinterpreted as personal inadequacy.

Perhaps the healthier question is not, “Am I behind?”

A more grounded question might be, “What level of engagement aligns with my goals and values?”

Not everyone needs to build the frontier. Not everyone needs to speculate on it. Not everyone needs to operate at maximum technological frequency.

Adaptation does not require total immersion. It requires recalibration. It requires updating mental models while preserving core identity.

Tools evolve. Cognitive structures endure.

Anxiety may arise not from technological insufficiency, but from confusing visibility with obligation.

4. Finite Minds in Expanding Systems

There is a quiet paradox at the center of this moment.

We are finite biological minds living within systems that expand, iterate, and amplify at digital speed.

Acceleration is external.
Assimilation is internal.

Assimilation takes time.

Technological systems scale without fatigue. Human cognition reorganizes slowly. Expansion unfolds publicly. Adaptation unfolds privately.

From close range, acceleration feels overwhelming and destabilizing. From a wider perspective, technological revolutions are phases within a longer continuity.

Steam did not erase craftsmanship.
The internet did not erase authorship.
Artificial intelligence will not erase thought.

Forms will change. Processes will compress. Patterns will automate. Yet meaning, the act of interpreting and orienting oneself within change, remains human.

The discomfort we feel may not signal inadequacy. It may represent the natural friction between finite minds and expanding systems.

Not incompetence, but latency.
Not failure, but integration time.

Douglas Adams once suggested that our reactions to technology reveal more about our position in time than about the technology itself. Perhaps that is the most stable perspective available to us.

We are not necessarily behind.

We are simply located somewhere along the curve.


Written in Chiba, Japan
February 2026
Song Wang