Replying to Lyn Alden

When it comes to AI, philosophical people often ask "What will happen to people if they lack work? Will they find it hard to find meaning in such a world of abundance?"

But there is a darker side to the question, which people intuit more than they say aloud.

In all prior technological history, new technologies changed the nature of human work but did not displace the need for human work. The fearful rightly ask: what happens if we build robots, utterly servile, that can outperform the majority of humans at most tasks and at lower cost? Suppose they displace 70% or 80% of human labor so thoroughly that 70% or 80% of humans cannot find any type of economic work in which they outcompete those bots.

Now, the way I see it, it's a lot harder to replace humans than most expect. Datacenter AI is not the same as mobile AI; it will take a couple more decades of Moore's law to put a datacenter supercomputer into a low-energy local robot, and until then a robot would rely on a sketchy, limited-bandwidth connection to a datacenter. Robots also require extensive physical design and programming, which is harder than VC bros tend to suppose. And humans are largely self-repairing, which would be a rather fantastic trait for a robot. A single human cell outcompetes all current human technology in terms of complexity. People massively over-index on what robots will be capable of within a given timeframe, in my view. We're nowhere near human-level robots for all tasks, even as we're close to them for some tasks.

But, the concept is close enough to be on our radar. We can envision it in a lifetime rather than in fantasy or far-off science fiction.

So back to my prior point, the darker side of the question is to ask how humans will treat other humans if they don't need them for anything. All of our empathetic instincts were developed in a world where we needed each other; needed our tribe. And the difference between the 20% most capable and 20% least capable in a tribe wasn't that huge.

But imagine our technology makes the economic contributions of the bottom 20% irrelevant. And then the next 20%. And then the next 20%, slowly moving up the spectrum.

What people fear, often subconsciously rather than being able to articulate the full idea, is that humanity will reach a point where robots can replace most people in any economic sense; such people can do nothing that economically outcompetes a bot, and can earn an income only through charity.

And specifically, they wonder what happens during that phase to those who own capital versus those who rely on their labor within their lifetimes. Scarce capital remains valuable for a period of time, so long as it can be held legally or otherwise, while labor becomes demonetized within that period. And as time progresses, weak holders of capital, those who spend more than they earn, also diminish for lack of labor income, and many imperfect forms of capital lose value. It might even be the case that those who own the robots are themselves economically unnecessary, but at least they own the codes that control them.

Thus, people ultimately fear extinction, or being collected into non-economic open-air prisons and given diminishing scraps, resulting in a slow extinction. And they fear it not from the robots themselves, but from the minority of humans who wield the robots.

If history is any guide, the first phase will be similar to the industrial revolution: technology will complement certain types of workers, making them more productive, while replacing others entirely. But if AI advances to the point where even high-skill labor is no longer necessary, we reach a stage unprecedented in human history: a world where ownership of automation is the sole determinant of economic power.

Imagine a scenario where AI-driven corporations, controlled by a small group of capital holders, optimize every aspect of production, logistics, and service industries. Governments, pressured by economic efficiency, privatize social services, making access to resources contingent on corporate governance rather than state policies. In this world, the traditional idea of employment vanishes for most. Instead of wages, former workers survive on universal basic income or corporate stipends, tied not to productivity but to compliance with the systems owned by the elite.

History suggests that once a class of people is economically unnecessary, they become politically vulnerable. The landed aristocracies of the past had use for peasants as laborers, but what happens when even the illusion of economic necessity disappears? In previous centuries, displaced workers could riot, revolt, or demand redistribution, but in a world governed by automated systems and AI-controlled security, resistance itself could become obsolete.

The darkest outcome isn't violent suppression but a slow, passive neglect: the emergence of a "post-labor caste" that, lacking any economic leverage, is maintained at a subsistence level only as long as the ruling class finds it convenient. Perhaps they are given digital entertainment, AI companions, and just enough resources to avoid rebellion, but they remain permanently outside the sphere of influence, their fate determined entirely by those who own and control automation. Think of animals in a world dominated by humans…

It's evolution and survival of the fittest again.
