Constant algorithmic improvements have empirically reverse-engineered the human psyche.

I suspect that explicit neuroscience research hasn't caught up to the insights about how to induce behavioral dependence that are embodied in these systems.

The user experience of most platforms now mirrors maladaptive, behavior-maintaining effects you could *only* achieve with the most addictive drugs up to about a decade ago.

We need to avoid the moral panic, but it's impossible to overstate how novel this is for our brains.

One thing we know from behavioral addiction research (my old field) is that the brain is plastic.

When you induce one category of addiction, it changes the motivational substrate of the brain in sticky ways.

And cross-sensitizes / potentiates other forms of addiction and behavioral dependence.

This will only accelerate & become less scrutable with improvements in AI.

We are in the earliest, earliest days of trying to understand what this means for the next decades of human life.

Painting: The Opium Den, Edward Burra, 1933


Discussion

Domestication

AI makes it easier for the algorithms to permeate beyond the web browser too. Algorithms will become more pervasive than ever and harder to recognise. Maybe with bitcoin and nostr the incentives can be reversed, so builders are building interfaces to enhance human intention rather than reverse-engineering and manipulating it to serve the highest bidder.

Right on. So many of our contacts with technology today are with systems that are built from a place of disrespect & disdain for autonomy & human agency.

I think it's inevitable that tools that serve users will outcompete those that don't. And the algorithmic spell will be broken! Hopefully... 🪄

This post lines up well with chapter 3 of the book I'm currently reading, "21 Lessons for the 21st Century" by Yuval Noah Harari.

Here's an excerpt...