Mike Brock
Unfashionable.

I don’t have a solution. But I can recognize the culture war is important, even if I don’t have a strategy for playing in that arena. 🤷‍♂️

I just don’t think we’re headed towards a “great awakening”, either. I don’t think a rapid revolution is upon us. Nor do I think it would be desirable, as a catastrophic collapse would probably … be extremely dangerous to the entire world.

The more I think about the need for “freedom tech” in the world — a term I first heard from #[0]​ — the more I think the open internet is simply not something anybody should be taking for granted. Sure, China’s Great Firewall has leaks, and VPNs are a thing. But it’s an insane cat-and-mouse game, and the risks of detection are very high.

I propose that if we really wanted to have the power of mass organization in civil society to limit the ability of states to overreach, then we should really be looking at technologies like BLE to create open mesh protocols that make open communication possible, even in situations where the internet is completely locked down.
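To make the idea concrete, here’s a minimal, purely illustrative sketch of the kind of store-and-forward flooding such a mesh could use (Python; Node, link, and receive are hypothetical stand-ins, not a real BLE stack):

```python
# Illustrative sketch of mesh flooding: each node remembers message IDs it has
# already seen and relays new messages to every neighbor in "radio range", so
# payloads spread hop by hop with no internet backbone or central relay.
import hashlib


class Node:
    def __init__(self, name):
        self.name = name
        self.neighbors = []   # nodes currently in range (hypothetical links)
        self.seen = set()     # message IDs already relayed, for deduplication
        self.inbox = []       # payloads delivered to this node

    def link(self, other):
        # Symmetric "in range" link between two nodes.
        self.neighbors.append(other)
        other.neighbors.append(self)

    def receive(self, payload: bytes):
        msg_id = hashlib.sha256(payload).hexdigest()
        if msg_id in self.seen:
            return            # already relayed; stop the flood here
        self.seen.add(msg_id)
        self.inbox.append(payload)
        for neighbor in self.neighbors:
            neighbor.receive(payload)   # naive flood to everyone in range


# A chain a-b-c-d still delivers end to end, node to node.
a, b, c, d = Node("a"), Node("b"), Node("c"), Node("d")
a.link(b); b.link(c); c.link(d)
a.receive(b"hello from the mesh")
print([n.name for n in (a, b, c, d) if n.inbox])  # ['a', 'b', 'c', 'd']
```

A real protocol would need radio scheduling, TTLs, and encryption, but the core point stands: deduplicated flooding lets messages route around any single choke point.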

I find the culture war as stupid as the next person does. However, I do think that some people who decide to excuse themselves from the mainstream and operate only in heterodox communities make the mistake of thinking the mainstream is unimportant.

One thing I’ll say about the culture warriors — in particular the more intelligent ones like, say, Jordan Peterson — is that they’re actually not *wrong* to recognize the importance of mainstream culture and the way it pulls along our economy and politics.

It’s hard to accept this if you see the intensity of unreason, and the absurdity of the pure emotion in a lot of it. But it’s a real thing that is going to affect all of our lives, whether we like it or not. Which I think makes the culture war actually extremely important, even for those of us who ignore it for our own sanity.

Unfortunately, I think that’s exactly right. Also unfortunately, a lot of really smart people who should know better believe stoking this rage is morally virtuous. Even though the anger may be justified and understandable, it may also be helping to bring us closer to the brink of existential threat.

This has actually always been true. Although, as the world becomes more complicated due to our technological inventions, the uncertainty envelope is actually widening, commensurate with the possibility space that these scientific and technological achievements unlock.

Dan Gardner wrote a good book on this years ago, called Future Babble, which was actually a pretty important book for me, and was somewhat formative in teaching me not to speak with certainty about the future.

I think trying to make predictions about what will happen in the future along cultural, political, economic, or biological lines in the Age of AI is a fool's errand. My basic assumption is everyone is wrong about everything on the 5-10 year time horizon.

No. Our natural state is to trust in-groups and distrust out-groups.

It's a well-understood phenomenon in sociology around group dynamics, and in political science. It's also well understood by autocrats and demagogues, who weaponize the fear of external threats to distract from their own corruption and tyranny.

Yeah. History is not encouraging, here. There seems to be one antidote to declining societal trust: a shared, external threat. Possibly among the most disturbing double-edged elements of human nature. 9/11 was the last time we saw something like that happen. And that social trust was hijacked by neoconservatives to launch the Iraq War and create the forever wars.

It seems to me the crisis of trust in society today, which I anticipate getting worse due to the misinformation-amplifying potential of generative AI, is creating increasingly fertile ground for doom-mongering soothsayers to demagogue their way into being seen as credible voices with credible answers.