Good interview with Balaji here. More in-depth than the usual interview because it mostly stays focused on AI, although it is Balaji, so there are tangents.
Strong push-back against AI doom. By this I don't mean that AI won't be very disruptive to society, but that the probability of a Terminator scenario is very small.
- Blue America wants to build a "Software FDA". They have got themselves into a position where all technological change hurts them. They are the new conservatives.
- Bitcoin at around $200K would mean 1/3 to 1/2 of billionaires are crypto billionaires.
- Regulation of nuclear power left us with nuclear weapons but not cheap electricity. The worst outcome.
- Limits on AI imposed by math and physics. Chaos theory limits forecasting ability (a small illustration follows this list). Cryptography still applies to AI.
- What are the sensors and actuators an AI would use to kill us? "You can just pour water on it" is more meaningful than it sounds. To take over, an AI either needs to hypnotize humans or it needs a fleet of drones that are not cryptographically controlled by humans.
- We will have amplified intelligence: Human + AI.
- Future political axis: Uncle Ted (Kaczynski) vs Uncle Fred (Nietzsche).
- Even if you believe caution on AI is deserved, America can't do it. America has gone from winning everywhere without fighting, to fighting everywhere without winning.
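On the chaos-theory point, here is a minimal sketch of my own (not from the interview) of why tiny measurement errors swamp long-range forecasts. The logistic map at r = 4 is a standard chaotic system; two trajectories starting 1e-10 apart become completely uncorrelated within a few dozen iterations, which is the basic reason more intelligence or compute does not buy arbitrarily long forecasts.

```python
# Sensitive dependence on initial conditions in the logistic map:
# x_{n+1} = r * x_n * (1 - x_n), chaotic at r = 4.

def logistic(x: float, r: float = 4.0) -> float:
    return r * x * (1.0 - x)

a, b = 0.3, 0.3 + 1e-10   # nearly identical starting points
for step in range(50):
    a, b = logistic(a), logistic(b)
    if step % 10 == 9:
        # The gap roughly doubles each iteration until it saturates at O(1).
        print(f"step {step + 1:2d}: |a - b| = {abs(a - b):.3e}")
```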
David Sacks with a good call here back in April. He states we are currently on step 7 with Ukraine.

One thing to keep in mind is that LLMs are surprisingly small for what they do. It currently takes a lot of computing to train one, but once trained, the weights can be passed around on a flash drive. 500GB range.
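For a rough sense of where the "500GB range" comes from: weight file size is roughly parameter count times bytes per parameter, and quantizing to 8 or 4 bits shrinks it further. A back-of-envelope sketch, using hypothetical parameter counts (not figures from the post):

```python
def weights_size_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate on-disk size of a model's weights in gigabytes."""
    return num_params * bytes_per_param / 1e9

# Illustrative parameter counts only; not claims about any specific model.
for params in (70e9, 250e9, 400e9):
    for label, width in (("fp16", 2), ("int8", 1), ("int4", 0.5)):
        print(f"{params / 1e9:.0f}B params @ {label}: "
              f"{weights_size_gb(params, width):,.0f} GB")
```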
Yes, that would be more my concern as well. I believe/hope that AIs become very cheap and thus widely distributed.
If someone tells you that censorship is benign, show them this video.
https://video.nostr.build/ed2cec286ebdd2dd07a6afeff25310defb3b85a442058911cca99a5816c96e81.mp4
Now I'm wondering what else I didn't pick up on as a kid. 👻
Schadenfreude
The probability of AGI ending us is negative.
nostr:naddr1qqxnzdesxymnsvee8qmnjdp4qgsvg03c9mjgx5qsh8adrrs2dag0rts58wvwpzdchwt5yvhuungl99grqsqqqa28c9vy9k
A post I wrote on this today. The calculations of the doomers are flawed.
I concur, really nice.
Yeah, they did a great job. Particularly with the purchasing of Sats.
When you do science, art, or mathematics, you are generating or improving on a virtual reality.