I try not to overthink AGI in the frame set out by popular culture. AGI is a software construct run on machines powered by electricity. The entire stack is finite and 100% dependent on humans to maintain.

So when I hear "the threat of AGI," knowing that AGI is purely a tool wielded by humans, it immediately translates in my head to "the threat of humans" against other humans.

If AGI needs power infrastructure, we need to own our power by using Bitcoin mining to subsidize it. If AGI needs data, we need to use open-source, encrypted, decentralized protocols to communicate. If AGI needs funding, we need to use money that it cannot steal from us. It's not a godlike bogeyman that can reduce me to living in fear. But that's exactly how it's being sold, and what people are getting duped into believing.

Discussion

Yeah, some of the potential threats of AGI are silly and some are more realistic. Likewise, some are not very bad and some, I would say, are really bad.

To put what I’m thinking about more concretely: Elon Musk’s goal for Mars is that if the ships ever stop coming from Earth, the colonists will be fine. What does that require? Self-sufficiency. Independence from Earth (and the humans here) for power, data, raw materials, etc.

So the comparison to a Martian colony should clarify, I think, that I’m not imagining a godlike magical entity at all; humans can obviously do exactly the same thing: create a fully self-sufficient hive that is separable from Earth. If Musk gets his way, that will be a reality this century.

AI will play a big part in making that happen, but if self-sufficiency ever reaches 100%, then the hypothetical, science-fiction-sounding threat becomes possible.

And that is to say nothing of one of these things becoming smart enough, in the meantime, to set up a system like what Aldous Huxley described: “The perfect dictatorship would have the appearance of a democracy but would basically be a prison without walls in which the prisoners would not even dream of escaping.”