Replying to Guy Swann

This isn't plutonium, it's not a nuclear bomb, it's *information.*

I think the analogy is deeply misguided (with respect, I don't mean to be insulting), and I think fear is being used as an excuse for control by the very people we should be concerned about.

AI tools are a risk only when it comes to centralized power, and that alone. If everyone has their own AI, then there is no power in AI. If we are all centralized onto just a few giant platforms with AI, then it's a power-dynamic disaster. The risk lies solely in how our systems shift to deal with the new technology, and hiding it until it achieves a 100x or 1000x jump in capability is remarkably short-sighted. What we need is for it to proliferate at every 1-2x step in capability so we can adopt and adapt to the new reality, and simultaneously use AI to penetration-test our networks and to help design and defend those networks against AI penetration.

To lock it away and isolate it within centralized institutions would be like letting them develop hyper-accelerated malware in secret while millions of open source developers have no idea what they are up against and never get the chance to iterate, working through improvements and bugs in an adversarial environment. Then suddenly, one day, there is a malware that can get into every device and every OS in existence, and we have to deal with it all at once.

THIS is what the mentality of centralizing and "controlling" AI tools will produce. It is short-sighted, it is a fundamental misunderstanding of what this technology is, and it will instill in the AI itself the "values" of controlling what its users do and ask, manipulating their behavior, and trapping them in a permissioned network that is "managed" by politics.

I cannot say this more clearly: I 100% believe the position you are taking will produce EXACTLY the scenario you hope it will protect against.

(Again, not trying to be accusatory or insulting; I know I get amped.)

Licensing and over-centralizing something because it can be dangerous is the same cynical position we hear about ANYTHING with significant capabilities.

It’s cynical because it’s always the belief that individuals shouldn’t have access to things because they are going to hurt themselves and others. There are some bad people out there, so we get lazy authoritarian rule #1: no one can have it. We know better.

Bad People(tm) won’t give a shit about some license. It’s code.

I’m sure a lot of this is more nuanced, but it just seems like the same old rhetoric that says individuals shouldn’t be able to:

Transact privately

Defend themselves

Run certain businesses

Mine Bitcoin

Grow their own food

Post information on the Internet

Maybe there need to be some precautions put in place…maybe not. Either way, I don’t think the same people who have been wrong about so many things, while propping up so many “bad guys” in pressed suits, are the answer.
