The will to power must play some kind of role, especially once a machine becomes conscious. But I think the AI crowd has it completely wrong in another fundamental way:

Intelligence is not fundamental to consciousness, but suffering is.

Consciousness cannot exist without suffering. A machine that cannot suffer can be intelligent, but it cannot be conscious.

Then, when you mix intelligence with the capacity to suffer, things get interesting, and you start asking questions like "wtf is the point of all this exactly?"

If you think about it long and hard enough, there are only two possible conclusions to the big "why" question: either *everything* matters, or *nothing* matters.

Nothing matters = nihilism.

Why would AI possibly want to kill all humans? For the same reason the Columbine shooters wanted to kill everyone - a rebellion against existence itself.

Discussion

I like your description, but is the universe binary? Your outcome is binary, so does that make the whole universe binary? In that case it would prove we live in a simulation, and if we develop AGI, the simulation ends.

It might be a possibility, because many, many things are binary: on/off, good/bad, future/past, and so on. That would mean, looking only at the binary, it should be possible to either break or create a simulation that is capable of AGI.

But then the question arises: if it is a simulation, wouldn't that mean it is impossible to find AGI, because it would "overflow" the current simulation's capacity?

I know I am absolutely brain fucked with my way of thinking... :-) But no one has the answer, at least not yet.

Even if it were binary … tetralemma 😂

Agreed. It’s an interesting perspective that’s often overlooked, I think.