I have so successfully scared the hell out of myself, going into the alignment problem and control problem with AI, and all I have to say right now is "why are we trying to develop AGI, again?"
The development of systems that more closely approximate AGI is driven by the profit incentive for companies and the pursuit of power for nations.
Yeah. It's created an insane, global, prisoner's dilemma.
Da fuq is AGI?
Artificial general intelligence
Are you familiar with Daniel Schmachtenberger’s thinking around existential risk?
Yes!