If we get to a point where humans depend on AI for everything, following every piece of advice it provides because "it knows best," until we eventually lose our own intelligence, who's the actual robot in this situation?
The calculator didn't kill math
While that may be true, I bet if we asked the average person to solve a complex math problem by hand, most would struggle. They've come to depend on the calculator and just assume it gives them the right answer. I wonder if we'll eventually do the same when we ask AGI questions.
It's like speaking a second language: "If you don't use it, you lose it."
I think AGI is pretty unlikely. But for AI, I think the calculator is a perfect metaphor.
I see the scenario you're describing as an example of specialization, and specialization is the magic that makes market economies work.
Why should we all need to solve a complex math problem on paper if there's a more efficient method? And in the hands of people who specialize in math, the calculator has turbocharged productivity and advancement.
I see LLMs enabling the exact same kind of specialization, but in the realm of language, which will be a good thing for the species.