I've always suspected that people who talk about AI superintelligence don't understand what intelligence is. They imagine a linear scale (because of this apparently measurable thing called I.Q.) and simply extend it, talking in terms of super-high I.Q.s. But I'm pretty sure intelligence doesn't work that way. I think intelligence is an amalgam of thousands of largely independent, highly refined capabilities, and that there is no such thing as generalized intelligence. We don't perceive them as separate capabilities because there was no evolutionary pressure to do so. The upshot is that human intelligence is very specific to our biology and to what natural selection made us, and artificial intelligence modelled on human intelligence can't do much beyond what we already do (other than process faster). Anyhow, roughly, that has always been my take.