I'm not sure how we'd even design such a test. We try to apply the metrics of human intelligence to machines when the two simply aren't the same. A better test would probably compare a machine against other machines, and against earlier versions of itself, to measure its evolution and capability. Whether we control the machine and it meets some need of ours doesn't really matter. But if we can't control it and it pursues its own ends, that would be a pretty good indication it's intelligent by human standards (in my opinion). Maybe intelligence is a spectrum.
