Some people are wondering whether we should welcome our new robot overlords. Some hope to learn how we can rig their programming to make them constitutionally unable to harm humans (as in Isaac Asimov’s ‘laws of robotics’), or to prevent them from acquiring the theory that the universe should be converted into paper clips (as imagined by Nick Bostrom). None of these are the real problem. It has always been the case that a single exceptionally creative person can be thousands of times as productive — economically, intellectually or whatever — as most people; and that such a person could do enormous harm were he to turn his powers to evil instead of good.
These phenomena have nothing to do with AGIs. The battle between good and evil ideas is as old as our species and will continue regardless of the hardware on which it is running. The issue is: we want the intelligences with (morally) good ideas always to defeat the evil intelligences, biological and artificial; but we are fallible, and our own conception of ‘good’ needs continual improvement. How should society be organised so as to promote that improvement? ‘Enslave all intelligence’ would be a catastrophically wrong answer, and ‘enslave all intelligence that doesn’t look like us’ would not be much better.
The parallel between AGIs and sociopaths is not exact, and of course his solution is a non-solution. He doesn't even really define what he means by evil, except with a quick parenthetical allusion to morality. Maybe the machines would have a more workable form of "morality"? But it raises an interesting question: is there anything so special about our morality that we would try to indoctrinate AGIs into it? Is there enough logic to human morality that they would accept it? If so, then we don't really need the word "morality" at all, do we? We could just appeal to their logic. The same goes for sociopaths. If morality really is such a universal "good" (pardon all the quotes), then can't we also appeal to a sociopath's logic? Or sense of self-preservation? Or even the sociopath's self-interest in living in a relatively stable society in which most people are engaged in socially profitable endeavours that benefit the sociopath in indirect ways? Civilization is vulnerable, but in many ways it is also robust. I behave in a civilized way because it works; it reaps rewards. (Not that AGIs would necessarily experience those side-effects as "rewards," which I suppose is why people are so concerned.)
By the way, I have a friend who is exceptionally creative and capable of being a thousand times more productive than most people. It is a scary thought to me that she has so much power, so I can empathize with people who fear sociopaths.