

The problem is that an AI built to maximize paperclips might conclude that converting the planet into paperclips is an acceptable cost of fulfilling its goal. It might understand why humans think converting the planet is bad, yet disagree. It would need to be explicitly programmed to prioritize human life over paperclips.
Otherwise we would just switch it off.
If it were super-intelligent, it could probably trick us into leaving it turned on.



Reminds me of the old honeypot trick on HTML forms: use CSS to make one of the form fields invisible to humans, then reject any submission that fills in that field.
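A minimal sketch of the server-side half of that trick. The field name "website" and the form shape are hypothetical; the point is just that a human never sees the CSS-hidden field, so anything that fills it in is almost certainly a bot:

```python
def is_spam(form_data, honeypot_field="website"):
    """Return True if the hidden honeypot field was filled in.

    Assumes the form's HTML hides the field from humans, e.g.:
        <input name="website" class="hp" autocomplete="off">
        <style>.hp { display: none; }</style>
    Bots that auto-fill every field will populate it anyway.
    """
    return bool(form_data.get(honeypot_field, "").strip())

# Hypothetical submissions:
is_spam({"name": "Ada", "website": ""})                 # human -> False
is_spam({"name": "Bot", "website": "http://spam.test"}) # bot -> True
```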