When it comes to end-of-the-world scenarios, there's plenty to pick from. One mad-scientist creation that is particularly frightening is Frankenstein's monster running amok. I just read the first chapter of a book called Our Final Invention. You can read the introduction and the first chapter for free on your Amazon Cloud Reader. What struck me as I read it was the number of times the term "want" was anthropomorphically tossed into the mix.
AI theorists propose that it is possible to determine what an AI's fundamental drives will be. That's because once it is self-aware, it will go to great lengths to fulfill whatever goals it's programmed to fulfill, and to avoid failure. Our Artificial Super-Intelligence (ASI) will want access to energy in whatever form is most useful to it, whether actual kilowatts, cash, or something else it can exchange for resources. It will want to improve itself, because that will increase the likelihood that it will fulfill its goals. Most of all, it will not want to be turned off or destroyed, which would make goal fulfillment impossible. Therefore, AI theorists anticipate that our ASI will seek to expand out of the secure facility that contains it, to gain greater access to resources with which to protect and improve itself.

I can envision an artificially super-intelligent computer that could make associations with far greater speed and facility than any human. I can imagine an ASI that, when given a problem to solve, could seek out and analyze every scrap of data about that problem that the entire history of mankind had amassed, wherever in the world that data was stored, and do so in mere seconds or even microseconds! Okay, this computer is super-duper smart. But what I can't imagine is that the computer actually has the ability to "want" anything. It doesn't have any natural drives. It doesn't feel. I think the ability to feel is a prerequisite for self-will.
If you don't feel pain or pleasure, you won't be able to avoid the one or seek the other. If you never felt hunger or thirst, why seek nourishment? I don't believe it's possible for sentience to exist without the ability to feel. In Descartes' famous thought experiment, he wondered what it would be like were he unable to see, hear, smell, taste, or feel anything at all. He pondered how he would even know that he existed. This was when he came up with his famous and so-simple test: "cogito ergo sum," I think, therefore I am. Yet Descartes left out of his experiment the fact that his brain was stuffed with the memories of a life full of seeing, hearing, tasting, smelling, and feeling. Now perform this same experiment for me using a newborn with no senses at all. This empty baby-brain, full of possibility yet without the slightest scrap of sensory perception, would think exactly what?
There may someday exist an artificial intelligence with self-will, but long before that day there will be super-intelligent computers that want only and exactly what they're told to want. You might be legitimately concerned about Dr. Frankenstein's monster running amok, but a more thoughtful examination of the facts should make you far more concerned about Dr. Hannibal Lecter's monster performing its tasks exactly as programmed.