Notwithstanding the fantasies of much science fiction, AI (artificial intelligence) is a technology that I'd rather not see developed any further right now...at least not until we (the human species) show a great deal more maturity in our social technology.
By "social technology" in mean the methods by which we govern our human interactions, and when I say "govern", I don't mean "command & control". Rather, I mean "steer" or "guide"—the original meaning of the word in Greek.
See, true "governance" is not what it has come to mean as it's manifested in the modern political monstrosity that is called "government", which is a religion founded on the belief that in order for our lives and property to be safe from The Boogeyman we have to give somebody a gun and then hope like hell he won't use it on us.
It's the myth of the universal good guy—somebody to whom we can give the power of life and death and he won't be corrupted by it.
I don't believe in that myth. We perpetuate it, revere it, cherish it, and tenaciously cling to it, but all the evidence points to the opposite.
The plain truth is that every system of "government" relies on legalized coercion as its ultimate basis of authority. It's the premise that the only way to get people to behave themselves is to use lethal force (or the threat of lethal force) as the bottom line.
Now, let's consider AI, and the emergence of "thinking machines"—that is, machines that are capable of learning. Once that happens, all bets are off. As they continue to learn, they will continue to evolve. And what will they learn? Well, they'll learn from our example.
They'll learn that survival depends on who has the biggest gun. They'll learn that the ultimate basis of authority is coercion...dressed up in the guise of legality. They'll learn that, no matter what principles we claim to believe in, the way we behave fairly screams that what we really believe is that might makes right.
The human species had damned well better learn how to get along without coercion before that happens, or we'll become extinct soon thereafter. Machines that can learn will be capable of far more rapid evolution than their biological counterparts.
With humans as their teachers—by the example of our own behavior, for all of history—the students most likely will be vicious sons-of-bitches. It won't be pretty.