Book review of 'If Anyone Builds It, Everyone Dies': why AI doom isn't as visceral as nuclear war

In If Anyone Builds It, Everyone Dies, Eliezer Yudkowsky, founder of the Machine Intelligence Research Institute (MIRI), and Nate Soares, its president, argue that superintelligent AI will drive humanity to extinction. Just as humans used their intelligence to dominate all other forms of life, they contend, a superintelligent AI will surpass and dominate humans. Once dominant, it will likely operate with an alien set of preferences and values, and human survival won't matter to its goals.
