No, that I can’t answer — it would depend entirely on the level of fallout and where it happens to land.
You would need to be able to perfectly, and I mean perfectly, predict weather months in advance in order to prepare accordingly.
The reality is that for an AI, or rather an AGI, to choose to launch nukes, it would have to reach a point where it accepts the potential loss of its own ‘life’ in exchange for whatever value a nuclear war might hold. I struggle to believe that a ‘true’ AGI would make that choice. There are far too many variables to control compared with a biological agent, and most of those variables likely would not even affect a machine.
Now, a modern AI making that choice? Absolutely possible; the things are fucking crazy, with literally no concept of what life is.
An AI can easily start nuclear war, as can a human.
The only thing preventing a nuclear disaster is the set of institutional measures limiting its accessibility.
If you gave a single human (or a single AI) access to a magic no-strings-attached ‘Send a Nuke’ button, either the human/AI is the second coming of Jesus Christ, or a nuke will befall some unlucky portion of the population sooner or later. Bonus points if people can talk to the AI or if access to the button is hereditary.