Assessing and Managing the Benefits and Risks of Artificial Intelligence in Nuclear-Weapon Systems

“At a time when most countries with nuclear weapons are modernizing or diversifying their nuclear arsenals, significant technological advances in artificial intelligence (AI) for military applications suggest that AI inevitably will be explored for use in nuclear-weapon systems. Along with significant benefits, however, come associated risks and implications for strategic stability.

Two application areas are considered the most likely to take advantage of AI advances in the near term to mid term: Nuclear Command, Control, and Communications (NC3) and autonomous nuclear-weapon systems. This paper envisions the specific functions AI could perform in these two areas and analyzes the potential positive and negative consequences.

In NC3, AI could be applied to enhance reliable communication and early warning systems, to supplement decision support, or to enable automated retaliatory launch. The implications vary dramatically. Enhancing communication reliability and decision-support tools with AI has recognizable benefits, is relatively low risk, and is likely stabilizing, although it still requires additional technical research to lower risk as well as deeper policy exploration of stability implications to avoid provoking an arms race. AI application to automated retaliatory launch, however, is highly risky and should be avoided.

For autonomous nuclear-weapon systems, AI, along with sensors and other technologies, is required for sophisticated capabilities such as obstacle detection and maneuverability, automated target identification, and longer-range and loitering capability. Today’s technology and algorithms face challenges in reliably identifying objects, responding in real time, planning and controlling routes in the absence of GPS, and defending against cyberattacks. Given this lack of technological maturity, fully autonomous nuclear-weapon systems are highly risky. These risks, combined with the potential instability these weapons may cause, suggest that a ban on fully autonomous systems is warranted until the technology is better understood and proven.

For each state with nuclear weapons, the specific application and timing of AI incorporation will depend on the production or modernization schedule, the perceived benefits and needs, the technical capabilities and level of investment, and the level of risk acceptance. To encourage safe application and help minimize risks and negative effects on strategic stability as AI is introduced into nuclear-weapon systems over time, the following is recommended:

  • The U.S. national security enterprise should prioritize research on low technical risk approaches and fail-safe protocols for AI use in high-consequence applications. The research should be openly published as long as it does not jeopardize national security. Additionally, cooperative research with international partners should be considered, and other states with nuclear weapons should be encouraged to conduct research with the same purpose.
  • States with nuclear weapons should adopt policies and make declaratory statements about the role of human operators in nuclear-weapon systems and/or the prohibition or limits of AI use in their nuclear-weapon systems.
  • The international community should increase dialogue on the implications of AI use in nuclear-weapon systems, including how AI could affect strategic and crisis stability, and explore areas where international cooperation or development of international norms, standards, limitations, or bans could be beneficial.
In addition to an analysis and recommendations, this paper offers a summary of AI technology relevant to nuclear-weapon systems to provide background for those not already well versed in AI.”
