Atomic Pulse

Get to Know NTI: Douglas Shaw

Dr. Douglas Shaw, who joined NTI as a senior advisor to the president in 2018, is leading an NTI study of the risks and opportunities for nuclear security arising from the convergence of artificial intelligence and other emerging technologies. Before joining NTI, Shaw had a distinguished career spanning higher education, government, and NGOs, with a focus on arms control and nuclear policy.

He spoke with NTI intern Jack Plummer about his current work and why he views NTI as “Grand Central Station for saving the world.” This interview has been edited for length and clarity. 

Jack Plummer: Thank you for meeting with me, Doug! You have a very diverse career background, and I see you have worked at many different types of organizations, including universities, government agencies, and NGOs. How did you end up working at NTI?

Doug Shaw: I’ve been a huge fan of NTI since its founding. I was enthusiastically involved in the Nunn-Lugar Cooperative Threat Reduction Program while working at the U.S. Department of Energy, and I’m a big believer in the insights that formed it and its potential to contribute very substantially to national defense and global security. So, I was very fortunate to be in conversation with Joan Rohlfing as I was winding up my role as associate provost at George Washington University. That led to an opportunity for me to be here, and I’m absolutely thrilled to be part of the organization.

JP: What is your favorite part about working at NTI?

DS: Oh, my goodness! Well, it’s the people—absolutely the people. My favorite example is when the late David Hamburg, former president of Carnegie Corporation of New York, came into my office one day, and we were talking about his leadership at Stanford and his thoughts on the creation of a technology accelerator in Silicon Valley. I had this sudden insight that this guy is playing checkers with my whole chess board as one piece! That kind of thing—seeing Charlie Curtis, Jeff Hughes, Lynn Rusten, Laura Holgate, and Corey Hinderstein, who are all giants in the field, all in one place. You hang out at NTI, and it’s kind of like Grand Central Station for saving the world. Eventually, everybody shows up.

JP: How did you end up studying these issues? Have you always been interested in technology?

DS: I’m from a Strategic Air Command/NORAD family. I was born at Havre Air Station, which is near Malmstrom Air Force Base. My dad worked for Strategic Air Command, and his dad worked for NORAD. Then I lived in Anchorage; nuclear war is very vivid there, too, because in about 1983 or 1984, FEMA issued a directive that mayors needed to have a plan for evacuating their city, and the radio announcer explained that Anchorage would have only 3 to 27 minutes to evacuate 130,000 people with only two roads going out of town. I felt like this wasn’t really working for me as an Alaskan teenager, and it convinced me there needed to be one more Alaskan in the nuclear weapon decision-making process. And then the interest in emerging technologies has accreted over time as I’ve learned more and more.

JP: Recently, the topic of artificial intelligence has gotten a lot of attention in public discourse and the media. As an expert on the topic, are there any common myths or misconceptions that you think need to be addressed about A.I.?

DS: I think my biggest frustration right now is that we have The Terminator on the brain. For example, the National Security Commission on A.I. and the Defense Department have very rightly called out the need to separate A.I. from nuclear command and control decisions. That’s important, but it’s not the whole story. We are entering a future that’s very different from the present. We’ve got leading technologists predicting what futurist Ray Kurzweil refers to as a “singularity,” defined by technological intelligence exceeding human intelligence and accelerating at an increasing rate from there, propelling us into a world that may be difficult for humans to even understand. The questions facing us are not just about what A.I. may do on its own, but also what other humans will do with technologies that may exceed ready human comprehension. And so, the Terminator isn’t the only problem, particularly since we know that multiple governments maintain nuclear arsenals in large part to deter and threaten the United States.

I think there’s an urgent need to understand how A.I. upsets that fragile balance of terror, and there’s great work that’s starting to emerge. That’s one of the things I’m really trying to understand right now—how the convergence of these technologies might disrupt nuclear deterrence, and how we might adapt our nuclear security strategy for the information age.

JP: Tell me about the study you are leading for NTI.

DS: Isabelle Williams, Patricia Jaworek, and others are working with me on a study to identify the risks and opportunities for nuclear security posed by the convergence of A.I. and other emerging technologies. This study grew out of conversations with NTI’s Science and Technology Advisory Group and Board of Directors that suggested that an exclusive focus on A.I. safety measures – such as excluding A.I. from nuclear command and control and keeping A.I. from divulging “nuclear secrets” – might be ignoring novel changes driven by the combination of multiple technologies emerging from the private sector.

This work is still ongoing, but we have preliminarily identified: a changing technological environment in which the private sector is playing a potentially disruptive new role in nuclear security; new risks to nuclear security, such as the prospect that data vulnerabilities may undermine confidence in the survivability of retaliatory nuclear forces; and opportunities, such as the application of advanced technologies to cooperative threat reduction and confidence building.

JP: Some people may be hearing about these technologies such as A.I. or quantum computing for the first time. How do you talk about the risks posed by these advanced technologies in the context of nuclear weapons to an audience that might not have a lot of technical knowledge about them?

DS: I always start from the harm side. I’ll address the scale of destruction that’s involved. I worked for a while for Physicians for Social Responsibility, and they’re famous for doing this thing they call “bomb runs,” where they send a doctor into a community to give a lecture on what would happen if a nuclear weapon were detonated there. I find that to be a great entry point to talk about the human consequences of nuclear weapons use with people who may not be experts.

In high school, I learned from a Physicians for Social Responsibility report that a one megaton airburst one mile above Chicago would create more burn victims than we have burn beds in the entire United States, and I just couldn’t get my head around that at all. We talk about deployed arsenals in the 1,500 strategic nuclear delivery vehicle range as if that makes some sense, but the use of those tools would be beyond imagination. So, when I’m talking to somebody who may not be deeply invested in this space, I like to start with: “well, we have these very dangerous weapons and as technology gets more complex, we want to make sure that complexity doesn’t cause harm that wouldn’t otherwise arise.”

JP: What makes you hopeful for the future?

DS: We’ve done this before. After the Cuban Missile Crisis, some smart people got together and built an approach that allowed us to exercise human agency over this world of technology, and we succeeded. We didn’t succeed fully, but we moved the needle and that gives me hope. I mean, the Soviets weren’t nice guys, you know, that was a tough environment! But in the midst of deadly competition and the threat of total annihilation, we found there were things we could agree on, and then we could take action on that agreement.

