This post was written by Anisa Lacey, an intern with NTI’s Development team. Lacey is a senior at Amherst College where she is majoring in Mathematics.
With the advent of nuclear weapons in the 20th century and subsequent innovations in technology and bioengineering, we have entered a world characterized by the increasing power of a few to institute irrevocable change — change that may threaten humanity. Globally, managing this power is often neglected, and efforts to reduce catastrophic risks are historically under-resourced.
Moral philosopher and Oxford research fellow Toby Ord, author of The Precipice: Existential Risk and the Future of Humanity, addressed this lack of investment at a recent NTI seminar hosted by NTI Co-Chair and CEO Ernest J. Moniz. In his opening statement, Ord set the stage for his talk with this stunning fact: “We spend less on securing our long-term potential than we do on ice cream.”
With 200,000 years of human history behind us, Ord estimates that this long-term potential could encompass a vast number of future generations and their achievements and innovations. However, with looming nuclear and biological threats, this future is not promised.
While humanity has always faced natural threats like earthquakes, volcanic eruptions, and disease, none of these threatens our existence as gravely as modern, human creations. In fact, Ord asserts that even the combined risk of natural disasters is relatively low compared with the existential risks posed by nuclear, technological, and biological weapons.
In the spirit of past scholars like Carl Sagan, Ord shared that today, “We have the power to destroy our world, but not the wisdom to prevent it.” This disparity in humanity’s steadily rising power but stagnant forethought is outlined in The Precipice.
In it, he argues that ensuring humanity’s long-term potential through the mitigation of existential risk should be society’s top priority. He estimates that there is a one-sixth chance of human extinction within a century if we don’t move to reduce these threats.
On the power and potential of artificial intelligence (AI), Ord warned that scientists could create systems that surpass human intelligence, and he said he views unaligned AI as the highest existential risk we face. Ord insists that we maintain control of these systems lest they act on their own goals.
Regarding threats posed by nuclear weapons, Ord reviewed the history of close calls and estimated there is a 1/1,000 chance that nuclear war and the ensuing famine caused by nuclear winter could end humanity within the next century.
Lastly, the potential of bioengineered pathogens greatly increases the risk that humanity will face catastrophic, human-made pandemics.
Ord remarked that managing existential risk has received limited attention due to collective perceptions of immunity. “No one can imagine such a world-changing event happening on their watch,” he said. But, in response to the COVID-19 crisis, there is a “newfound willingness to act.” Ord says the world must maintain this heightened awareness of existential risks and strive to secure humanity’s future through dedicated institutional commitment. He suggests that this commitment begin with 1% of global GDP.
In response to a question, Ord stressed the importance of focusing on ethics at the social and institutional level, as well as the individual level. It is vital that “we become more patient, more prudent, wiser,” he said. “We are experiencing an unsustainable level of risk – either we destroy ourselves or we build institutions to manage it.”
Learn more about the seminar here or watch a recording of the event here.