
Andy Weber was the U.S. Assistant Secretary of Defense for Nuclear, Chemical & Biological Defense Programs from 2009 to 2014. He’s now a senior fellow at the Council on Strategic Risks. You might also know him from his appearance on the 80,000 Hours Podcast. Ask him anything![1]

He’ll try to answer some questions on Friday, September 29 (afternoon, Eastern Time), and might get to some earlier. 

I (Lizka) am particularly excited that Andy can share his experience in nuclear (and other kinds of) threat reduction given that it is Petrov Day today.

Instructions and practical notes: 

  • Please post your questions as comments on this post. 
  • Posting questions earlier is better than later. 
  • If you have multiple questions, it might be better to post them separately. 
  • Feel free to upvote questions that others have posted, as it might help prioritize questions later. 

Other context and topics that might be especially interesting to talk about:

For those who want to explore more: The Dead Hand by David Hoffman might be interesting; it covers Project Sapphire and some of the work against biological threats.

  1. He might not get to some questions, or be unable to answer some.

Comments

Do you think research into game theory has increased, decreased, or had no effect on the risks from nuclear weapons? Should this tell us anything about the value of research into the theoretical basis of conflict in the future?

Game theory is one somewhat useful tool for trying to assess and potentially mitigate risks. With nuclear weapons, the data on use and near misses are extremely limited. Untested theories can be helpful, but we can’t rely on them too much because the stakes of getting it wrong are so high.

If, in the next 15 years, there is a human caused biological global catastrophe (say, kills >1% of global population), what credence would you give that artificial intelligence was somehow involved?

AI is increasing the BW threat in at least two ways. It is expanding “recipe” access to more players, like the internet did. Over the last thirty years there have been terrorist groups with the intent to deploy BW, but they were either interdicted or not very capable. AI will expand access to capability. The second concern is that sophisticated actors will use AI-enabled bioengineering to make enhanced pathogens. I’m pleased that responsible AI companies are working feverishly to put in place guardrails to mitigate both of these risks. To answer your specific question, I would not be at all surprised if in 15 years a global biological catastrophe is AI-enabled.

ASB

Because you've been a public servant who took on the responsibility of shutting down the Soviet bioweapons program, securing loose nuclear material, and kickstarting a wildly successful early career program while at the DoD, I need to know: is it ever difficult being so awesome?

And, what would your advice be for younger folks aiming to follow in your footsteps?

ASB, maybe I’ll try answering both questions at once, because the first one is too ridiculous to answer directly without being snotty. While some of my work during a thirty-year public service career has received attention, it is very important to understand that anything we achieved was the result of teamwork. Indeed, one of the most satisfying aspects of public service is that it is a team sport. So my advice is: don’t have personal ambition, but rather focus on mission ambition. Together we can accomplish big things to make the world a safer place. One of the aspects I love about the EA community is how supportive and kind people are to one another.

Should the US start mass-producing hazmat suits, so that, in the event of an engineered pandemic, the spread of the disease could be limited while still maintaining critical infrastructure and delivery of basic necessities?

Physical protection works, so this would be our best defense until medical countermeasures are developed and distributed. We need better and cheaper masks and suits, and they should be widely available in a crisis.

How seriously are national security people taking the threat of AI today (in particular, extinction risk)? Can we expect meaningful action to create a kill switch soon?

Lately it is quite high on the national security agenda. The upcoming UK summit demonstrates the importance some leaders attach to it. A lot of this resides in the private sector, so governments will have to work in close partnership with private stakeholders to take meaningful action. I don’t know enough to answer your specific “kill switch” question.

Very happy to see you doing this, and hope you're doing well.

Question: What is your view on catastrophic risks from communicable versus noncommunicable biological or chemical threats - for example, biotoxins, anthrax, or chemical weapons, as opposed to possible communicable disease bioweapons like smallpox? Specifically, do you see a justification for considering [noncommunicable threats] global catastrophic risks? [Edited to clarify.]

(I'm interested in hearing your views on lots of topics, but since I'm not going to ask fifty questions here, I wanted to pick something I think you may disagree with the "EA consensus" about.)

David, great to hear from you, and I look forward to your other forty-nine questions the next time we meet in person. Communicable biological weapons represent an existential or omnicidal risk. Non-communicable biological weapons could also be catastrophic. In my opinion several million dead is catastrophic, even if it is not existential. Thankfully, much of what we can do to prevent the worst case will also reduce the lesser included case. Also, some non-communicable BW agents, like antibiotic- and vaccine-resistant anthrax, are more probable. So if there is an “EA consensus” to ignore toxins and anthrax, I would disagree.

"Specifically, do you see a justification for considering the former global catastrophic risks?"

Which is the "former", communicable or non-communicable?

Thanks, fixed!

Do you think that raising life scientists' awareness about the potential dual-use risks of their work is net-positive, because they can mitigate those risks, or net-negative, because they will draw the attention of bad actors?

Definitely net-positive. It is actually shocking how little of this is included in the education of life scientists. We teach bioethics but rarely biosecurity.

Are there any misconceptions, stereotypes, or tropes that you commonly see in academic literature around nuclear security or biosecurity that you could correct given your perspective inside government?

Some of our amazing former Council on Strategic Risks Ending Bioweapons Fellows wrote this outstanding paper debunking common misconceptions about biological weapons: https://councilonstrategicrisks.org/wp-content/uploads/2020/12/Common-Misconceptions-About-Biological-Weapons_BRIEFER-12_2020_12_7.pdf

Here's a summary of the report from Claude-1 if someone's looking for an 'abstract':

There are several common misconceptions about biological weapons that contribute to underestimating the threat they pose. These include seeing them as strategically irrational, not tactically useful, and too risky for countries to pursue.

In reality, biological weapons have served strategic goals for countries in the past like deterrence and intimidation. Their use could also provide tactical advantages in conflicts.

Countries have historically taken on substantial risks in pursuing risky weapons programs when they believe the strategic benefits outweigh the costs. Accidents and blowback would not necessarily deter programs.

Decisions around biological weapons activities are not always top-down and known to all national leaders. Bureaucratic and individual interests can influence programs apart from formal policy.

International norms and laws alone are insufficient to deter or discover clandestine biological weapons work given lack of verification. COVID has shown existing vulnerabilities.

Dispelling these misconceptions is important for strengthening defenses against the real biological weapons threat, which the pandemic has shown remains serious despite decades of effort. More investment is needed.

What are you more concerned about in the biological weapons space: states, terrorist groups, or lone wolves? Why (if you can share the information)?

I’m deeply concerned about all of these. Thankfully, only a handful of countries are actively pursuing biological weapons. That said, the few countries that have offensive BW programs are very dangerous. Given the expanding access to knowledge and BW capabilities, I also worry a lot about terrorist groups and lone wolves. They represent a very difficult intelligence and law enforcement challenge.

After what happened in Iraq, do you think the USA is likely to take unilateral action against nuclear/chemical/biological threats if they emerge in the near future? If not, then what might their approach be to such threats?

Unilateral action should be the last resort. The Iraqi BW program was successfully destroyed by 1996, and it was never reconstituted. Rolf Ekeus wrote a very good book on UNSCOM’s successful efforts. The only place today that such unilateral action would even be considered is Syria’s rump chemical weapons program. The only other countries that have biological and chemical weapons also have nuclear weapons, so unilateral action would not be considered unless it were part of a larger direct conflict.
Our approach should be to do everything we can to strengthen the norm against developing and using biological and chemical weapons. Thankfully very few countries pursue these prohibited weapons. I am also a strong believer in deterrence by denial, and the Council on Strategic Risks has written about this. We and our Allies and partners should have a visible, greatly expanded biodefense effort to deter bio attacks and deny our adversaries the mass casualty effects of such weapons. The U.S. Department of Defense spends less than 1/5 of one percent of its budget on chemical and biological defense. This needs to change. The recent Biodefense Posture Review is a good step in the right direction.

The only other countries that have biological and chemical weapons also have nuclear weapons, so unilateral action would not be considered unless it were part of a larger direct conflict.

This seems like a strong incentive for countries like Syria to try to acquire nuclear weapons.

Syria tried and failed to develop nuclear weapons. In 2007 the Israeli Defense Forces destroyed a nascent nuclear reactor under construction. Any renewed attempt would face a similar fate.

The prospect of a nuclear conflict is so terrifying I sometimes think we should be willing to pay almost any price to prevent such a possibility. 

But when I think of withdrawing support for Ukraine or Taiwan to reduce the likelihood of nuclear war, that doesn't seem right either -- as it'd signal that we could be threatened into any concession if nuclear threats were sufficiently credible.

How would you suggest policymakers navigate such terrible tradeoffs?

That’s part of the job - there are few easy policy decisions. I would give NATO and the Biden Administration high marks for lowering the risk of nuclear war AND supporting Ukraine and Taiwan. When Putin and his minions were making reckless and dangerous nuclear threats, we were calm and did not change our nuclear posture. This approach seems to be working.

How much do you think the risk of nuclear war would increase over the century if Iran acquired nuclear weapons? And what measures, if any, do you think are appropriate to attempt to prevent this or other examples of nuclear proliferation?

Joel, I’m highly confident Iran will not acquire a nuclear weapon. The U.S. and Israel have exquisite intelligence on the Iranian nuclear program, which has been a high priority for decades. Should Supreme Leader Ali Khamenei change his policy and pursue a nuclear weapon, we would know.

During my time in government I was involved in convincing the Israeli government not to launch a military strike against Iranian nuclear facilities. I was also involved in developing the military capabilities needed if Iran did opt for nuclear weapons, and these capabilities are now mature. The Iranian leadership understands well that pursuit of nuclear weapons would provoke an Israeli military strike.

The above said, Trump’s decision to pull the U.S. out of the JCPOA to pursue a better deal was a stupid and reckless failure. Undoing that damage should be a high priority for our diplomats.

Do you see a realistic prospect for any state with nuclear weapons giving them up? 

Either unilaterally or as part of some kind of agreement?

There are a few examples of states giving up nuclear weapons or the pursuit of them. South Africa gave them up. Sweden, Argentina, and Brazil gave up seeking them. I do believe that we can make serious steps towards the vision of a world without nuclear weapons. It will take determination, energy, and creativity to get us there. That is one reason I am so excited that so many young leaders have shown a renewed interest in nuclear weapons issues.

This answer seems very diplomatically phrased, and also compatible with many different probabilities for a question like: "in the next 10 years, will any nuclear-capable states (wikipedia list to save some people a search) cease to be so?"

Hindsight is 20/20, but do you think it was a net good for Ukraine to give up its nukes? I know it didn't have the C2 capabilities to actually use them at the time, was economically kind of strongarmed into it, and, all else equal, I know it's better if fewer countries have them; but maybe keeping them would have prevented the current war, which has significant escalation potential.

Removing nuclear weapons from Kazakhstan, Belarus, and Ukraine was an extremely important success. Had Ukraine tried to retain nuclear weapons, I believe an armed conflict with Russia would have broken out in the 1990’s.
There are many other things that could have been done to prevent Russia’s unprovoked, illegal attack on Ukraine. Ukraine keeping nuclear weapons is not one of them.

Had Ukraine tried to retain nuclear weapons, I believe an armed conflict with Russia would have broken out in the 1990’s.

Could you explain your reasoning here? My impression was that nukes generally make war less likely (through deterrence) but more costly if it occurs.

Every specific case is different. I’m not sure that we can generalize and state that nuclear weapons make war less likely. That said, I am a supporter of stable deterrence for the U.S. and its Allies. In my view so-called tactical nuclear weapons weaken deterrence. Do we really want Putin to think that use of a “small” nuclear weapon would be met with a proportionate nuclear response?

"There are many other things that could have been done to prevent Russia’s unprovoked, illegal attack on Ukraine. Ukraine keeping nuclear weapons is not one of them."

  • Could you explain your thinking more for those not familiar with the military strategy involved? What about having nuclear weapons makes an invasion more viable? Which specific alternatives would be more useful in preventing the attacks and why?

See my comments above on Iran. A tougher response to Putin’s attack on Georgia in 2008 and the illegal occupation of Crimea and Eastern Ukraine in 2014 might have prevented Putin’s terrible decision to invade in 2022. We could have provided more military assistance and training to Ukraine after 2014. Perhaps we should have been more receptive to Ukraine and Georgia’s NATO membership aspirations in the late 1990’s and early 2000’s.

What role should international organizations and treaties play in regulating emerging biotechnologies to prevent their misuse for bioweapons development?

How can we strike a balance between scientific research and security concerns in the field of biotechnology to prevent the accidental or deliberate creation of bioweapons?

  1. International organizations and treaties have a vital role in preventing biological weapons development. We need to redouble our efforts to strengthen the BWC. There also needs to be stronger global governance to prevent accidents and misuse. Kazakhstan President Tokayev has proposed establishing an International Biosafety Agency. This and other similar concepts to strengthen biosecurity should be actively promoted.

  2. We definitely need to do more on the security side of this equation.

Context: I'm hoping to learn lessons in nuclear security that are transferable to AI safety and biosecurity. 

Question: Would you have any case studies or advice to share on how regulatory capture and lobbying was mitigated in US nuclear security regulations and enforcement?

Do you have any recommendations for doing threat modeling well? In particular, resources that seem applicable to many different risks - nuclear, AI, bio.

Do we need more people at the big AI companies to be like Petrov? (e.g. disobey orders to commence/continue a dangerous training run).

What metrics should be used to evaluate the success of arms control agreements in effectively monitoring and limiting the deployment of tactical nuclear weapons?

In what ways do regional security dynamics influence the decisions of nations to acquire and deploy tactical nuclear weapons, and how can these dynamics be managed to reduce the risk of conflict?

The Council on Strategic Risks just released a report on tactical nuclear weapons: https://councilonstrategicrisks.org/2023/08/01/ending-tactical-nuclear-weapons/

The INF Treaty and the 1991 Presidential Nuclear Initiatives were two historic examples of eliminating so-called tactical nuclear weapons. We need to build on these lessons and make capping and eliminating tactical nuclear weapons the highest arms control priority.

Could you share the top 3 constraints and benefits you had in improving global nuclear security while you were working for the US DoD compared to now, when you're working as an academic?  

I believe significant changes to U.S. nuclear weapons policy and posture only occur when the President personally intervenes. This was also true of the U.S. decision to eliminate its biological weapons program in 1969. President Nixon demanded it.

Does the development of regional groupings like the EU improve X-risk mitigation, or exacerbate risks?

If they help mitigate risks, how could development of such groupings become aligned with US interests?

Context: I'm hoping to find lessons from nuclear security that are transferable to the security of bioweapons and transformative AI. 

Question: Are there specific reports you could recommend on preventing these nuclear security risks:

  • Insider threats (including corporate/foreign espionage)
  • Cyberattacks
  • Arms races
  • Illicit / black market proliferation
  • Fog of war