Toby Ord

Toby Ord is an Australian-born moral philosopher at the University of Oxford, best known as the author of The Precipice: Existential Risk and the Future of Humanity (2020). His work sits at the intersection of existential-risk research, longtermism, and the practical application of effective-altruism principles to humanity’s most consequential challenges.

Key Contributions

The Precipice (2020)

Ord’s landmark book surveys the full landscape of existential risk — from nuclear war and climate change to engineered pandemics and unaligned AI. Its central ideas include:

  • A quantitative risk table estimating probabilities for different existential catastrophes, with AI assigned the highest single-risk probability (~10% chance of causing an existential catastrophe this century).
  • The Long Reflection — A proposed future period of careful deliberation about humanity’s values before making irreversible civilizational decisions.
  • Existential security — The positive counterpart to existential risk: a state where humanity has durably reduced existential risk to acceptable levels.
  • A comprehensive risk framework that has become standard in EA for thinking about and comparing existential threats.

The Precipice Revisited

In a later EA Forum post, Ord updated his assessment of the risk landscape in light of developments since the book’s publication. His key findings:

  • AI risk has been reinforced as the dominant existential threat.
  • Rapid progress in large language models and the scaling of capabilities have made his original AI risk chapter more relevant, not less.
  • The challenges of mobilizing action on speculative long-term risks remain formidable.

Ord’s core assessment — that unaligned AI is the single largest existential risk facing humanity — has not only held up but been strengthened by subsequent AI developments.

Role in the EA Community

Ord is a central figure in the effective-altruism movement. His work bridges abstract philosophical argument and concrete risk assessment, providing the empirical grounding that longtermist arguments require. His contributions to the ea-forum have helped shape the community’s evolving understanding of how to prioritize among cause areas in light of transformative AI.

Significance for This Wiki

Ord provides the empirical and philosophical bridge between general concern about existential risk and specific focus on AI as the dominant threat. The Precipice is one of the five canonical books in the EA/AI safety reading list (alongside works by nick-bostrom, will-macaskill, stuart-russell, and Brian Christian), and his ongoing updates demonstrate how thoughtful risk assessment evolves in response to new evidence — a model for how this wiki’s knowledge should be maintained.