On Sunday afternoon, I went to the National Football Conference championship game at Lincoln Financial Field in Philadelphia. At one point early in the fourth quarter, the Philadelphia Eagles were on the 1-yard line of the opposing Washington Commanders, ready to run their unstoppable Brotherly Shove play for a score. (Trust me — this has something to do with Future Perfect.)
Knowing they would almost certainly give up a touchdown, the outmatched Commanders decided to do something a little different. First, a Commanders defender purposefully jumped over the line early, leading to a penalty for encroachment. Then they did it again — same thing. And again — same thing. They seemingly had every intention to keep jumping the line, over and over. And each time, the referees moved the ball half the distance to the goal line, as happens when defensive penalties occur close to the end zone.
Anyone familiar with the principle of infinite divisibility in geometry (the idea at the heart of Zeno's famous dichotomy paradox) can see the problem here. A line segment — like the distance here between the line of scrimmage and the end zone — can be infinitely divided, over and over. Which means that theoretically, the Commanders could have kept encroaching, and the Eagles could have kept advancing half the distance to the goal line without ever getting there, until the end of time.
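For readers who want to see the arithmetic, here is a minimal sketch of the halving process (the yard-line numbers are illustrative, not from the game). Using exact fractions shows that the remaining distance shrinks geometrically but never actually reaches zero:

```python
from fractions import Fraction

# Start with the ball 1 yard from the goal line (illustrative value).
# Each penalty moves the ball half the remaining distance.
distance = Fraction(1)
for penalty in range(10):
    distance /= 2

# After 10 penalties the ball sits 1/1024 of a yard out -- vanishingly
# close, but with exact arithmetic the distance is still greater than zero.
print(distance)  # 1/1024
```

However many penalties occur, the distance is 1/2ⁿ of a yard: always smaller, never zero — which is exactly why the referees needed a rule outside the geometry to end it.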
Fortunately for the players, coaches, and nearly 70,000 fans in attendance, the referees found a way out of this particular paradox by invoking a little-known NFL rule that allows the offensive team to be automatically awarded a touchdown if the defense keeps purposefully committing penalties to stop them. That was finally enough to get the Commanders to cut it out.
All of which brings us to a subject we’ve written about a few times here at Future Perfect: the Doomsday Clock. (See, I told you we’d get there.)
Created and run by the Bulletin of the Atomic Scientists, which itself was founded by many former Manhattan Project physicists who had become alarmed by the threat of nuclear weapons, the Doomsday Clock is meant to be a symbolic representation of how close humanity is to existential destruction. Each year, a group of experts in everything from nuclear science to climate change to cybersecurity sets the hands of the clock. The closer it is to midnight, the closer humanity supposedly is to extinction.
In 2023, the Bulletin made some news when it moved the hands of the clock forward 10 seconds, to 90 seconds until midnight, the closest it had been since the clock was launched in 1947. That meant humanity was supposedly closer to annihilation than it had been in such famously dangerous times as 1964 (not long after the Cuban Missile Crisis, when the clock was set to 12 minutes to midnight) or 1984 (shortly after one of the closest nuclear calls in Cold War history, when it stood at 3 minutes to midnight). But this was also the first setting after Russia invaded Ukraine and raised nuclear fears to a height they hadn't reached in decades.
Last year, citing everything from Ukraine to Gaza to climate change to growth in AI, the board kept the clock at 90 seconds to midnight.
And then on Tuesday morning, the board revealed the clock's new setting. Drumroll, please: 89 seconds to midnight, one second closer than before.
The board listed a slew of factors: continued nuclear risk around Ukraine and the disintegration of nuclear arms control; the growing impacts of climate change after what is likely the hottest year on record; the threat of new diseases like bird flu; AI progress and, especially, its potential military applications; and disinformation and cyber insecurity.
If those sound familiar, well, they're pretty much the same factors as the year before, and the year before that. Board chair Daniel Holz acknowledged as much at Tuesday's event, saying these factors "were not new in 2024. But we have seen insufficient progress in addressing the key challenges, and in many cases this is leading to increasingly negative and worrisome effects."
Still, everything about Tuesday’s announcement underscored an essential problem with the Doomsday Clock. It’s running out of time — perhaps metaphorically, as it’s meant to, in the case of humanity’s survival, but quite literally, in the sense that a clock only has so many hours, minutes, and seconds.
And that problem is something the entire field of existential risk suffers from. Just like those referees in Philadelphia, there are only so many times you can issue a warning before it starts to feel meaningless, especially as we seem to get closer and closer to annihilation without ever quite getting there.
In a way, the Doomsday Clock is a victim of its own success as an unparalleled symbol of 20th-century, Cold War nuclear fear. So compelling was the idea of the hands of a clock, inching toward the midnight moment when the missiles would launch, that the classic 1980s Watchmen graphic novel used it as an unforgettable central motif.
Like James Bond movies and Rambo films, though, the Doomsday Clock suffered after the end of the Cold War and the apparent removal of its reason for being: nuclear war. With that threat seemingly behind us, the clock branched out into new threats like climate change and infectious disease, and later very 2010s-era worries like disinformation and democratic backsliding.
The problem, as we’ve written before, is that non-nuclear existential risks simply don’t fit well into the metaphor of a clock. A nuclear war is largely a binary risk — the missiles fire, and the clock strikes 12, or they don’t. And there’s an entire field of geopolitics and diplomacy dedicated to gauging just where the world is on nuclear risk. It is about as measurable and knowable as existential risks get, which is why the Doomsday Clock was so iconic.
But other, newer existential risks don't work that way — assuming they even are true existential risks. Climate change is not a binary but a cumulative, ongoing risk, less a sudden fatal heart attack than a lifelong case of planetary diabetes. If climate risk were a clock, it'd be hard to know what time it is, or even whether the clock would ever truly strike midnight.
Other risks are even more difficult to track. Artificial intelligence just experienced one of the most eventful weeks in its young history, as China’s DeepSeek showed that advanced models might be cheaper and harder to stifle than the industry had thought, even as America’s big AI players lined up for an unprecedented $500 billion buildup. Is AI even an existential risk? Maybe — though no one can tell you with any certainty how precisely it might unfold, or how close we really are. And AI, unlike nuclear weapons, has benefits for science and society we can’t just put aside.
When it comes to infectious disease, as worrying as the recent outbreaks of bird flu have been, we have no certainty that this will indeed be the next pandemic — or how severe it would be should that happen. A new virus will come for us, but chances are we’ll be surprised by what it is, just as we were surprised by Covid. And the odds that such a virus would actually threaten us with extinction seem very low.
We live in a world that is right now awash in fear, even if those fears are often overstated and out of step with reality. I worry that as the Doomsday Clock waters down its original focus on nuclear war — something that really is getting worse — and makes these minute changes year to year, it will end up burning out the very audience it is meant to galvanize. You can only say the world is close to ending so many times, only elevate so many risks to the status of existential ones, before people begin to tune you out.
A postscript to that story about the Eagles game: Once the referees had made their final warning, the Eagles were able to run their Brotherly Shove, and push quarterback Jalen Hurts into the end zone for a touchdown, en route to a dominating 55-23 victory. (Go Birds!) You can have all the warnings in the world — but that doesn’t mean you can stop the inevitable from happening.