The Doomsday Clock was set at 85 seconds to midnight on Jan. 27, the closest it has ever been to symbolic global catastrophe, according to the Bulletin of the Atomic Scientists.
The Bulletin said the decision reflects rising nuclear risk, accelerating climate change, biological threats and the growing use of artificial intelligence in security and military systems. The clock is reviewed annually by the Bulletin’s Science and Security Board in consultation with international experts.
The Doomsday Clock is a symbolic measure created in 1947 to represent how close humanity is to self-inflicted catastrophe, with midnight marking global disaster.
Jack Rozdilsky, an associate professor of disaster and emergency management at York University, said the clock is meant to translate complex global risks into a signal the public can understand.
“It is a metaphorical indicator of the existential risks facing humanity,” Rozdilsky said. He said the clock is designed to “prompt a pause” and keep global threats visible without pushing the public toward fatalism.
He said the 2026 setting does not predict a specific outcome but instead signals that political and policy decisions still matter.
The Bulletin said nuclear tensions remain a central concern, citing the ongoing war in Ukraine, conflicts in the Middle East and the erosion of arms control agreements between major powers.
The organization also pointed to climate change as a compounding risk, noting record global temperatures and the increasing frequency of extreme weather events worldwide.
The 2026 statement highlighted artificial intelligence as a growing factor in global instability, particularly when integrated into military decision-making systems.
Branka Marijan, a senior researcher with Project Ploughshares, a Canadian peace research institute based in Waterloo, Ont., said defence sectors face pressure to deploy new technologies before they are fully tested.
She said competition between states is creating what she described as a “race to the bottom,” where speed is prioritized over reliability.
Marijan said AI-supported decision systems can compress response times in crises, reducing opportunities for human judgment. She warned that this can increase the risk of miscalculation if commanders rely too heavily on automated recommendations.
She said researchers call this phenomenon automation bias, the tendency of humans to overtrust machine output even when it may be flawed.
The Bulletin said biological threats were also part of its assessment, including future pandemics and the risks posed by emerging biotechnology.
Rozdilsky said public responses to existential risk can shift over time. He said earlier generations experienced similar warnings during periods of Cold War tension.
He said the challenge for institutions is communicating risk in a way that informs the public without creating disengagement.
Marijan said risk reduction depends on policy, not panic. She said states can lower the chance of escalation through sustained dialogue, confidence-building measures and transparency around emerging technologies.
She said those efforts can include information sharing, scenario modelling and cooperation through international forums, including United Nations-linked processes and informal diplomatic channels.
The Bulletin said the clock is not a prediction but a warning, intended to show how close the world stands to catastrophe and how policy choices can still move it back.
