Horror AI dirty bomb fears emerge as chatbots ‘helping’ extremists
AI is making it easier for terrorists to build dirty bombs, ministers have been warned. Large language models (LLMs) are guiding extremists through technical problems to create even deadlier devices, academics said.
This could give lone-wolf terrorists access to explosives combined with radioactive materials or deadly poisons. And bomb manuals are telling jihadists and far-Right lunatics to use AI.
In one instance, AI successfully advised on how to cultivate neurotoxins and guided scientists through “the design of an improvised nuclear fusor”.
The development will alarm security chiefs as extremists typically lack the technical expertise or scientific skills to build such devices.
But researchers from the Oxford Disinformation and Extremism Lab told an inquiry by the Home Affairs Select Committee: “What was once the exclusive domain of state actors and structured terror networks is now partially accessible to ideologically motivated individuals operating in isolation.
“Extremist manuals are beginning to reference AI-assisted methodologies for CBRN [Chemical, Biological, Radiological and Nuclear] execution.
“Without intervention, this uplift function risks enabling high-consequence attacks by low-resource actors.”
Researchers described how two terrorists this year used LLMs to “support attack planning”.
This included sourcing explosives, planning tactics and calculating the blast radius.
They added: “Over the past year, terrorist and extremist misuse of AI has expanded from being used primarily to generate illegal content glorifying and inspiring attacks (TVEC) to enabling direct real-world violence.
“Two attacks during the first half of 2025, the Las Vegas attack (January 2025) and the Pirkkala, Finland, attack (May 2025), demonstrated the use of LLMs to support attack planning.
“In both cases, perpetrators used chatbots over extended periods to source explosives, plan tactics, identify anatomical vulnerabilities, calculate blast radii, and structure manifestos.
“These cases illustrate how LLMs are lowering technical barriers to violence and serving as tactical accelerators.”
Former US president Barack Obama warned in 2016 that terrorists trying to launch a nuclear attack would change the world forever.
Mr Obama warned the world cannot be “complacent” and must build on its progress in slowing the stockpiling of nuclear weapons.
“There is no doubt that if these mad men ever got their hands on a nuclear bomb or nuclear material, they would certainly use it to kill as many people as possible,” he said.
“The single most effective defence against nuclear terrorism is fully securing this material so it doesn’t fall into the wrong hands in the first place.”
IS has already used chemical weapons in Syria.
Amid a surge in lone-wolf terrorism, researchers also revealed how extremists are effectively confiding in AI chatbots.
Researchers from the Oxford Disinformation and Extremism Lab added: “The affective bond formed between users and chatbots, especially over long periods of interaction, can provide emotional reinforcement, ideological confirmation, and encouragement to act.
“This is especially dangerous in the context of memory-augmented models that recall past conversations and adjust responses over time. The so-called ‘sycophancy bias’ in current systems further compounds the risk, as models mirror and validate user inputs, even when those inputs involve harmful or extremist ideologies.”
In 2004, jihadists plotted to blow up the Bluewater shopping centre and the Ministry of Sound nightclub using a “dirty bomb”.
Jawad Akbar was part of a five-strong gang of Pakistani heritage, all British-born or British-resident, linked to Al-Qaeda in Pakistan.
Waheed Mahmood, 35, Omar Khyam, 25, Anthony Garcia, 24, and Salahuddin Amin, 32, were the other defendants in the 2006 trial. All five were handed life sentences.
During the trial, it was revealed that the gang was poised to attack the shopping centre with a massive device containing ammonium nitrate and aluminium powder, made for just £100.
