“As 2026 began, Donald Trump reportedly relied on AI in the killing of Venezuelan leader Nicolás Maduro – but it was not the first such attack.”
I have a feeling this article was written by a hallucinating AI…
theipaper · Mar 27, 2026
Full analysis article: *The conflict in Iran arrives in a brand new era of warfare, in which deadly technology is evolving at rapid and terrifying speed.*
*In less than a month, the US has used artificial intelligence to sift through data and identify and strike thousands of targets, revealing supercharged capabilities, and an escalating global threat.*
*As 2026 began, Donald Trump reportedly relied on AI in the killing of Venezuelan leader Nicolás Maduro – but it was not the first such attack.*
*In this excerpt from* Project Maven: A Marine Colonel, His Team, and the Dawn of AI Warfare*, journalist Katrina Manson traces the story back to the AI-enabled assassination of Isis leader Abu Bakr al-Baghdadi in 2019 – showing how valuable, and how dangerous, these growing powers are.*
[Donald Trump](https://inews.co.uk/topic/donald-trump?srsltid=AfmBOor8TuGy-yes73fOHQF6VlMpKzMIm2NVRycvuzXMoIAFGL6a1bPD&ico=in-line_link) would say the next morning that the mission to kill or capture Abu Bakr al-Baghdadi, “the world’s No 1 terrorist leader”, had been his administration’s top national security priority.
The “impeccable” two-hour nighttime raid on his hideout in north-west [Syria](https://inews.co.uk/topic/syria?srsltid=AfmBOor-_8QEEh6U347Hd_SK1_v0hTWpMo495d_mwgZvueLkQOSlLpJA&ico=in-line_link) in October 2019 involved a clutch of Special Operations Forces on the ground, eyes in the sky, massive airpower, ships in support, a shootout and the target at the end of a concealed tunnel detonating his own explosive vest.
[“He died like a dog,” said Trump](https://inews.co.uk/news/world/abu-bakr-baghdadi-isis-dead-donald-trump-355722?srsltid=AfmBOorTfNNDAdJ2MI2VkY6_R41GT3vrteL8xUGlXKuHqphkvy4BLnAN&ico=in-line_link), praising the “unbelievable success” of the “incredible” US personnel, who had accomplished their mission “in grand style”. Baghdadi fled into a dead-end tunnel with two children, and exploded himself. The task force dug him out. A DNA test on the spot confirmed his identity. “J******.” Trump didn’t want to say how he was able to watch the operation live, but said it was like watching a movie: “The technology there alone is really great.” At least a fraction of that technology, I later learned, meant Project Maven.
The operation was the culmination of three years of US efforts. Baghdadi was the founder and leader of [Isis, the so-called Islamic State](https://inews.co.uk/topic/isis?ico=in-line_link). By the spring of 2019 there was barely anything left of the caliphate, but Baghdadi had eluded first [Barack Obama](https://inews.co.uk/topic/barack-obama?srsltid=AfmBOopy3IGWGS70Kf2QEut5lwCuSgA3AtGcZQDUCAQtXXEh_TMoumHC&ico=in-line_link) and then Trump, who regularly asked after him. “We would kill terrorist leaders, but they were names I never heard of,” Trump said. “They were names that weren’t recognisable and they weren’t the big names. Some good ones, some important ones, but they weren’t the big names. I kept saying, ‘Where’s al-Baghdadi?’ And a couple of weeks ago, they were able to scope him out.”
The US had tracked Baghdadi to Idlib province \[in northern Syria\]. In Erbil \[in northern Iraq\], at a headquarters unit for a Special Operations Task Force, a planning team set up maps and readied video footage.
On the eve of the operation, the weather looked good. The moon shone only the slightest of slivers. Trump gave the green light. Baghdadi was holed up four miles from the Turkish border, in a walled compound on the outskirts of a village named Barisha, with a series of mostly dead-end tunnels underneath. The team knew there would be children at the compound. “This was a very, very dangerous mission,” Trump said.
After dark, eight armed helicopters flew low and fast for over an hour. Helicopter gunships let out two volleys to frighten civilians into staying indoors. The eyes in the sky were mostly trained on the compound. But drones, satellites, and manned aircraft were also on the lookout for anyone who might try to rescue Baghdadi. “Basically any bad guys that were gonna advance on our guys,” recalls one person who watched the raid. Cameras using infrared scanned the area. At one point, US airships rained down strikes against local gunfire coming from a multi-storey building east of the compound. Grainy footage also shows the sudden obliteration of a group on foot. “That gunfire was immediately terminated. These people are amazing,” Trump said.
After the helicopters landed, at least 10 Delta Force commandos approached Baghdadi’s compound on two sides by foot, snaking through an olive grove. The special operators blasted through the wall, avoiding a booby-trapped door. “By the time those things went off, they had a beautiful, big hole, and they ran in and they got everybody by surprise,” the President recalled. “Then all hell broke loose.” An explosion of noise and confusion. They called out in Arabic for those inside the compound to surrender. Out came 11 children and two men, who were detained. But four women and a man failed to surrender. US forces killed them.
The screens at the operations centre for the task force headquarters in Erbil were a mass of information. A request to get Maven Smart System at Erbil had gone in toward the end of May, and now, five months later, it was put to work. The team received a report that a vehicle was on the move and sent a drone camera to search it out. Maven’s [AI](https://inews.co.uk/topic/artificial-intelligence?srsltid=AfmBOopnTn7n4klCHhv1XtkGX1MsxVZ0SC64Ew0hWxmqoZGWFV2YEd8C&ico=in-line_link) pinged. Up popped a detection on the video. Look at that dot, said one of the guys in the command row at the screen. It was pinpointing a moving vehicle. The screeners, glued to multiple feeds, had not spotted it yet. AI had beaten a human on a high-priority mission.
kidcrumb · Mar 27, 2026
Is the article AI?
Maduro wasn't killed.
theipaper · Mar 27, 2026
2. The AI detection showed up on the map, too. The dot moved through Barisha. At a T-junction, it turned to the west, heading toward the site of the raid a thousand or so feet away. The task force commander put the information together fast: Those guys are driving toward our guys at the building. The US didn’t think they would be Isis, but knew there were other militant groups in the area, and the vehicle displayed what General \[Kenneth\] McKenzie would later describe as “hostile intent”. The task force commander radioed the ground force commander in charge of the raid about the vehicle. The ground force commander called in a hit: Take it out. Thirty seconds later an attack helicopter swooped in. Gunfire rained down.
A man ran out of the van to a roofless building on the edge of the road. The helicopter fired a strafing run, blowing out the north and west walls of the building. Five minutes later, the airships launched a rocket. “And the van is gone,” the person familiar with the raid recalled. The van’s pockmarked carcass, blackened with smoke, remained on the roadside. It was the first time that the commander had used AI to help find a target. It couldn’t have happened on a more consequential, fast-moving and dangerous raid. The speed at which AI highlighted the vehicle left an impression on JSOC \[Joint Special Operations Command\].
“Someone just made their company a lot of money,” the commander remarked. AI appeared able to cut through the fog of war. The task force commander and others started leaning wholesale into Project Maven’s technology.
What, I asked the person familiar with the feed, if there’d been civilians in or near the vehicle spotted by the AI? “Those were definitely bad guys that the AI ended up picking up,” the person told me.
Barakat Ahmad Barakat told a different story. He told \[US media outlet\] NPR he’d been driving home to Hatan, a village several miles west of Baghdadi’s compound, after finishing his day’s work at an olive press in Barisha with two friends. “Suddenly I felt something hit us,” the 30-year-old told NPR. The van stopped and the men fled. His friends — two cousins, Khaled Mustafa Qurmo and Khaled Abdel Majid Qurmo — died in the aerial attack that followed. Barakat cradled one of them in his lap on the roadside, as an air strike arrived. It blew off his right hand. He later lost his arm. His left hand doesn’t work well. “I’m just a civilian. I didn’t have any weapons,” Barakat told NPR. “We’re farmers. I make less than a dollar a day. Now I’m handicapped, and my two friends are in their graves.”
The US has acknowledged that 1,437 civilians were unintentionally killed in five years of operations against Isis in Iraq and Syria to the end of October 2019. According to Barakat, that number should be at least two higher. (The monitoring group Airwars has logged the total number of civilians killed as possibly higher than 8,000.) The US military has investigated the claims twice: the first report did not uphold them. But after NPR sued to see a redacted version and questioned its conclusions, the US conducted a second investigation. The results of that second report have not been made public.
[The rise of AI warfare](https://inews.co.uk/news/world/remote-drones-computer-targeting-horrors-modern-war-4312637?srsltid=AfmBOooAzlSQ0xxVnH9GuaAEchoMRx9fXGtE9XRbQXshI-1aKTHsinjH&ico=in-line_link) speaks to the biggest moral and practical question there is: who – or what – gets to decide to take a human life? And who bears that cost? Cheerleaders argue that AI and the automation it makes possible will save lives. They claim algorithms bring a precision to decision-making that will limit civilian and friendly-fire casualties. They argue AI-empowered systems could deter conflict with China — or help win the Third World War, in which automated machines will putatively run combat at a pace faster than humans can understand.
Detractors think AI has already led to civilian deaths, will spread uncontrolled destruction, and potentially [hasten the end of the world](https://inews.co.uk/news/politics/ai-weapons-turn-against-humans-kill-civilians-indiscriminately-2516893?srsltid=AfmBOorxN7y9bMEpNb1qbjib_yB0q7-SjpWbi7FnpjtUWoQCXUEhsGui&ico=in-line_link). Still more think the claims made for AI war tools are grandiose and the truth will be more prosaic, suffering from problems of rickety infrastructure, adoption and trust. Pragmatic supporters argue an incremental mix of humans and machines will forge that trust. The problem with many theories about AI warfare is just that: they remain theoretical.
The AI decision-making systems developed under Maven, and some of the Pentagon’s 800 other AI projects, are now used on the battlefield. Maven Smart System (MSS), a software platform that develops targets with the help of AI, is now deployed in every branch of the US military and all over the world. Nato started using a version of the system in the spring of 2025, and I would learn in October 2025 that 10 Nato members were lining up to use it for their militaries.
Maven has sped up the pace of war. I learned from an official at the National Geospatial-Intelligence Agency that, with the help of computer vision, the US went from being able to hit under 100 targets a day to being able to hit 1,000. In combination with large language models (LLMs) integrated into the Maven platform, that figure has risen to 5,000.
The AI algorithms developed under Maven now deploy in submarines and space operations. They’re in subsea sonar systems belonging to America and two of its closest intelligence allies (the UK and Australia) designed for nuclear deterrence. They’re fielded on autonomous drone boats. They live in secretive aerial and aquatic systems that could surveil, select and kill targets, intended for the defence of Taiwan. The US will have to carefully define use cases, guardrails and doctrine if it wants to stick to the Geneva Conventions and avoid shooting civilians and allied forces.
theipaper · Mar 27, 2026
3. I started writing about the future of war after I became US foreign policy and defence correspondent for the *Financial Times* in 2017. As the US reckoned with the rise of China, I watched a global powerhouse humbled by poorly equipped enemies in Afghanistan and Iraq attempt to embrace AI as a shortcut to sustaining military dominance. The first Trump administration’s 2018 national defence strategy predicted commercial technology would change “the character of war.” Four years later, after I became a *Bloomberg* correspondent covering emerging tech and national security, the arrival of chatbots and AI agents accelerated this shift. Under the second Trump administration, the Department of Defence has re-emerged as the “Department of War” devoted to AI and autonomy, under [a Secretary who wants to make it easier](https://inews.co.uk/opinion/pete-hegseth-foolish-becoming-trumps-sacrificial-lamb-4290364?ico=in-line_link) to acquire weapons and [free US forces from “overbearing rules of engagement”](https://inews.co.uk/news/world/trigger-happy-hegseth-angry-adolescent-leading-attack-iran-4275451?srsltid=AfmBOooIMN-LkswBbLN8aKQ5QGcRguL8RhB08HnOVW3NT8UiAJSHFeyM&ico=in-line_link).
On a military plane back from Afghanistan in 2009, British soldiers told me about the friends who had been killed in combat. They showed me explosions they couldn’t stop watching on their phones. And they told me they desperately wanted to leave the military but were trapped by contracts, and felt equally unable to survive civilian life. Could AI alleviate the burden and suffering of war?
Project Maven sits at the intersection of colliding trends: America’s rising insecurity about its place in the world, a technological revolution forcing AI into every aspect of life, the dominance of Big Tech, China’s growing military and technological ambitions, and all-encompassing surveillance.
Russia’s [invasion of Ukraine](https://inews.co.uk/topic/russia-ukraine-war?srsltid=AfmBOooNgIizbW3aHSnH9lQz536x487NE8Fltb0pXh6l-ngeObfqpWV9&ico=in-line_link) has upturned expectations. The Pentagon’s [deadly strikes against boats in the Caribbean](https://inews.co.uk/news/three-things-know-trump-edges-closer-war-venezuela-3959068?ico=in-line_link) are greying the boundaries of the rules of war and underline the ease of declaring war at a remove. The US reportedly relied on AI to help in its deadly [January 2026 raid on Venezuela](https://inews.co.uk/opinion/venezuela-define-trumps-presidency-what-actually-craves-4146751?srsltid=AfmBOoqZ3lgvxgr8wsTEC6o88ozrN4n2xzbH-9T0gjbAip7_apFDATEx&ico=in-line_link) to capture the country’s president. The US is using Maven Smart System to sift data, help pick targets and speed processes for its war against Iran – striking 1,000 targets in the first 24 hours of military operations and more than 8,000 to date.
US military commanders say China is rehearsing for the military takeover of Taiwan. Rival superpowers are arming for conflict. Campaigners argue for AI red lines. And a new generation of Silicon Valley leaders is chasing defence contracts, talking up the appeal of AI-enabled killing with newfound braggadocio. National security strategists now worry that no country can win a war without AI. The UN’s aim of banning, by 2026, lethal autonomous weapons that select their own targets with the help of AI is a lost hope. Yet AI remains a narrow, faulty tool with considerable limits that the US military is still discovering.
AI warfare can go wrong. And it is here.
**This is an edited excerpt from** [***Project Maven: A Marine Colonel, His Team, and the Dawn of AI Warfare***](https://wwnorton.co.uk/books/9781324123316-project-maven)**, by Katrina Manson – an award-winning** ***Bloomberg*** **reporter who covers cyber, emerging tech, and national security. Her investigations exposed details of the US military’s AI use and US–China rivalry. She was previously** ***Financial Times*** **US foreign policy and defence correspondent.**