AI and the Arms Industry: an Overview
This piece covers a use of AI that is usually less visible, and less present in public discourse: weapons.
Key points:
- European AI companies are forming a growing number of partnerships with defence contractors, following a trend set by the US in recent years.
- Research on, and even deployment of, automated weapons was already widespread before 2022.
- The main current uses of AI in conventional warfare involve autonomous drones, automated target selection, data analysis, and area scanning.
There is no question that AI has revolutionised many aspects of our daily lives. From healthcare to finance and education, its impact is impossible to deny. In this piece, though, I want to talk about a use of AI that is usually not so obvious to most of us: weapons.
First of all, some data. Last month, during the AI Action Summit in Paris, it was announced that Mistral AI (often described as the European counterpart of OpenAI and one of its main competitors) would partner with Helsing to create a new generation of defence systems. Helsing, in turn, is a German firm that specialises, among other things, in long-range drones, and it has become one of the most valuable startups in Europe.
News like this came shortly after DeepSeek's impressive debut, which showed it was perfectly capable of competing with American companies at a fraction of the usual cost.
The partnership with Helsing is not the only one, though. Similar deals have been struck between Mistral AI and AMIAD, a French government agency created last year with the mission of deploying AI defence solutions, and, outside the EU, between Mistral AI and Faculty AI, a London-based startup that develops AI-powered software for sectors that include, but are not limited to, the military. This last partnership shows that the UK is stepping up and willing to collaborate with the EU's defence efforts, and it will be worth tracking whether the UK draws even closer to the EU over shared security concerns.
These deals are not exclusive to Europe, however, and they can be read as part of Europe's effort to avoid lagging behind the USA and China, a recurring concern when it comes to Europe's competitiveness and innovation.
In December 2024, for instance, OpenAI announced a deal with Anduril, which produces AI-powered drones and radar systems, among other things. Palantir Technologies, another American company specialising in data analysis (and one linked to the Cambridge Analytica scandal), is participating alongside Anduril in Project Maven, a Pentagon-backed programme that uses machine learning to help identify military targets, remotely operate weapon systems, and more.
This covers just a portion of AI's uses on the battlefield. Hybrid warfare and cyber-threats are as important as the development of more "conventional" weaponry, and I will cover them in a separate piece. For now, let's focus on some of the most paradigmatic cases of applied AI on the battlefield.
The most obvious case is that of autonomous drones, already mentioned above. This is not new. Some might remember the civilian casualties caused by an intensive drone campaign back during the Obama administration. Even though the technology was far less mature ten years ago, the idea of weapons that can select their own targets was already at play.
Drones are not the only weapons that can select their targets automatically, though. One engineer built an automated rifle, powered by OpenAI's models, that can follow voice commands, and this is by no means the only example. Robots that could be deployed on the battlefield are not mere science fiction; companies like Boston Dynamics (which last February announced a collaboration with Meta, the parent company of Facebook) are actively working on them, at least in non-lethal models. No, I am not referring to the video that showed an impressive military drill carried out by a robot (that was a satire), but products like Spot, a quadruped robot suited to multiple uses such as construction or manufacturing, are already on the market. Speaking of Spot, the French military has been testing it for a while. And to give an idea of how long this kind of robot has been around: one of Spot's predecessors, BigDog, was unveiled as early as 2005.
All this shows that automated warfare did not become possible, or even thinkable, only after 2022, when ChatGPT was commercialised and AI became an absolute global trend. That launch put an LLM-powered chatbot in the hands of the general public for the first time in history, and it marks a clear inflection point, not only for the development of the technology but for all the attention, and hence the investment, it has attracted. But it by no means marks the beginning of automated warfare.
And has any of this been used so far? Despite the pledges of some companies not to get involved in weapons production (Boston Dynamics and OpenAI, for example, have pledged to stay away from it, and Boston Dynamics has reassured on numerous occasions that its robots will not carry weapons, though the possibility remains that another actor will take the lead), we have already witnessed the deployment of some of this kind of weaponry. One example is the intensive use of, again, autonomous drones in the ongoing war in Ukraine. In addition, the Ukrainian army has deployed another dog-like robot, this time the BAD.2 model, produced by the UK firm Brit Alliance. It must be noted that these robots are not fully autonomous but remotely operated; their primary use has been to gather intelligence and to provide support, such as carrying ammunition.

Beyond this, Ukraine is ramping up its development of AI software to be integrated into various weapon and navigation systems, enabling not only automated target selection but also other functions like environmental scanning. Other uses of AI in the conflict so far include border control (something that is becoming more common in other countries too) and satellite data analysis. Indeed, Ukraine has notably become a testing lab for AI military systems, and while fully automated warfare is still far from reality, one thing is clear: the AI race is also an arms race.
Other sources
- Ukraine’s Future Vision and Current Capabilities for Waging AI-Enabled Autonomous Warfare. Bondar, K. CSIS.
- The AI Machine Gun of the Future Is Already Here. Keller, J. WIRED.
- How AI Is Used In War Today. Marr, B. Forbes.
- Robot makers including Boston Dynamics pledge not to weaponize their creations. Vincent, J. The Verge.