
The Army wants AI to detect hidden targets, like this M109A6 Paladin armored howitzer camouflaged during wargames in Germany.
AUSA: The Army has developed AI to spot hidden targets in reconnaissance photos and will field-test it in next year’s massive Defender 20 wargames in Europe, the head of the service’s Artificial Intelligence Task Force said here.

Brig. Gen. Matthew Easley, director of the Army’s Artificial Intelligence Task Force, at AUSA 2019.
It’s just one of a host of AI applications with combat potential that the Army is exploring, Brig. Gen. Matt Easley said. Shooting down drones, aiming tank guns, coordinating resupply and maintenance, planning artillery barrages, stitching different sensor feeds together into a single coherent picture, analyzing how terrain blocks units’ fields of fire, and warning commanders of blind spots in their defenses are all uses that will be tested.
The most high-profile example of AI on the battlefield to date, the controversial Project Maven, used machine learning algorithms to sift hours of full-motion video looking for suspected terrorists and insurgents. By contrast, Easley said, the new application looks for tanks and other targets of interest in a major-power war, in keeping with the Pentagon’s increasing focus on Russia and China.
To date, the AI Task Force has mostly focused on non-combat applications like mapping natural disasters in real-time for the National Guard, predicting Special Operations helicopter breakdowns before they happen, and improving the Army’s personnel system. But, Easley told an audience of contractors, soldiers, and journalists on the show floor of the Association of the US Army conference last week, “our final thrust this year is what we call intelligence support to operations, where we’re using offboard data” – that is, data gathered from a wide range of sensors, not just those onboard one aircraft or vehicle – “to look for peer competitor type targets.”
For example, Easley elaborated to reporters at a follow-on roundtable, you would feed the AI surveillance imagery of “a forested area” and ask it, “show me every tank you see in the image.” The AI would rapidly highlight the targets – far faster than a human imagery analyst could go through the same volume of raw intelligence – and pass them along to humans to take action.
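To make that workflow concrete, here is a minimal sketch of what such an imagery-triage pass could look like in code. It is purely illustrative: the Army’s actual model and training data are not public, so a stock torchvision detector stands in for a purpose-trained one, and the image file, confidence threshold, and target classes are all hypothetical.

```python
# Conceptual sketch only: run a generic object detector over one reconnaissance
# image and surface high-confidence candidate targets for a human analyst.
# A fielded system would use a model fine-tuned on military target classes.
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor
from PIL import Image

CONFIDENCE_THRESHOLD = 0.7  # only pass high-confidence detections to the analyst

model = fasterrcnn_resnet50_fpn(pretrained=True)  # stand-in for a purpose-trained model
model.eval()

image = Image.open("forested_area.jpg").convert("RGB")  # hypothetical recon image
with torch.no_grad():
    predictions = model([to_tensor(image)])[0]  # dict with "boxes", "labels", "scores"

# Hand candidate targets (bounding boxes) off to humans to review and act on.
for box, label, score in zip(predictions["boxes"], predictions["labels"], predictions["scores"]):
    if score >= CONFIDENCE_THRESHOLD:
        print(f"candidate target class={label.item()} score={score.item():.2f} box={box.tolist()}")
```

The key design point is the last loop: the machine only nominates regions of the image, and a human decides what, if anything, to do about them.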
Note that nobody’s talking about letting the computer pull the trigger to kill people. Although systems like the Army’s Patriot and the Navy’s Aegis have been allowed to automatically fire on unmanned targets like drones and missiles for decades, Pentagon policy and military culture make it hard – albeit not impossible – for humans to delegate the decision to use deadly force to a machine. And since AI can make absurd mistakes, and since an entire science of adversarial AI is emerging to find subtle ways to trick the algorithms into such mistakes, it’s probably best to have a human double-check before anybody starts shooting.
All those caveats aside, the target-detecting AI is already good enough that an Army acquisition program office has decided to try it, Easley said. It was the first such “transition” of a technology developed by the year-old task force into the Army’s formal procurement system. It will be “demonstrated” – Pentagonese for a relatively low-stakes test – during next year’s Defender 20 wargames involving 20,000 US troops and 17,000 NATO allies.
Created last fall, the task force is now nearly at full strength and can take on more projects, Easley said, such as a cybersecurity AI next year. Military networks have so many thousands of users and devices that just figuring out what is connected at any given time is a challenge, he said, let alone sifting through the vast amounts of network traffic to spot an ongoing intrusion. For all its faults, AI is much better than humans at quickly sorting through such masses of data.
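The kind of triage Easley described could start very simply. The toy example below, which does not reflect any fielded system, flags hosts whose traffic volume in one monitoring window deviates sharply from the network-wide baseline; the addresses, byte counts, and cutoff are invented for illustration. A real tool would model far richer features, but the division of labor is the same: the machine narrows the haystack, humans inspect the needles.

```python
# Toy illustration: flag hosts whose traffic volume is an outlier for one
# monitoring window, using a median-based score that is robust to the outlier itself.
from statistics import median

# Hypothetical per-host byte counts for a single monitoring window.
bytes_per_host = {
    "10.1.0.4": 120_000,
    "10.1.0.7": 98_000,
    "10.1.0.9": 135_000,
    "10.1.0.12": 4_800_000,  # anomalously chatty host
    "10.1.0.15": 110_000,
}

volumes = list(bytes_per_host.values())
med = median(volumes)
mad = median(abs(v - med) for v in volumes)  # median absolute deviation

for host, volume in bytes_per_host.items():
    score = 0.6745 * (volume - med) / mad  # "modified z-score"
    if score > 3.5:  # conventional cutoff for this statistic
        print(f"flag {host} for analyst review: {volume:,} bytes (score={score:.1f})")
```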
AI can also quickly calculate potential lines of sight between different points on the battlefield. That’s why the AI Task Force is working on a “fields of fire” AI that uses the new IVAS targeting goggles to determine what area each soldier can cover with their weapon. The software would compile that data for the whole squad, platoon, company, or even battalion, giving commanders a map of exactly which approaches are covered and where any potential blind spots lie. (Another potential application of the same technology would be to analyze potential fields of fire from suspected or confirmed enemy positions to identify the safest routes past them.)
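The underlying geometry is straightforward, as the rough sketch below shows: march along the line from an observer to a target over a terrain heightmap and check whether any intervening ground rises above the sight line. This is a generic line-of-sight calculation, not the IVAS software itself; the eye height, sample count, and terrain data are assumptions for illustration.

```python
# Conceptual line-of-sight check over a digital elevation model.
import numpy as np

def has_line_of_sight(heightmap, observer, target, eye_height=1.7, samples=100):
    """Return True if the straight line from observer to target clears the terrain.

    heightmap: 2D array of ground elevations in meters, indexed [row, col]
    observer, target: (row, col) grid coordinates
    eye_height: assumed height of the sensor or weapon above the ground at each end
    """
    (r0, c0), (r1, c1) = observer, target
    z0 = heightmap[r0, c0] + eye_height
    z1 = heightmap[r1, c1] + eye_height
    for t in np.linspace(0.0, 1.0, samples)[1:-1]:  # skip the endpoints themselves
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        sight_line_z = z0 + t * (z1 - z0)   # elevation of the sight line at this point
        if heightmap[r, c] > sight_line_z:  # terrain blocks the view
            return False
    return True

# Hypothetical usage: mark every cell a soldier at grid (10, 10) can cover, then
# overlay such maps for a whole squad to reveal approaches nobody's weapon covers.
terrain = np.random.default_rng(0).uniform(0, 50, size=(60, 60))
covered = [(r, c) for r in range(60) for c in range(60)
           if has_line_of_sight(terrain, (10, 10), (r, c))]
```

Sydney J. Freedberg Jr.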