In December, a special fighter jet made multiple flights out of Edwards Air Force Base in California. The orange, white, and blue aircraft, which is based on an F-16, seats two. A fighter jet taking to the skies with one or two people on board is not remarkable, but what is remarkable about those December flights is that artificial intelligence flew the jet for periods of time.
As generative AI tools like ChatGPT creep into the public consciousness, artificial intelligence has quietly slipped into the military fold, at least in these December tests.
The flights were part of a DARPA program called ACE, which stands for Air Combat Evolution. The AI algorithms came from a variety of sources, including a company called Shield AI as well as the Johns Hopkins Applied Physics Laboratory. Broadly, the tests show the Pentagon exploring how well AI can handle tasks on aircraft that are normally performed by humans, such as dogfighting.
“In total, ACE algorithms were flown on several flights and each sortie lasted about an hour and a half,” says Lt. Col. Ryan Hefron, the DARPA program manager for ACE, in an email to PopSci. “In addition to each team of performers controlling the aircraft during dogfight situations, portions of each sortie were dedicated to checking out the system.”
The flights didn’t come out of nowhere. In August 2020, DARPA pitted artificial intelligence algorithms against each other in an event called the AlphaDogfight Trials. That competition didn’t involve any actual aircraft flying through the skies, but it ended with an AI agent defeating a human flying a digital F-16. The late 2022 flights show that software agents capable of making decisions have now been given the chance to fly a real fighter jet in dogfighting maneuvers. “This is the first time AI has controlled a fighter jet performing within-visual-range (WVR) maneuvers,” Hefron notes.
(Related: I flew in an F-16 with the Air Force and oh boy did it go poorly)
So how did it go? “We didn’t have any major issues but we did encounter some differences compared to simulation-based results, which is to be expected when moving from virtual to live,” Hefron said in a DARPA press release.
Andrew Metrick, a fellow in the defense program at the Center for a New American Security, says that he is “often skeptical about the application of AI in the military field,” and that his skepticism centers on how practically useful these systems will actually be. But in this case, with an artificial intelligence algorithm in the cockpit of a fighter jet, he says he is more of a believer. “This is one of those areas where I think there’s really a lot of promise for AI systems,” he says.
“The December flights are quite a step,” he says. “Integrating these things into a piece of flight hardware is not trivial. It’s one thing to do it in a synthetic environment; it’s another thing to do it on real hardware.”
Not all of the flights were part of the DARPA program. All told, the Defense Department says a dozen sorties took place, some run by DARPA and others by a program out of the Air Force Research Laboratory (AFRL). The DOD notes that the DARPA tests focused more on close-in air combat, while the AFRL tests pitted the AI against a “simulated adversary” in a “beyond visual range” scenario. In other words, the two programs examined how the AI performed in different types of aerial engagements.
Breaking Defense reported earlier this year that the flights began on December 9. The jet that the AI flew is based on an F-16D and is called VISTA; it has room for two. “The front seat pilot performed the test points,” Hefron explained via email, “while the back seat acted as a safety pilot who maintained broader situational awareness to ensure the safety of the aircraft and crew.”
One of the algorithms that flew the jet came from a company called Shield AI. Heron Systems, which created the top-performing AI agent in the 2020 AlphaDogfight Trials, was acquired by Shield AI in 2021. Shield AI CEO Ryan Tseng is optimistic about the promise of AI outperforming humans in the cockpit. “I don’t believe there is an air combat mission where AI pilots shouldn’t be decidedly better than their human counterparts, for a large part of the mission profile,” he says. That said, he notes, “I believe the best teams will be a mix of AI and humans.”
In one such human-AI teaming future, AI-powered drones like the Ghost Bat could work with manned aircraft like the F-35, for example.
It is still early days for the technology. Metrick, of the Center for a New American Security, is thinking about how the AI agent would handle a situation where the jet does not respond as expected, for example if the aircraft stalls or experiences some other kind of glitch. “Can the AI recover from that?” he wonders. A human might be able to handle such an “edge case” more easily than software.