
Will AI Replace Fighter Pilots? DARPA ACE, CCA & the Autonomous Air Combat Race

Guide · 2026-03-21 · 14 min read
TL;DR

DARPA's ACE program proved in 2023–2024 that AI can match human fighter pilots in dogfights. The U.S. Air Force is now acquiring 1,000+ Collaborative Combat Aircraft — $25 million AI wingmen that fly alongside manned jets. Meanwhile, Iran fields thousands of cheap autonomous Shahed drones, creating a two-track autonomous air combat revolution that is reshaping the current conflict.

Definition

AI-piloted combat aircraft are unmanned or optionally-manned military planes controlled by artificial intelligence rather than human pilots in the cockpit. The concept spans a spectrum from remotely-piloted drones like the MQ-9 Reaper to fully autonomous combat jets capable of executing air-to-air and air-to-ground missions without real-time human input. DARPA's Air Combat Evolution (ACE) program demonstrated in 2024 that AI agents can dogfight against human pilots in modified F-16s, while the U.S. Air Force's Collaborative Combat Aircraft (CCA) initiative aims to field fleets of AI-driven wingmen alongside manned sixth-generation fighters like the F-47. The core question is not whether AI can fly combat missions — it already can — but when autonomous systems will be trusted to employ lethal force independently in contested airspace.

Why It Matters

The Iran-Coalition conflict has become the first major theater where autonomous and semi-autonomous air systems are shaping operational outcomes at scale. Iran's extensive integrated air defense network — built around S-300PMU2, Bavar-373, and dozens of mobile SAM systems — creates a high-threat environment where every manned sortie risks pilot losses. AI-piloted combat aircraft could absorb that risk entirely. The U.S. plans to acquire over 1,000 CCAs at roughly $25 million each, compared to $80–100 million per F-35. For a conflict burning through precision munitions at unprecedented rates, autonomous wingmen that can suppress air defenses, conduct ISR, and deliver standoff weapons without risking aircrew represent a potential game-changer. The race between AI-enabled offense and AI-enhanced defense will define whether air superiority in future Iran-type conflicts is achievable or prohibitively expensive.

How It Works

AI air combat systems operate through three integrated layers: perception, decision-making, and execution. The perception layer fuses data from onboard sensors — radar, infrared search-and-track, electronic warfare suites — with off-board feeds from AWACS, satellites, and networked wingmen to build a real-time tactical picture. Modern systems process this data using neural networks trained on millions of simulated engagements.

The decision layer is where AI diverges most from human pilots. DARPA's ACE program used reinforcement learning — the same technique behind AlphaGo — to train AI agents that discovered novel combat maneuvers human pilots had never attempted. In the X-62A VISTA program, AI-controlled F-16s executed within-visual-range dogfights against human pilots, demonstrating reaction times measured in milliseconds versus the 200–300 millisecond human baseline. The execution layer translates decisions into aircraft control inputs: throttle, control surfaces, weapons release.

The CCA concept pairs these autonomous platforms with manned fighters in a 'loyal wingman' configuration. A single F-35 pilot might command 2–4 CCAs, each carrying a different payload — one for electronic attack, another for air-to-air missiles, a third for strike munitions. The pilot sets mission objectives and rules of engagement; the AI handles tactics and flight dynamics.

Critical to the architecture is the kill chain authority model. Current U.S. doctrine requires a 'human in the loop' for lethal engagement decisions. However, the speed of modern air combat — where hypersonic missiles close at Mach 5+ and engagement windows shrink to seconds — is pushing toward 'human on the loop' models in which the AI acts and humans monitor with override authority.
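The perceive-decide-execute flow and the kill-chain gate can be sketched in a few lines. This is a minimal illustrative Python model, not real flight software: the class names, the 0.8 threat threshold, and the action strings are all assumptions made for the example.

```python
from dataclasses import dataclass
from enum import Enum

class Authority(Enum):
    HUMAN_IN_THE_LOOP = 1   # a human approves every weapons release
    HUMAN_ON_THE_LOOP = 2   # the AI acts; a human monitors with veto power

@dataclass
class Track:
    track_id: str
    threat_score: float     # fused confidence the contact is hostile, 0..1
    firing_solution: bool   # weapons-quality solution available?

def perceive(onboard, offboard):
    """Perception layer: fuse onboard and off-board tracks, keeping the
    highest threat score reported for each track ID."""
    fused = {}
    for t in onboard + offboard:
        best = fused.get(t.track_id)
        if best is None or t.threat_score > best.threat_score:
            fused[t.track_id] = t
    return sorted(fused.values(), key=lambda t: t.threat_score, reverse=True)

def decide(tracks, authority, human_veto=False):
    """Decision layer: choose an action; lethal release stays gated by
    the kill-chain authority model."""
    if not tracks:
        return "patrol"
    top = tracks[0]
    if top.threat_score >= 0.8 and top.firing_solution:
        if authority is Authority.HUMAN_IN_THE_LOOP:
            return "request_release"   # hold fire until a human approves
        return "abort" if human_veto else "engage"
    return "maneuver"                  # work for a better solution
```

Under human-in-the-loop the same fused picture yields `request_release` where human-on-the-loop would yield `engage`; the veto flag is the override authority the doctrine debate is about.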

DARPA ACE — The Program That Proved AI Can Dogfight

DARPA's Air Combat Evolution program, launched in 2019, set out to answer a fundamental question: can artificial intelligence match human fighter pilots in within-visual-range aerial combat? By December 2023, the answer was decisively yes. The program's AI agents, developed by teams including Shield AI, EpiSci, and Lockheed Martin's Skunk Works, progressed from simulation to live flight testing aboard the X-62A VISTA — a modified F-16 at Edwards Air Force Base.

The VISTA flights marked the first time AI controlled a tactical aircraft in air combat maneuvers against a human adversary. The AI demonstrated aggressive maneuvering within 2,000 feet of the opposing aircraft at combined closure rates exceeding 1,200 mph. Critically, the system operated under safety constraints — including altitude floors and g-limits — that it never violated, addressing the chief concern about autonomous combat: predictability under pressure.

ACE's reinforcement learning approach trained agents through over 20 billion simulated flight hours — more experience than every human fighter pilot in history combined. The AI discovered energy management strategies and angle-of-attack transitions that surprised experienced test pilots. DARPA leadership called the results a transformative moment in aerospace, noting that AI agents achieved human-level performance in basic fighter maneuvers while operating within strict safety boundaries. The program's success directly accelerated CCA development timelines and informed the Air Force's decision to pursue over 1,000 autonomous wingmen.

Collaborative Combat Aircraft — The $25 Million AI Wingman

The CCA program represents the Air Force's plan to field autonomous combat jets at scale. In April 2024, the service selected Anduril Industries' Fury platform and General Atomics' design for Increment 1, with initial operational capability targeted for 2028–2029. The Air Force plans to procure 1,000–2,000 CCAs at a target unit cost of $25 million — one-third the price of an F-35A Lightning II. CCAs are designed as 'attritable' platforms — valuable enough to equip with advanced sensors and weapons, but affordable enough that losing them in combat is operationally acceptable.

Each CCA will carry modular mission payloads: air-to-air missiles for defensive counter-air, precision strike munitions for SEAD/DEAD missions, electronic warfare suites for jamming, or ISR packages for deep penetration reconnaissance. The modular approach means a single airframe design can fulfill multiple roles depending on mission requirements.

The operational concept pairs 2–4 CCAs with each manned fighter. An F-35 pilot sets objectives — suppress a SAM site or establish a combat air patrol — and the AI autonomously plans routes, manages fuel, selects weapons, and coordinates with other CCAs. The human retains authority over weapons release, but the AI handles everything up to that decision point. This force multiplication means a squadron of 12 manned fighters could project the combat power of 48–60 aircraft, fundamentally changing the calculus of attrition warfare against integrated air defenses like Iran's.
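The force-multiplication and cost arithmetic above is easy to verify. A minimal sketch, using the article's figures (a $25M CCA unit price, 2–4 wingmen per manned jet):

```python
def formation_airframes(manned_fighters: int, ccas_per_fighter: int) -> int:
    """Total airframes a squadron puts in the air: each manned jet
    plus its attached autonomous wingmen."""
    return manned_fighters * (1 + ccas_per_fighter)

def fleet_cost_usd_m(n_cca: int, cca_unit_m: float = 25.0) -> float:
    """Fleet cost in millions of dollars at the target CCA unit price."""
    return n_cca * cca_unit_m

# 12 manned fighters with 3-4 CCAs each put 48-60 airframes in the air,
# and a 1,000-ship CCA fleet costs $25B, i.e. roughly 250-300 F-35s
# at $80-100M apiece.
squadron_low = formation_airframes(12, 3)    # 48
squadron_high = formation_airframes(12, 4)   # 60
fleet = fleet_cost_usd_m(1000)               # 25000.0 ($25B)
```

The 48–60 figure quoted in the text corresponds to three to four CCAs per manned jet, counting the manned jet itself.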

Iran's AI and Drone Counter-Strategy

Iran has pursued its own autonomous combat capabilities through a fundamentally different strategic lens. Rather than high-end AI wingmen, Tehran has invested heavily in mass-produced autonomous drones and loitering munitions. The Shahed-136/238 family represents Iran's approach to autonomous warfare: GPS/INS-guided one-way attack drones that require no pilot or operator once launched. Iran has produced an estimated 5,000–8,000 Shahed variants since 2022, supplying Russia, Hezbollah, the Houthis, and Iraqi militia proxies.

Iran's defense industry is also developing AI-enhanced air defense. The Bavar-373 system reportedly incorporates machine learning algorithms for target classification and threat prioritization, while the Arash-2 drone has demonstrated autonomous target recognition capabilities in exercises.

Iran's strategy leverages quantity over quality: rather than building a few exquisite autonomous fighters, Tehran fields thousands of cheap autonomous munitions that can overwhelm and saturate defenses. This asymmetric approach directly challenges CCA-era thinking. A $25 million CCA designed to accompany F-35s into contested airspace faces a fundamentally different problem when the adversary fields $20,000–50,000 one-way attack drones in swarms of 50–100. The cost-exchange ratio inverts: instead of cheap drones threatening expensive manned aircraft, expensive autonomous platforms must contend with vast numbers of disposable ones. Iran's Houthi proxies demonstrated this dynamic in the Red Sea, launching over 200 drone and missile attacks against naval vessels costing $2–3 billion each.
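The cost-exchange inversion is a one-line calculation. A quick sketch using the article's figures (the drone and asset prices are the ranges quoted above, nothing more):

```python
def cost_exchange_ratio(attacker_unit_usd: float, wave_size: int,
                        defender_asset_usd: float) -> float:
    """Dollars the attacker spends per dollar of defender asset at risk.
    A ratio below 1.0 favors the attacker even if every munition in the
    wave is shot down."""
    return (attacker_unit_usd * wave_size) / defender_asset_usd

# A swarm of 100 drones at $50k each is a $5M wave.
vs_cca = cost_exchange_ratio(50_000, 100, 25_000_000)        # 0.2 vs a $25M CCA
vs_ship = cost_exchange_ratio(50_000, 100, 2_500_000_000)    # 0.002 vs a $2.5B warship
```

Even against the cheap, attritable CCA the attacker spends only 20 cents per dollar at risk; against a capital ship the ratio collapses to a fraction of a percent, which is the saturation logic the Red Sea attacks exploited.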

The Autonomy Trust Gap — When Can AI Pull the Trigger?

The most contentious issue in autonomous air combat is not technology but authority. Current U.S. Department of Defense Directive 3000.09 requires 'appropriate levels of human judgment' over lethal force decisions. In practice, this means a human must authorize every weapons release — even from an autonomous platform.

The challenge is that modern air combat increasingly occurs at speeds exceeding human decision-making capacity. Consider a representative engagement timeline: an Iranian Bavar-373 battery detects an incoming CCA, launches a Sayyad-4 interceptor traveling at Mach 4.5, and the CCA has approximately 8–12 seconds to detect the launch, evaluate countermeasures, decide whether to evade or continue the mission, and execute. A human on a satellite radio link adds 1–3 seconds of latency. In high-density threat environments with multiple simultaneous engagements, that delay is potentially fatal.

The military is moving toward tiered autonomy: Level 1 (human in the loop — human approves every action), Level 2 (human on the loop — AI acts, human monitors with veto authority), and Level 3 (human out of the loop — AI executes within pre-approved parameters). Current CCA development targets Level 2 for most offensive missions, with Level 3 reserved for defensive reactions like countermeasure deployment and evasive maneuvering. The 2024 Political Declaration on Responsible Military Use of AI, endorsed by over 50 nations, establishes norms around human control but lacks enforcement mechanisms, leaving the autonomy boundary essentially self-regulated by each state.
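The latency arithmetic behind that engagement timeline can be checked directly. In the sketch below, the 15 km detection range is an illustrative assumption (the article gives only the resulting 8–12 second window), and the sea-level speed of sound is used even though it varies with altitude:

```python
def time_to_impact_s(range_m: float, mach: float,
                     sound_mps: float = 343.0) -> float:
    """Seconds for a missile at a given Mach number to close a given
    range. Assumes sea-level speed of sound (~343 m/s)."""
    return range_m / (mach * sound_mps)

def onboard_budget_s(window_s: float, link_latency_s: float) -> float:
    """Decision time left to the platform after a human-in-the-loop
    authorization round trip over a satellite link."""
    return window_s - link_latency_s

# A Mach 4.5 interceptor detected at 15 km arrives in roughly 9.7 s,
# inside the article's 8-12 s window; a 3 s link delay consumes
# nearly a third of it before any countermeasure can be flown.
window = time_to_impact_s(15_000, 4.5)
budget = onboard_budget_s(window, 3.0)
```

This is why Level 3 autonomy is being reserved first for defensive reactions: the human round trip is a fixed tax on a window that shrinks with interceptor speed.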

Timeline — When AI Fighter Pilots Become Operational Reality

The path from DARPA demonstration to operational deployment follows a compressed but predictable timeline. CCA Increment 1 (Anduril Fury and General Atomics) is scheduled for first flight in 2025–2026, with limited initial operational capability by 2028–2029. These first-generation CCAs will operate primarily in permissive environments — conducting ISR, electronic warfare, and standoff strike while manned fighters handle the most contested missions. Increment 2, expected to enter development around 2027, will add air-to-air combat capability and deeper integration with the F-47 sixth-generation fighter. By 2030–2032, the Air Force projects fielding mixed formations where CCAs routinely fly combat missions in denied airspace. Full autonomous air-to-air engagement — AI versus AI — is projected for the 2032–2035 timeframe, though this timeline could accelerate significantly if adversary capabilities force adaptation.

The Iran conflict is accelerating these timelines. The consumption of over 1,200 precision munitions in the first two weeks of Coalition strikes against Iranian air defenses demonstrated that attrition rates in near-peer conflict exceed Cold War projections. Each manned sortie into Iranian integrated air defense zones risks an $80–100 million aircraft and an irreplaceable trained pilot. CCAs operating at one-third the cost with zero pilot risk fundamentally change the attrition calculus. Senior Air Force leadership has stated that the Iran conflict validates every assumption underlying the CCA program and has requested accelerated production funding, potentially compressing the Increment 2 timeline by 12–18 months.

In This Conflict

The Iran-Coalition conflict has become an unexpected proving ground for autonomous air combat concepts. Coalition SEAD/DEAD operations against Iran's integrated air defense network — comprising S-300PMU2, Bavar-373, 3rd Khordad, and dozens of Tor-M1 point defense systems — have demonstrated both the lethality of modern air defenses and the limitations of exclusively manned airpower. In the first three weeks of operations, the Coalition flew an estimated 3,400+ strike sorties, losing 4 aircraft to Iranian SAMs and damaging 11 more. While these losses are militarily sustainable, each represents a pilot killed, captured, or requiring combat search and rescue — operations that themselves consume significant resources and risk additional losses. CCA-type platforms would eliminate the personnel risk entirely while providing additional sensor coverage and weapons delivery capacity.

Iran's strategy of dispersed mobile launchers — particularly the 3rd Khordad and Bavar-373 on transporter-erector-launchers — creates a persistent hunt-and-kill problem ideally suited to autonomous platforms. CCAs could loiter over suspected launch areas for hours, fusing onboard sensors with satellite data to detect launcher movement, then prosecute targets within seconds of detection. The reaction time advantage of AI — milliseconds versus human decision cycles of minutes — directly addresses Iran's shoot-and-scoot mobile SAM tactics.

Meanwhile, Iran's own autonomous capabilities, particularly Shahed-136/238 attacks on Coalition bases and Gulf infrastructure, have demonstrated that the autonomous air combat revolution is already underway — just not in the form Western planners originally envisioned.

Historical Context

Autonomous air combat has roots stretching back to the Vietnam-era QF-86 and BQM-34 Firebee drones used as decoys against North Vietnamese SAMs. Israel pioneered operational drone tactics in the 1982 Bekaa Valley campaign, using Samson decoy drones to trigger Syrian SAM radars before manned fighters destroyed them — essentially the CCA concept in primitive form. The 2020 Nagorno-Karabakh war demonstrated AI-assisted targeting when Azerbaijani Bayraktar TB2 drones, using machine vision to identify Armenian armored vehicles, devastated forces lacking modern air defense. Russia's extensive use of Iranian Shahed-136 drones in Ukraine from 2022–2024 proved that autonomous one-way attack platforms could achieve strategic effects against a modern military. Each conflict has accelerated autonomous combat development, with the current Iran-Coalition war representing the most sophisticated convergence of AI-enabled offensive and defensive systems to date.

Key Numbers

1,000–2,000
Planned CCA procurement quantity for the U.S. Air Force, representing the largest autonomous combat aircraft program in history
$25 million
Target unit cost per CCA — roughly one-third the $80–100M price of an F-35A, making combat losses economically sustainable
20 billion+
Simulated flight hours used to train DARPA ACE AI agents — more than every human fighter pilot in history combined
8–12 seconds
Typical engagement window against a Mach 4.5 Sayyad-4 SAM, leaving insufficient time for human-in-the-loop weapons decisions
5,000–8,000
Estimated Shahed-family autonomous drones produced by Iran since 2022, representing the mass-production approach to autonomous warfare
2028–2029
Projected initial operational capability for CCA Increment 1, with full autonomous air-to-air combat expected by 2032–2035

Key Takeaways

  1. AI has already proven it can dogfight at human-pilot level — the X-62A VISTA program settled the core capability question in 2023–2024, and the debate has shifted from 'can it work' to 'when do we trust it'
  2. CCAs at $25M each will multiply combat airpower 4–5x while eliminating pilot casualties, representing the most significant shift in air warfare economics since precision-guided munitions
  3. Iran's mass-produced autonomous Shahed drones represent the opposite end of the AI spectrum — cheap, disposable, and already combat-proven across four theaters of war
  4. The real bottleneck is not AI capability but the autonomy trust gap — DoD lethal autonomy policies and international norms will determine when AI wingmen actually employ weapons independently
  5. The Iran conflict is compressing CCA timelines by 12–18 months by proving that manned-only airpower faces unsustainable attrition and munition consumption rates against modern integrated air defenses

Frequently Asked Questions

Can AI actually beat human fighter pilots in a dogfight?

Yes. DARPA's ACE program demonstrated in 2023–2024 that AI agents can match and exceed human pilots in within-visual-range aerial combat. AI-controlled X-62A VISTA aircraft flew aggressive dogfight maneuvers against human opponents at Edwards Air Force Base. The AI's advantages include reaction times measured in milliseconds (versus 200–300ms for humans) and the ability to train on over 20 billion simulated flight hours, discovering energy management tactics and maneuvers that surprised experienced test pilots.

What is a Collaborative Combat Aircraft (CCA)?

A CCA is an AI-piloted unmanned combat jet designed to fly alongside manned fighters as an autonomous wingman. The U.S. Air Force selected Anduril Industries and General Atomics to build CCAs at roughly $25 million each — one-third the cost of an F-35. Each manned fighter would command 2–4 CCAs carrying modular payloads for air-to-air, strike, electronic warfare, or ISR missions, effectively multiplying a squadron's combat power by 4–5 times.

When will AI replace human fighter pilots?

AI will not fully replace fighter pilots in the near term — it will augment them. CCA Increment 1 targets operational capability by 2028–2029, initially handling ISR, electronic warfare, and standoff strike in permissive environments. Full autonomous air-to-air combat capability is projected for 2032–2035. Human pilots will likely transition from flying combat sorties themselves to serving as mission commanders overseeing multiple AI wingmen from manned platforms.

Does Iran have AI-powered combat drones?

Iran has developed semi-autonomous platforms rather than fully AI-powered combat drones. The Shahed-136/238 family uses GPS and inertial guidance to fly autonomously to pre-programmed targets but lacks adaptive artificial intelligence. Iran's Bavar-373 air defense system reportedly uses machine learning for target classification. Tehran's strategy emphasizes producing thousands of cheap autonomous munitions at $20,000–50,000 each rather than investing in sophisticated AI combat platforms, creating an effective asymmetric counter to Western high-end autonomous systems.

How much does a CCA cost compared to an F-35?

The target unit cost for a CCA is approximately $25 million, compared to $80–100 million for an F-35A Lightning II. This 3–4x cost advantage is central to the CCA concept: the platforms are 'attritable,' meaning they are valuable enough to equip with advanced sensors and weapons but affordable enough that combat losses don't cripple force structure. A fleet of 1,000 CCAs would cost roughly the same as 250–300 F-35s while providing far greater tactical flexibility and zero pilot risk.

Sources

  1. DARPA, "Air Combat Evolution (ACE) Program Overview and X-62A VISTA Flight Test Results" (official)
  2. U.S. Air Force, "Department of the Air Force FY2027 Budget Justification: Collaborative Combat Aircraft Program" (official)
  3. Aviation Week & Space Technology, "The AI Dogfight: Inside DARPA's Quest to Build Autonomous Fighter Pilots" (journalistic)
  4. International Committee of the Red Cross, "Autonomous Weapons Systems Under International Humanitarian Law: Legal and Ethical Implications" (academic)
