The war in Ukraine has, over the past three years, transmuted into an arena where unmanned systems are no longer auxiliary but central. What began as improvised quadcopters for reconnaissance and jury-rigged strike missions has become an industrialized, doctrinally embedded form of combat. The logic is simple and brutal: attritable machines substitute for scarce human lives, and mass production substitutes for precision. This is not merely a technological shift; it is a moral and strategic reorientation of war-fighting itself.
One unmistakable trend in 2025 is the scale and formalization of mass-produced strike drones, especially FPV kamikaze types. Kyiv's procurement plan for 2025 set a target of roughly 4.5 million FPV units, a dramatic escalation from previous years and an explicit acknowledgement that quantity, low cost, and ubiquity are now battlefield virtues. That target codifies a wider industrial pivot: Ukraine is localizing and scaling the production lines that feed the front, and it expects to rely on enormous numbers of cheap, human-piloted or semi-autonomous loitering munitions.
That industrialization is matched by a parallel push toward partial autonomy. Ukrainian developers and units are increasingly fielding AI-assisted navigation, automatic target recognition, and mission modules that reduce operator workload and allow drones to function despite electronic warfare and degraded communications. These are not fully autonomous killer robots making life-and-death decisions. Rather, the trend is modular autonomy: discrete software components that raise mission success rates, shorten operator training times, and make unmanned systems more resilient to jamming and signal loss. CSIS's field work documents this incremental approach and highlights how autonomy is being adopted as a force multiplier under tight human control.
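To make "modular autonomy" concrete, here is a minimal sketch of how such a pipeline might be organized in software. The module names, confidence threshold, and control flow are hypothetical, not a description of any fielded system; the point is the architecture, in which autonomy proposes and a human disposes:

```python
# A minimal, hypothetical sketch of "modular autonomy" with a human-in-the-loop
# engagement gate. Nothing here describes a real fielded system; module names,
# thresholds, and the pipeline shape are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Track:
    label: str         # class proposed by an onboard recognizer, e.g. "armor"
    confidence: float  # recognizer confidence in [0, 1]

def autonomy_proposes(track: Track, min_confidence: float = 0.9) -> bool:
    """Target-recognition module: may *propose* an engagement, never order one."""
    return track.confidence >= min_confidence

def engage(track: Track, operator_confirmed: bool) -> str:
    """Engagement gate: fires only when both the autonomy module and the
    human operator agree; otherwise the munition holds or loiters."""
    if autonomy_proposes(track) and operator_confirmed:
        return "engage"   # terminal guidance would take over from here
    return "hold"         # degrade gracefully under jamming or doubt

# Usage: a high-confidence track still requires explicit human confirmation.
print(engage(Track("armor", 0.95), operator_confirmed=False))  # -> hold
print(engage(Track("armor", 0.95), operator_confirmed=True))   # -> engage
```

The design choice worth noticing is that the confirmation gate sits between recognition and engagement, so a jammed command link degrades to "hold" rather than to autonomous attack.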
On the other side of the ledger is Russia's intensifying campaign of long-range, massed strike drones. In late May 2025 Moscow launched one of the largest combined missile-and-drone barrages of the war, a salvo that Ukrainian authorities described as among the most massive to date and that brought home the hard reality of saturation attacks. These operations exploit economies of scale, cheaper munitions, and decoy drones flown as electronic clutter. The operational aim is simple: overwhelm layered defenses and force defenders to exhaust costly interceptors on cheap targets.
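A back-of-envelope calculation makes the saturation economics vivid. The unit costs and salvo mix below are illustrative assumptions, not reported figures, but the asymmetry they produce is the attacker's entire theory of victory:

```python
# Back-of-envelope cost-exchange arithmetic for a saturation raid. All unit
# costs and salvo sizes are illustrative assumptions, not reported figures;
# the point is the asymmetry, not the exact numbers.
strike_drone_cost = 50_000       # assumed cost of one long-range strike drone ($)
decoy_cost        = 10_000       # assumed cost of one decoy airframe ($)
interceptor_cost  = 1_000_000    # assumed cost of one air-defense missile ($)

salvo = {"strike": 100, "decoy": 200}   # hypothetical mixed salvo

attacker_spend = (salvo["strike"] * strike_drone_cost
                  + salvo["decoy"] * decoy_cost)
# If the defender cannot discriminate decoys, every track may draw a shot:
defender_spend = (salvo["strike"] + salvo["decoy"]) * interceptor_cost

print(f"attacker: ${attacker_spend:,}")   # attacker: $7,000,000
print(f"defender: ${defender_spend:,}")   # defender: $300,000,000
print(f"exchange ratio: {defender_spend / attacker_spend:.0f}:1")  # ~43:1
```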
The measures-countermeasures cycle is accelerating. Russian modifications of Iranian-origin Shahed designs to increase altitude, jamming resistance, and payload have made those systems harder to defeat with short-range, gun-based mobile fire teams alone. Ukraine's response has been layered: expanded electronic warfare arrays, integration of multisensor ISR and data fusion, and a nascent industry of interceptor drones and low-cost kinetic defenses intended to handle massed kamikazes without exhausting legacy missile inventories. The war is therefore settling into two parallel manufacturing races: one for massed strike drones, one for massed affordable interceptors.
Operationally the consequences are profound. Commanders can now shape effects deep into rear areas; logistics hubs, airbases, and even strategic bomber shelters are no longer sanctuaries but targets. Ukraine has demonstrated the offensive potential of this approach with long-range raids and coordinated multi-domain operations that blend UGVs, FPV swarms, and supporting fires in localized, autonomy-heavy assaults. That mix blurs traditional definitions of front and rear, and it complicates the moral calculus of targeting, because in a long-range drone campaign the boundary between combatant infrastructure and civilian-adjacent systems grows porous.
Ethically the moment is uncomfortable. The widespread use of low-cost, human-flown FPV systems and semi-autonomous modules increases the number of actors capable of lethal violence and disperses decision-making. At scale, the proportionality and distinction principles of international humanitarian law are harder to guarantee when attritable systems are launched in the hundreds or thousands and when AI assists target selection. The danger is not only machine error but moral diffusion: responsibility becomes dispersed across operators, contractors, software maintainers, and political authorities. The technology amplifies existing dilemmas about accountability rather than resolving them. (This is not a techno-utopian problem. It is a governance problem.)
Strategically there are sober lessons for Western planners. First, air defense inventories optimized for high-value missiles cannot be the sole bulwark against attritional drone saturation. Second, cheap, rapidly produced solutions matter; Ukraine’s pivot to domestic production and modular autonomy is not an artifact of desperation but a rational adaptation to a new cost calculus. Third, proliferating autonomy and AI features on the battlefield will have downstream effects on doctrine, training, and legal frameworks that allied nations need to anticipate now rather than debate later.
What should observers watch for in the coming months? The tempo will depend on production rates and logistics more than on exotic breakthroughs: how many interceptors can be fielded per week, how resilient supply chains for microelectronics prove to be under embargo pressure, and how quickly software countermeasures to jamming are disseminated across platforms. Expect continued incremental AI integration, ever-larger Russian massed drone salvos aimed at attrition and terror, and an expanding Ukrainian industrial ecosystem that prizes volume, repairability, and modular autonomy.
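The production-versus-expenditure point reduces to a toy stock-and-flow model. Every figure in the sketch below is an assumption chosen for illustration; what it shows is why weekly output, not stockpile size, sets the clock:

```python
# A toy stock-and-flow model of the interceptor race: weekly production versus
# expenditure against recurring salvos. Every number is an illustrative
# assumption; the takeaway is that tempo is set by production, not stockpiles.
def weeks_until_exhaustion(stock: int, produced_per_week: int,
                           salvos_per_week: int, expended_per_salvo: int) -> float:
    """Return weeks until the interceptor stock hits zero (inf if sustainable)."""
    burn = salvos_per_week * expended_per_salvo - produced_per_week
    if burn <= 0:
        return float("inf")   # production keeps pace: the defense is sustainable
    return stock / burn

# Hypothetical: 2,000 interceptors on hand, 300 built per week,
# two 250-drone salvos per week, ~1 interceptor expended per drone.
print(weeks_until_exhaustion(2_000, 300, 2, 250))  # -> 10.0 weeks
```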
Finally, a cautionary note. The technological fixes we celebrate are double-edged. Automation and drones reduce individual soldiers' exposure yet shift risk upward and outward, from the person who pulls a trigger to the system designers and supply managers who keep the production lines moving. If we are to keep war accountable, the conversation about drones must be technical and ethical at once. We must measure success not only in targets struck but in whether institutions and laws have kept pace with the machines they unleash.