We treat Black Friday as a ritual of consumer appetite. In late November 2023 that appetite has a darker counterpart: states and firms buying not televisions or shoes but systems that can sense, decide, and kill at a distance. The recent cadence of events makes clear that what looked like discrete technical advances over the last decade has become a systemic acceleration of military automation, with profound moral and strategic consequences.
Consider a single, stark datum. On November 25, 2023 Russia launched a massive barrage of Iranian-designed Shahed loitering munitions against Ukraine, a strike described by multiple outlets as the largest drone attack on Kyiv since the full-scale invasion began; Ukrainian air defenses reported shooting down the vast majority of the incoming drones. That episode is not an isolated spectacle. The conflict in Ukraine has functioned as a laboratory where low-cost, semi-autonomous aerial systems and improvised FPV attack drones have been produced, modified, and fielded at a tempo that would have been unthinkable in peacetime. The result is a multiplication of capabilities at marginal cost and a lowering of the threshold for lethal force.
Those battlefield realities are colliding with shifts in policy and procurement. The U.S. Department of Defense updated its guidance on autonomy in weapons systems in January 2023, reissuing DoD Directive 3000.09 to reflect technological change while insisting on requirements for human judgment, testing, and ethical alignment in weapon systems with autonomous functions. At the same time, legislators and acquisition officials are funding rapid buys of counter-drone and autonomous systems. For example, 2023 U.S. Special Operations procurement documents record a Counter Unmanned Systems acceleration item that lists Anduril Industries among vendors receiving fiscal year 2023 funding. The Pentagon is therefore moving on two tracks: tightening internal doctrine about autonomy while buying and deploying more autonomous and automated capabilities.
Commercial winners and start-ups are key intermediaries in this dynamic. Silicon Valley style rapid iteration, combined with defense budgets that prize speed over slow bureaucratic acquisition, produces both ready capability and brittle assumptions about reliability, resilience, and ethics. This has a simple arithmetic: cheaper sensors, commoditized software stacks, and the global availability of airframes mean that lethality can be scaled quickly. That scaling invites competition among state and nonstate actors who seek asymmetric advantages and plausible deniability.
Technology alone does not determine outcomes. Geopolitics and economics shape how these tools are produced and used. The Biden administration tightened export controls on advanced AI chips in October 2023 to limit the flow of high-end semiconductors that might accelerate adversary capabilities. Those controls aimed to close loopholes that allowed modified chip variants to be exported to restricted destinations. The policy is itself evidence that governments fear unchecked diffusion of the compute that powers autonomy.
Meanwhile, diplomats and civil society are warning of a wider arms race in autonomy. At the United Nations First Committee in November 2023, a broadly supported draft resolution called for further study and input on lethal autonomous weapon systems, explicitly noting concerns about an emerging arms race and the humanitarian and legal challenges these systems pose. The contrast between technological momentum and the slow churn of international norm development is striking. States worry about losing operational advantage if they accept strict limits; campaigners worry that without limits the technology will proliferate to actors who will not apply legal or ethical constraints.
This moment produces a set of practical anxieties and moral questions. First, cost asymmetries favor proliferation. A low-cost loitering munition or an FPV attack drone can be assembled or bought for a small fraction of the cost of the systems it threatens. The economics push militaries and vendors of defense tech toward volume. Second, autonomy shifts accountability. If a sensor, an algorithm, and a communications link together decide to engage a target, who is responsible when things go wrong? The updated DoD directive tries to answer that at the departmental level by requiring human judgment and testing, but doctrine does not neatly translate into practice in the fog of combat. Third, dual-use technology complicates governance. The software libraries, compute platforms, and even sensors that enable autonomy are broadly useful in civilian industries, which makes blunt export bans hard to sustain and complicates any attempt at wholesale prohibition.
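The cost asymmetry above can be made concrete with a back-of-the-envelope calculation. The figures below are hypothetical round numbers chosen only to illustrate the shape of the exchange, not actual procurement prices for any specific munition or interceptor:

```python
# Illustrative cost-exchange arithmetic for a saturation drone attack.
# All unit costs are assumed round numbers for illustration only.
attack_drone_cost = 50_000    # assumed unit cost of a loitering munition, USD
interceptor_cost = 1_000_000  # assumed unit cost of one air-defense interceptor, USD
drones_launched = 75          # size of a hypothetical barrage
intercept_rate = 0.9          # assumed share of drones engaged and shot down

# Attacker pays for every drone launched; defender pays (at least)
# one interceptor per drone it engages.
attacker_spend = drones_launched * attack_drone_cost
defender_spend = drones_launched * intercept_rate * interceptor_cost

print(f"Attacker spends  ${attacker_spend:,}")
print(f"Defender spends  ${defender_spend:,.0f}")
print(f"Cost-exchange ratio: {defender_spend / attacker_spend:.0f}:1")
```

Even with a 90 percent intercept rate, the defender in this sketch spends an order of magnitude more than the attacker, which is the arithmetic that pushes toward volume production on the offensive side and toward cheaper interceptors and electronic warfare on the defensive side.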
There are three immediate policy imperatives. One, transparency and reporting. Governments and vendors should report, through classified channels and in unclassified public summaries, the safety testing, failure modes, and field performance of deployed autonomous systems. Without empirically grounded transparency we will operate under dangerous illusions about reliability. Two, robust, interoperable defensive systems. As offensive autonomy proliferates, defending populations and forces requires investment in detection, resilient communications, and electronic warfare that can interpose human judgment where algorithms would otherwise act alone. Three, international political work. The UN resolution in November 2023 is a start, not an end. States should pursue a pragmatic agenda that narrows the worst risks: agreed norms about meaningful human control, export practices calibrated to reduce battlefield escalation, and cooperative monitoring to limit the speed of destabilizing deployments.
There is an ugly phrase that has reappeared in policy circles: the speed of war is accelerating. That speed is not merely about faster weapons. It is about shorter decision loops and a diffusion of agency away from human beings into networks and devices whose behavior is shaped by data, cost pressures, and the incentives of firms and states. If November 2023 teaches us anything, it is that the robotics arms race is not a remote scientific problem. It is now an operational, political, and ethical emergency. The temptation to treat these systems as a matter of slide-rule economics or technical optimization must be resisted.
The central paradox is this. Robotic systems can reduce human risk by performing dull, dirty, and dangerous tasks. At the same time those same technologies make lethal force cheaper, more scalable, and more anonymous. The challenge for states, scholars, and citizens is to design institutions that preserve the life-saving potentials of autonomy while constraining its capacity to make war easier to initiate and harder to control. This will not be solved by one treaty or one procurement decision. It requires sustained attention to doctrine, economics, and ethical design principles, and a public conversation about what kinds of violence we will and will not automate.
If you are tempted to treat the last weekend of November as only a shopping season, remember that elsewhere the price tags are measured in policy choices, lines of code, and lives. The bargains we strike now will set the terms of risk for decades.