On Memorial Day we gather to remember the human cost of war. That human cost remains primary and absolute. Yet as the character of conflict changes, so too do the tools at the point of sacrifice. Remotely operated ground vehicles, unmanned aerial systems, and semi-autonomous maritime craft now stand between explosive hazards and the soldiers, sailors, and airmen they protect. Their presence complicates how we tell stories about loss and protection. It also forces us to ask a deeper question: what does it mean to speak of “fallen” machines in the same breath as fallen people?

Explosive ordnance disposal robots such as the PackBot family and the TALON emerged in the early twenty-first century as blunt but effective instruments that moved soldiers farther from harm and frequently averted fatalities on patrols and at checkpoints. These systems were not mere novelties. They were workhorses, performing thousands of EOD missions and altering patterns of risk for bomb technicians on the ground. When an operator sent a manipulator arm into a booby-trapped room instead of a gloved human hand, lives were saved even as metal and electronics were destroyed.

That functional substitution is why some accounts refer to damaged or destroyed robots as “fallen.” The metaphor has rhetorical power. It compresses into a single image the idea that a machine absorbed damage that otherwise might have landed on a human body. But the metaphor can mislead if taken too literally. A destroyed UGV or an expended loitering munition is, in material terms, an object. It has no biography in the ethical sense. Its loss leaves no family bereft and no memorial service at which comrades grieve. The moral weight of guardianship, of duty, of courage remains with the humans who design, deploy, maintain, and sometimes order lethal action through these instruments. Our primary obligation on Memorial Day is to those humans. The machines merit recognition only insofar as they point us toward human decisions, sacrifices, and responsibilities.

Recent high-intensity conflicts illustrate both sides of this equation. In the war in Ukraine, a massive proliferation of small unmanned systems has reshaped the economics of attrition and the battlefield calculus. Tens of thousands of expendable drones and loitering munitions have been launched, intercepted, and lost in the course of operations. Those losses reflect a strategy that treats many air and ground unmanned systems as consumables. The scale is instructive. Where earlier generations of military robotics were scarce and costly, the current generation often trades durability for numbers, changing how we understand loss and how states weigh human life against machine attrition.

The pivot from expensive, long-lived robotic platforms to mass-produced, often one-way systems has tactical advantages. It also sharpens ethical friction. When a robot is weaponized or used as the proximate agent of lethal force, the question of where responsibility lies becomes urgent. The 2016 Dallas police standoff is a cautionary instance. There, a bomb disposal robot was used to deliver an explosive charge that killed an active shooter. The decision was made to protect officers who otherwise would have faced grave peril. Yet the episode launched a necessary national conversation about when a machine should be used as an instrument of lethal force, and under what legal and moral frameworks such use is permissible. Machines can extend our reach into danger, but they do not erase moral agency. Humans remain the decision-makers.

How then should we mark Memorial Day in an era of ubiquitous robotics? First, by keeping human loss at the center of remembrance. The names, the stories, and the unfinished obligations to families and comrades must be our focus. Second, by using the inventory of damaged and destroyed machines as a prism through which to examine policy. A burnt-out EOD robot or an expended swarm drone is evidence of frontline choices. Counting machines is not an exercise in sentimental attachment. It is an opportunity to analyze doctrine, procurement, and training, and to ask whether a force has put too much or too little trust in automation. Third, by demanding accountability for the deployment of robotic systems that can take human life. If a machine is the proximate instrument of force, the chain of responsibility must be visible and subject to law, not obscured by the rhetoric of inevitability.

Finally, Memorial Day invites a posture of moral humility. Technology changes the modalities of harm without changing the fundamental human responsibilities that follow. We should salute the engineers and operators who risked their lives to keep others safe. We should remember those who paid the ultimate price. We should also scrutinize the systems that carry, channel, and sometimes normalize risk. Honoring the dead includes preventing future deaths. If the proliferation of expendable machines creates habits or doctrines that make commanders more willing to accept human risk, that is an ethical problem worth confronting openly.

On this Memorial Day we can allow the image of the “fallen robot” to remind us of what machines do for human beings. Let that image also remind us that the moral ledger remains populated by flesh and memory. The machines are not the heroes. The people who choose to go into harm’s way, who design safer systems, who refuse to treat other humans as expendable: those are the subjects of our remembrance. If we remember with that clarity, we both honor the dead and steward the living.