We are accustomed to grieving persons, places, and animals. Increasingly we must also recognize grieving things. Military units now deploy robots and autonomous systems that do more than execute tasks. They accumulate names, histories, routines, and moral valence. When an unmanned ground vehicle (UGV), an explosive ordnance disposal (EOD) robot, or an autonomous maritime asset is destroyed or lost, something more than fiscal accounting occurs. Operators, maintainers, and their units experience a form of bereavement that sits at the intersection of attachment theory, ritual, and technological lifecycles.
Psychologists and human-robot interaction scholars have long observed that humans ascribe sociality and agency to artifacts that signal lifelike behaviour. Anthropomorphism is not merely a rhetorical flourish. It shapes the way we allocate attention, care, and moral concern to machines. Kate Darling and others have shown that the cues designers give a robot alter human responses to it and to its treatment in ways that mirror our reactions to animals and persons. That is to say, the human mind treats some robots as social actors even while it knows, cognitively, that they are machines.
Laboratory work confirms the emotional weight of failures when they are personally relevant. Experiments show that when a robot causes damage to something that matters to a specific user, trust and likeability decline more steeply than for generic failures. Personal relevance therefore amplifies the psychological consequences of loss and malfunction. In a military setting that amplification is predictable. A system that has shared missions with a crew, that has been entrusted to a particular team, or that has repeatedly assisted a soldier on patrol will be more personally relevant than an anonymous piece of equipment.
Field studies of social and companion robots give further evidence that absence can cause real distress. Longitudinal work with older adults using companion robots documents increased loneliness and behavioural changes when the robot is removed or ceases to function. Users often move through the classical stages of loss in idiosyncratic ways. Some recover quickly. Others show protracted distress. The lesson is not that machines are equivalent to human relationships. It is that machines can become embedded in affective and functional networks that matter to people. When those networks are severed grief follows.
Culture shapes how loss is made meaningful. In Japan, owners of Sony Aibo robot dogs have held ritual funerals for defunct devices, writing notes and seeking solace in ceremony. These public acts of mourning make visible what is otherwise private: a human need to ritualize an ending and reinstate continuity after a rupture. Rituals are not merely sentimental. They are psychological technology. They reframe loss, situate it within communal narratives, and create pathways to closure. Military units have their own rituals for lost comrades. As autonomous systems become quasi-members of teams, it is unsurprising that analogous ritual practices emerge around them.
There is an important conceptual pivot to make here. Some losses are instrumentally costly. Others are existentially costly. Financial accounting will register the former. The latter appear in nightmares, degraded team cohesion, and impaired readiness. For remote warfare this is already visible. Debates and reports since the 2010s document that remote operators suffer the same spectrum of moral injury and post-traumatic stress as more conventional aircrew. The ability to see, name, and care for a target or an asset at a distance does not inoculate operators from emotional consequences. If watching and directing a kinetic engagement can wound the soul, tending a damaged or destroyed robotic teammate can wound it as well.
Ethically and practically, there are three interlocking responses commanders and engineers should consider.
1) Design for exit and memorialization. Systems should be engineered with clear decommissioning pathways and with affordances that enable families and units to archive interactions. Designers of social robots have begun to ask how to build “good deaths” for machines, and grief researchers urge that we plan for machines leaving our lives as intentionally as we plan for their introduction. When loss is anticipated and acknowledged, users can prepare and repurpose memories rather than be blindsided by erasure.
2) Recognize and normalize grief in the ranks. Military medical and psychological services already treat moral injury, survivor guilt, and PTSD. Programs should explicitly include reactions to the loss of robotic assets. That includes debrief protocols that allow teams to tell the story of the loss, and to translate material failure into shared meaning. That translation reduces the risk that unresolved feelings undermine unit cohesion or decision making. The empirical literature on personally relevant failures suggests that acknowledging and repairing relational ruptures is central to restoring trust.
3) Avoid legal and moral denial. There are movements in scholarship and law that argue for recognizing the social weight of robot loss, not because robots have rights, but because people suffer when they lose things that function as companions. Some commentators even raise difficult questions about whether acts of deliberate destruction ought to carry social sanctions beyond property law when they intentionally target objects known to be loved. Whether or not the law changes, military organizations should not reflexively dismiss expressions of grief as mere sentimentality. Doing so risks compounding harm.
I will close with a caution. Engineering efficacy and human fragility exist in tension. We invent autonomous systems to reduce human exposure to danger. We must not do so while erasing the very human networks that make life meaningful. Ignoring the psychological fallout of losing robotic assets is a strategic error. It undercuts morale, it obscures non-material costs, and it narrows responsibility to spreadsheets. A wiser approach treats machines as part of a sociotechnical web. We must design for their failures, plan for their departures, and give crews the space to lament. That is the humane and the prudent course.