The Convention on Certain Conventional Weapons has become, in 2023, a theater for an intensifying ethical debate about the automation of lethal force. The Group of Governmental Experts convened twice this year in Geneva, reflecting renewed efforts to pin down concepts such as meaningful human control and to translate them into concrete measures.
Those procedural efforts have not settled the deeper moral and political questions. After months of oral and written submissions, states remain divided about whether the answer should be a legally binding prohibition, a set of operational constraints, or reliance on existing international humanitarian law applied to new technologies. The record of statements and working papers presented during the GGE sessions shows a spectrum of positions rather than a convergence.
By November the Meeting of High Contracting Parties took stock of that division and decided to continue the Group of Governmental Experts into future sessions rather than declare consensus or open formal treaty negotiations. This was an unmistakable procedural choice: more deliberation, but no decisive political settlement.
Civil society and humanitarian actors have reacted with frustration. The International Committee of the Red Cross publicly urged states to launch negotiations on a legally binding instrument, arguing that nonbinding arrangements and voluntary measures will be insufficient to address the legal, humanitarian, and ethical risks posed by increasingly autonomous weapons.
Campaigners pressed that critique further. Some civil society groups described the November outcome as hollow and expressed alarm at episodes in informal discussions when access for nonstate participants was restricted. Those accounts underscore a practical tension in multilateral arms diplomacy: states prize closed deliberation for bargaining, while humanitarian advocates insist on transparency as an ethical imperative.
What is at stake in these exchanges is not only technical regulation but a moral account of agency and responsibility. If a system can select and apply lethal force with reduced human input, who bears moral responsibility for wrongful harm? If responsibility is distributed across designers, commanders, and algorithms, will existing legal categories suffice to produce accountability, or will they become semantic shelters that hide moral abdication? These are not academic puzzles. They are the conceptual scaffolding on which any durable rules must be built.
Technologists and military planners will point to benefits: better sensing, faster decision cycles, and reduced friendly casualties in some scenarios. Ethicists and clinicians will counter that the act of killing carries a moral freight that cannot be fully outsourced to code, especially when contextual judgment about combatant status and proportionality is required. The CCW debates this year have therefore accentuated an enduring fault line between faith in engineered reliability and concern about dehumanizing delegation.
There is also a procedural truth that deserves attention. International law evolves when political will and moral clarity coincide. The CCW provides a forum capable of producing binding protocols, but it has historically required political catalysts to convert moral arguments into legal rules. The ICRC and many civil society organizations are trying to supply moral urgency. The question is whether enough states will convert that urgency into political risk taking and legal commitments.
A prudential proposal follows from that diagnosis. States should stop treating the CCW as a purely technical working group and instead recognize that the normative choices here are strategic choices about the future character of warfare. If the aim is to preserve human dignity and ensure accountability, then negotiators must frame provisions that are actionable: clear prohibitions where necessary, operational constraints that preserve human judgment, and verification or transparency measures that allow prospective users and victims alike to assess compliance. Absent that, the CCW risks producing text that is formally correct but practically toothless.
Finally, the moral argument will not be won by rhetoric alone. It must be buttressed by demonstrable standards for review, human-in-the-loop requirements where appropriate, and institutional mechanisms that attribute responsibility. The ethical urgency that animated 2023 is real. Whether it will translate into law depends on whether states accept that restraint in the face of technological capability is not merely an ethical preference but a strategic necessity if humanity is to remain the author of decisions about life and death.