The Convention on Certain Conventional Weapons Group of Governmental Experts (GGE) concluded two sessions in Geneva in 2023 and adopted an advance version of its final report, which framed the immediate legal landscape for lethal autonomous weapons systems (LAWS). The Group, chaired by Ambassador Flávio Soares Damico of Brazil, met on 6–10 March and 15–19 May and produced a short set of conclusions reaffirming the full applicability of international humanitarian law (IHL) to emerging autonomous weapon technologies.
The GGE’s central legal baseline is straightforward but consequential. The report states that IHL continues to apply fully to potential developments and uses of lethal autonomous weapons systems. It goes further: systems that cannot be employed in a manner compliant with IHL must not be used. The Group also emphasised that States bear lifecycle responsibilities, including limiting target types, constraining the duration and scope of operations, and ensuring appropriate training for human operators. Those are not mere platitudes. They are obligations that, if taken seriously, would force doctrinal and procurement changes in armed forces developing highly autonomous weapon systems.
Procedurally, the report is modest. The GGE recommended that the High Contracting Parties decide by consensus on modalities for continued work, effectively leaving the question of a binding international instrument open for further negotiation. The adoption of the report therefore represents progress on clarifying normative contours while stopping short of launching formal treaty negotiations. That dual outcome is likely deliberate, reflecting both the political reality of deep divisions and the legal prudence of consolidating shared understandings before drafting binding text.
Those divisions are not abstract. The 2023 sessions received a slate of competing written proposals. Several Latin American and other states tabled a draft protocol proposing prohibitions and regulatory measures, while other groups offered text that emphasised compliance with existing law and advocated non-binding measures or narrower prohibitions. Civil society and expert delegations were present and active during the sessions, pressing for clearer prohibitions on anti-personnel autonomous systems and for stronger definitions of "human control." The pluralism of submissions illustrates both the normative urgency and the diplomatic friction that surround LAWS.
From a conceptual standpoint the GGE report hints at what practitioners have begun to call a two-tiered approach. In plain terms, one tier identifies systems that are plainly incapable of lawful use and therefore ought to be prohibited; the other permits regulations, limitations, and best practices for systems that can be constrained to lawful employment. The risk of this framing is that the "prohibited" category may be drawn narrowly while the "regulated" category swells to accommodate widely deployed, potentially harmful systems. What emerges, then, is a legal veneer that leaves broad swaths of autonomous lethality subject to permissive operational parameters rather than categorical restriction. The GGE's careful phrasing leaves space for either outcome.
I offer two observations for readers who care about both the ethics and the engineering of this field. First, meaningful human control remains a productive but underspecified normative anchor. Translating vague criteria such as predictability, explainability, and human judgement into testable technical requirements is an engineering task as much as a legal one. If States want enforceable norms rather than aspirational language, they must fund and define technical benchmarks, verification procedures, and audit trails for autonomy in weapons. Second, lifecycle obligations are the most promising lever in the short term. Requiring legal review, transparent testing regimes, and operational limits tied to verifiable metrics creates friction against irresponsible deployment while leaving space for legitimate defensive uses that can demonstrably comply with IHL.
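If that sounds abstract, consider what even a rudimentary, machine-checkable version of those lifecycle obligations could look like. The following is a minimal illustrative sketch in Python; every field name, permitted target type, and limit in it is a hypothetical assumption of mine, not anything drawn from the GGE report or any State's doctrine.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical illustration only: the whitelist and the limit below are
# assumptions made for this sketch, not values taken from the GGE report
# or any national policy.
PERMITTED_TARGET_TYPES = {"materiel", "radar_emitter"}   # assumed whitelist
MAX_ENGAGEMENT_WINDOW = timedelta(minutes=30)            # assumed limit


@dataclass
class EngagementRecord:
    """One auditable entry in a weapon system's engagement log."""
    system_id: str
    target_type: str
    authorized_by: str | None      # human operator who approved the engagement
    window_start: datetime
    window_end: datetime


def lifecycle_check(rec: EngagementRecord) -> list[str]:
    """Return a list of violations; an empty list means the record passes."""
    violations = []
    if rec.authorized_by is None:
        violations.append("no recorded human authorization")
    if rec.target_type not in PERMITTED_TARGET_TYPES:
        violations.append(f"target type '{rec.target_type}' not on permitted list")
    if rec.window_end - rec.window_start > MAX_ENGAGEMENT_WINDOW:
        violations.append("engagement window exceeds configured maximum")
    return violations


if __name__ == "__main__":
    rec = EngagementRecord(
        system_id="demo-01",
        target_type="personnel",          # not on the assumed whitelist
        authorized_by=None,               # no human sign-off recorded
        window_start=datetime(2023, 5, 15, 9, 0),
        window_end=datetime(2023, 5, 15, 10, 0),
    )
    for v in lifecycle_check(rec):
        print("violation:", v)
```

The specifics are beside the point; what matters is that each obligation the report names, recorded human authorization, target-type limits, duration constraints, can in principle be expressed as a check against logged data. That is the kind of translation work that would make lifecycle responsibility auditable rather than aspirational.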
Civil society reactions illustrate the political stakes. Humanitarian actors and campaign coalitions urged stronger prohibitions and lamented the absence of a mandate to begin treaty negotiations. Their critique matters. International law often evolves where moral outrage meets persistent advocacy and technical demonstration. The GGE report provides a platform. Whether that platform becomes scaffolding for binding restrictions or a catalog of non-binding best practices will depend on politics, not on principles alone.
In short, the 2023 GGE outcome is both a clarifying and an ambivalent moment. It settles the normative baseline by affirming IHL and lifecycle responsibilities. It leaves unresolved the choice between a narrow prohibition regime and a broader regulatory architecture. For scholars and policymakers the immediate task is practical: specify the technical benchmarks that make "meaningful human control" operational, develop credible verification and legal review processes, and build coalitions that can translate humanitarian concern into durable legal obligation. Without that work, the carefully worded conclusions of 2023 risk amounting to deferred action on an issue whose consequences will not wait.
If there is a philosophical lesson here, it is this: States can tidy law and principle into reports, but only when those texts produce constraints on behavior do they become moral instruments rather than rhetorical devices. The Geneva sessions of 2023 advanced the conversation. They did not, however, settle the question of whether humanity will place guardrails strong enough to restrain the automation of killing. That remains a choice, and it will be made by politicians and technologists together.