The 1–5 September 2025 session of the Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems in Geneva brought the familiar mixture of technical drafting, moral pleading, and geopolitical hedging that has characterised this diplomacy since the topic entered the Convention on Certain Conventional Weapons (CCW). The meeting was structured around the Chair’s “rolling text,” a modular draft organised into boxes that seek to capture the core elements of any future instrument: characterisation of systems, legal frameworks, prohibitions or regulations, technical measures, and accountability.

On the surface, the week produced incremental drafting work: revisions to Box I and fresh iterations of the text addressing the so-called critical functions of these systems were tabled and discussed. The International Committee of the Red Cross submitted suggested language intended to clarify which functions are decisive for the danger a system poses and where human judgement must remain central. These are not trivial editorial notes. They are attempts to turn philosophical insistence on human dignity and responsibility into operational clauses that militaries, engineers, and lawyers can apply.

The political moment of the week, however, was the joint statement delivered by a coalition of 42 states expressing readiness to move from drafting to negotiations on the basis of the rolling text. That statement matters: it signals a bloc that views the rolling text as sufficiently mature to underpin a negotiating mandate, rather than as mere background. Whether that momentum proves decisive depends on the positions of a small number of pivotal states, and on whether the Group can turn contested concepts into clear obligations rather than contested footnotes.

Civil society and expert participation continued to shape the week. Organisations such as Stop Killer Robots used the box-by-box structure to press for categorical prohibitions on systems that target people and for the removal of the qualifier “lethal” from the Group’s working language. Their submissions stressed the moral and practical dangers of delegating life-and-death decisions to automated processes, and insisted that issues of bias, discrimination, and unequal harms are not peripheral technicalities but central humanitarian concerns. Research monitors from SIPRI and other institutions reported carefully on what delegations actually debated, as distinct from what diplomats said to signal positions at home. These external observers help translate procedural progress into substantive meaning for publics and policymakers.

Two concepts repeatedly defined the lines of contention. The first is scope: what counts as an autonomous weapon system for the purposes of an instrument, and does the term need the qualifier “lethal”? Many delegations argued that lethality is an outcome rather than a definitional property, while others insisted on retaining the original mandate terminology. The second is the nature and locus of human control. Delegations debated not only whether humans must retain control but also what “meaningful human control” or “context-appropriate human judgement and control” actually require in practice. These semantic debates matter because they shape the technical requirements engineers will have to satisfy if states adopt legally binding measures.

If one adopts a wider, philosophical view, the meeting epitomised a recurring paradox. The technology advances by exploiting complexity and opaque machine behaviour; diplomacy seeks crisp, enforceable obligations. The rolling text is an effort to translate slippery capabilities into legal categories that can survive both courts and battlefields. That translation can only succeed if negotiators bridge the gap between how systems are built and how states can credibly limit their use. Otherwise regulation becomes a ritual of good intentions with little constraining force.

Practically speaking, the GGE is racing a political calendar. Its mandate envisages a report to the CCW review process in 2026, and many participants signalled an ambition to conclude the Group’s work before that point. That deadline both sharpens incentives and risks compressing complex choices into compromise language that may be unsatisfactory in operational terms. The real test will be whether negotiators craft obligations that are specific enough to guide procurement, testing, and legal review processes without becoming so narrowly technical that creative compliance becomes trivial.

For engineers and developers, the GGE’s progress matters because international law and standard-setting will shape procurement criteria and the contours of acceptable design. For ethicists and advocates, the content of Boxes III to V will determine whether the international community enshrines red lines against delegating lethal force to machines or merely polices risk. For militaries, the text will influence doctrine on human-machine teaming and the tests a system must pass before it can be fielded. Each community reads the same paragraphs through a different operational vocabulary and with different incentives. The rolling text must therefore be assessed through interdisciplinary lenses if it is to produce rules that are both just and practical.

What should observers expect next? More iterations of the rolling text, informal consultations across capitals, and continued pressure from states and civil society to resolve the hardest issues: prohibitions on anti-personnel autonomous systems, clear definitions of critical functions, and enforceable accountability measures. The presence of a bloc ready to begin negotiations is promising. It is not sufficient. The work ahead is less about drafting clever language and more about designing compliance architectures: verification, legal review, and procurement rules that make commitments meaningful in the messy contexts where weapons are used. Until that architecture is specified, the moral claim that machines should not decide to kill will remain ethically compelling yet operationally fragile.

In the quiet between sessions, scholars and practitioners must resist the temptation to treat the GGE as mere diplomatic theatre. The rolling text is a rare opportunity to convert ethical imperatives into legal instruments. If that conversion fails, we will be left with stronger norms but no enforceable teeth. If it succeeds, we will have begun to align technological capability with human responsibility. Either outcome will tell us a great deal about whether a technologically saturated polity can still assert the primacy of human judgement in the gravest of decisions.