CyCon 2025 arrives this month under the banner The Next Step, convened by the NATO Cooperative Cyber Defence Centre of Excellence in Tallinn from 27 to 30 May 2025. The conference remains the rare venue where technical, legal and strategic conversations meet in the same forum, and that multidisciplinary posture makes CyCon the right place to interrogate what autonomy will mean for conflict.

Autonomy is no longer a thought experiment. It is an engineering choice that shifts risk, speeds decision loops and redistributes responsibility across software stacks, supply chains and operators. At its simplest, autonomy reduces latency and human workload. At its worst, it amplifies a cascade of failures that can begin with a poisoned training set, continue through adversary-engineered sensor deception, and end with a misapplied weapon effect or a critical service outage. These are technical failure modes, but they carry legal and moral weight. CyCon has long organised its program to bracket such questions into discrete technical, strategic and legal tracks, because the tradeoffs show up differently in each register.

Practically speaking, there are three vectors I expect to dominate conversation at CyCon this year. First, vulnerability surfaces multiply as autonomy migrates into new substrates: maritime vessels, logistics networks, edge devices and cloud-enabled command systems. CCDCOE work on autonomous shipping has already highlighted how sensor deception, communications interception and supply chain compromise create distinct threat categories for autonomous platforms, which should give pause to anyone who treats autonomy as merely an algorithmic upgrade. Security-by-design is necessary, not optional.
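
What security-by-design can mean at the sensor level is easiest to see in miniature. The Python sketch below is a hypothetical cross-sensor consistency check for a navigation stack: it flags a window of data when GNSS-reported speed and inertially derived speed diverge persistently. The NavSample record, the thresholds and the flag_gnss_spoofing helper are invented for illustration and stand in for no real platform's logic.

```python
from dataclasses import dataclass

# Illustrative only: hypothetical sensor readings and thresholds.
@dataclass
class NavSample:
    gps_speed_ms: float   # speed over ground reported by the GNSS receiver
    imu_speed_ms: float   # speed derived from inertial measurements
    timestamp_s: float

def flag_gnss_spoofing(samples: list[NavSample],
                       max_divergence_ms: float = 2.0,
                       min_suspect_fraction: float = 0.5) -> bool:
    """Flag a window of navigation data when GNSS and inertial speed estimates
    diverge persistently: a crude consistency check, not a detector of record."""
    if not samples:
        return False
    suspect = sum(
        1 for s in samples
        if abs(s.gps_speed_ms - s.imu_speed_ms) > max_divergence_ms
    )
    return suspect / len(samples) >= min_suspect_fraction

# A window in which GNSS suddenly reports a much higher speed than the IMU.
window = [NavSample(gps_speed_ms=12.0, imu_speed_ms=6.1, timestamp_s=float(t))
          for t in range(10)]
print(flag_gnss_spoofing(window))  # True -> escalate to a human operator
```

The point is not the particular check but the posture: an autonomous platform should treat every sensor feed as potentially adversarial and cross-validate it against independent sources before acting on it.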

Second, algorithmic decision support and delegated control introduce a brittle interface between law and action. States and militaries are increasingly deploying AI systems to assist planning and to optimise effects. But legal obligations do not float free of system architecture: if a human decision maker acts on an opaque recommendation from a model, accountability becomes diffuse. The only robust mitigation is institutionalised legal review combined with transparent testing regimes that evaluate how systems behave under stress and adversarial conditions. CyCon’s format, which publishes peer-reviewed proceedings drawn from an open call, is geared to surface exactly that kind of rigorous, replicable work.
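
"Adversarial conditions" is easy to invoke and harder to operationalise. As a toy illustration, the sketch below perturbs inputs to a synthetic linear classifier with a gradient-sign-style attack and reports how accuracy collapses; the model, data and perturbation budget are invented for the example and represent no real decision-support system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a fielded decision-support model: a linear classifier
# evaluated on synthetic features. Everything here is illustrative.
X = rng.normal(size=(200, 5))
w = rng.normal(size=5)
y = (X @ w > 0).astype(int)          # labels the model gets right on clean data

def predict(inputs: np.ndarray) -> np.ndarray:
    return (inputs @ w > 0).astype(int)

def gradient_sign_attack(inputs: np.ndarray, labels: np.ndarray, eps: float) -> np.ndarray:
    """Shift each input in the direction that pushes its score away from the
    true label, the core idea behind fast-gradient-sign attacks."""
    direction = np.sign(w) * (1 - 2 * labels)[:, None]   # +sign(w) if label 0, -sign(w) if label 1
    return inputs + eps * direction

clean_acc = (predict(X) == y).mean()
adv_acc = (predict(gradient_sign_attack(X, y, eps=0.8)) == y).mean()
print(f"accuracy: clean {clean_acc:.2f}, under perturbation {adv_acc:.2f}")
```

A testing regime worth the name runs checks like this continuously, across the perturbations an adversary can actually produce, and publishes the methodology so the results can be replicated and challenged.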

Third, the strategic logic of autonomy is not purely technical. Autonomy compresses the time in which competitors can escalate or de-escalate, and it incentivises pre-emption and the automation of responses to perceived probing. That changes the deterrence calculus, and it changes how coalitions must coordinate rules of engagement and risk tolerance. NATO-affiliated forums such as CyCon therefore function as much as architecture for common understandings as they do as conferences. Expect debate this year to shift from whether autonomy matters to how alliances must govern it collectively.

For practitioners and policy makers who care about preserving moral agency in the age of machine speed, I would offer three modest prescriptions going into CyCon. One, make legal review an operational requirement across procurement and fielding cycles. Two, insist on adversarial testing that is continuous and, where possible, open to red teaming across national boundaries. Three, treat explainability and human factors as first-order engineering requirements rather than afterthoughts. These are not panaceas. They are scaffolding for a future in which humans remain legible actors within automated decision systems.
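
Treating explainability as a first-order requirement has a concrete, unglamorous form: recommendations that cannot be audited do not reach an operator. The hypothetical Python sketch below encodes that rule as a release gate; the Recommendation schema, its field names and the releasable check are assumptions made for illustration, not any standard.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Hypothetical schema: a recommendation is only releasable to an operator when
# it carries the evidence and review references needed to audit it afterwards.
@dataclass
class Recommendation:
    action: str
    confidence: float
    model_version: str
    top_factors: list = field(default_factory=list)      # human-readable rationale
    legal_review_ref: Optional[str] = None                # pointer to the review record
    issued_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

def releasable(rec: Recommendation) -> bool:
    """Gate a recommendation on the artefacts an operator and a reviewer need,
    treating explainability metadata as a requirement rather than an afterthought."""
    return (
        0.0 <= rec.confidence <= 1.0
        and len(rec.top_factors) >= 1
        and rec.legal_review_ref is not None
    )

rec = Recommendation(action="reroute convoy", confidence=0.72,
                     model_version="planner-2.3",
                     top_factors=["bridge outage reported", "threat density on route A"],
                     legal_review_ref="LR-2025-041")
print(releasable(rec))  # True only because rationale and review references are present
```

None of this makes a model interpretable by itself; it makes the absence of an explanation visible, which is the minimum condition for keeping humans accountable for what the system does.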

CyCon 2025 will not answer every question. It will do something better: it will aggregate empirical papers, informed debate and operational perspectives in a way that exposes the inevitable gaps between promises and practice. In that exposure lies the only hope for sane policy: if we insist on rigorous scrutiny now, we reduce the risk that autonomy becomes an unaccountable accelerant of harm later. The Next Step, in other words, should be prudence informed by technical sophistication and by a willingness to hold systems and organisations to account.