In a few weeks the AUSA Annual Meeting & Exposition will convene in Washington, D.C., and robotics and autonomy will once again be framed as both solution and problem for modern land forces. The show, scheduled for early October, remains the premier exhibition where industry, service acquisition officials, and soldiers trade demonstrations, claims, and sharp questions about what autonomy will actually deliver at the tactical edge.
From where I stand, the most interesting material in the AUSA robotics lane is less the hardware novelty than the slow consolidation of software, interfaces, and doctrine that determines whether robots become reliable teammates or expensive liabilities. The Army opened a formal competitive path for Robotic Combat Vehicles earlier this year, signaling institutional intent to field unmanned systems that operate alongside manned formations. That programmatic context shapes what to look for on the show floor: modular autonomy stacks, common controllers, and doctrine-friendly human-machine interfaces rather than mere platform parades.
The commercial and organizational shifts are telling. In late August industry watchers learned that a major small UAS and robotics integrator would acquire an AI-enabled control specialist, a move meant to weld common control software to ubiquitous unmanned platforms. The acquiring firm promoted the idea that a common, AI-enhanced controller can reduce cognitive burden for operators and accelerate cross-platform integration. Expect vendors at AUSA to emphasize this narrative: interoperability through a single pane of glass.
Tomahawk Robotics exemplifies the commercial work being showcased this season: the company describes its Kinesis control ecosystem as an AI-enhanced common controller and has recently introduced integrated mission-suite products. Where once attendees admired exotic platforms, they will now ask which software ecosystems those platforms run on, and whether those ecosystems make sense for sustained operations.
That shift from platform to system-of-systems is healthy, but it exposes three uncomfortable truths that the exposition will not be able to hide.
First, autonomy is not a plug-and-play solution. Sensors, navigation, perception stacks, communications, and mission software must be integrated under realistic constraints of contested communications, degraded GPS, and maintenance cycles. Polished press photos from industry briefings can conceal years of edge-case tuning that only soldier testing reveals. Showing an autonomy demo on a cleared lot is not the same as operating in a sensor-degraded, adversary-filled environment.
Second, human factors remain the Achilles heel. Common controllers and AI-assisted displays can reduce operator load, but they also create a single point of cognitive failure. If a commander relies on opaque automation to prioritize targets or route a formation, accountability and trust fray quickly when the automation errs. The conversation at AUSA should move beyond faster keyboards and toward explainability, predictable failure modes, and meaningful soldier touchpoints.
Third, economics and sustainment matter more than novelty. The Army has invested in rapid prototyping and experimentation, but fielding at scale means logistics, repair, software sustainment, and training budgets. Robots that require bespoke spares and scarce specialists will be judged harshly by any force responsible for long campaigns rather than short demonstrations.
So what should a discerning attendee look for on the floor? Prioritize demonstrations and briefings that show:
- Common control architectures that already have field integrations with multiple platforms.
- Clear descriptions of degraded-mode operations, including how systems behave with intermittent comms and contested GPS (a minimal sketch of what such a fallback policy might look like follows this list).
- Human-machine interfaces that prioritize task allocation rather than simply automating decisions.
- Explicit sustainment plans, including spare parts rationalization and training pipelines for maintainers and operators.
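To make the degraded-mode item concrete, here is a minimal, hypothetical sketch of the kind of fallback policy a briefing should be able to articulate. The mode names, health signals, and thresholds below are my own illustrative assumptions, not any vendor's actual control logic; the point is that a credible system can state something this explicit about how it behaves when comms and GPS degrade.

```python
# Hypothetical sketch: a degraded-mode policy for an unmanned ground vehicle.
# Mode names, thresholds, and health inputs are illustrative assumptions,
# not any vendor's actual implementation.

from dataclasses import dataclass
from enum import Enum


class Mode(Enum):
    FULL_AUTONOMY = "waypoint navigation with remote oversight"
    LOCAL_AUTONOMY = "continue last tasking, accept no new remote commands"
    MANUAL_ONLY = "accept only direct teleoperation"
    SAFE_LOITER = "hold position, conserve power, await operator"


@dataclass
class HealthReport:
    comms_link_quality: float   # 0.0 (lost) to 1.0 (nominal)
    gnss_confidence: float      # 0.0 (denied/spoofed) to 1.0 (nominal)
    odometry_healthy: bool      # visual/wheel odometry still self-consistent


def select_mode(health: HealthReport) -> Mode:
    """Pick the most capable mode the platform can currently justify.

    The ordering encodes the argument above: every loss of comms or
    navigation confidence maps to a predictable, explainable fallback
    rather than silent degradation.
    """
    if health.comms_link_quality >= 0.5 and health.gnss_confidence >= 0.5:
        return Mode.FULL_AUTONOMY
    if health.gnss_confidence >= 0.5 or health.odometry_healthy:
        # Comms are intermittent, but the vehicle can still localize itself.
        return Mode.LOCAL_AUTONOMY
    if health.comms_link_quality >= 0.5:
        # Navigation is suspect; let the operator drive while the link holds.
        return Mode.MANUAL_ONLY
    return Mode.SAFE_LOITER


# Example: contested GPS plus a flaky datalink should yield the most
# conservative behavior rather than an optimistic guess.
print(select_mode(HealthReport(comms_link_quality=0.2,
                               gnss_confidence=0.1,
                               odometry_healthy=False)))
# -> Mode.SAFE_LOITER
```

The specifics are invented, but the shape of the answer matters: a vendor who can walk an operator through a table like this has thought about predictable failure modes; one who cannot is selling the cleared-lot demo.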
My expectation is that AUSA 2023 will deliver an abundance of confident marketing and a narrower set of genuinely useful advances. Acquisitions and integrations in the market are forcing suppliers to answer the practical engineering questions that were once deferred under the banner of innovation. That is progress. Yet progress will mean fewer headline-grabbing autonomous feats and more incremental, harder-to-photograph work: control standards, cyber resilience, and the ergonomics of command. If the exhibition debates those topics in public, then the robotics sessions will have earned their place.
Finally, attendees should resist two seductive but dangerous narratives: first, that autonomy alone can substitute for doctrine and training; and second, that more automation necessarily reduces human responsibility for operational outcomes. Technology amplifies human intent. It does not erase moral choice or strategic judgment. If AUSA 2023 uses glossy demos to avoid those questions, then the show will have failed its most important test. If, instead, vendors and military leaders use it to ask how to integrate autonomy responsibly, then it will have been worth the trip.