Visitors to AUSA later this year will find more than metal and sensors on the exhibit floor. They will encounter a living debate about what it means to extend violence through machines and what it costs us to delegate risk. The hardware is important. The software and doctrines behind it will matter even more. In this preview I survey the robotic themes that industry and the services are likely to showcase, and offer a few cautions for those who come to admire engineering and leave believing a problem has been solved.
One clear trend is the rise of mission-level autonomy as the integrating idea. Firms such as Anduril have been pushing autonomy and orchestration software that ties swarms, loitering munitions, and conventional UAS into coordinated mission sets rather than treating each platform as an isolated sensor or weapon. Anduril’s recent demonstrations of its Lattice for Mission Autonomy at Army experimentation events show how a single operator can plan and task mixed teams of air and ground drones for suppression or target-acquisition missions, a capability that will be central to many AUSA showings.
That software story will be visible in hardware too. Expect exhibitors to present systems where autonomy is embedded by design rather than bolted on as an afterthought. Textron and other platform makers have been demonstrating how established air and ground vehicles can be integrated with autonomy engines and third-party payloads to produce cooperative effects. These demonstrations, including Textron’s interoperability work with autonomy firms earlier this summer, point to an AUSA where multi-vendor teaming and plug-and-play mission kits are sold as operational virtues.
Loitering munitions and tactical strike effects will be prominent and unavoidable. AeroVironment’s Switchblade family is mature, combat-proven in recent conflicts, and the company continued to win Army procurement funding in 2023. On the show floor, expect Switchblade displays and mission demonstrations emphasizing low-cost, high-precision effects that can be launched by small teams or mounted on ground vehicles. The presence of these systems forces the community to confront rules of engagement, proportionality, and the ethics of delegating steps in lethal decisions in contested environments.
Unmanned aircraft will also be displayed in more flexible forms. AeroVironment’s Puma line, for example, was given new vertical takeoff and landing options in 2023 so that a traditionally runway-dependent small UAS can be launched and recovered from confined or austere sites. Expect vendors to emphasize modularity and ease of use; the operational pitch will be that better launch and recovery modes remove human friction from deployment. The question for operators and ethicists alike is whether making systems easier to employ reduces or increases the threshold for kinetic action.
On the ground the show will be a study in diversity: wheeled and tracked unmanned ground vehicles designed for logistics, reconnaissance, and direct fire; midweight robotic mules meant to move supplies; and legged platforms that blur the line between sensor carrier and potential weapon mount. Ghost Robotics’ quadrupeds have been exercised by U.S. services in perimeter security and demonstration events, and their presence at professional exhibitions signals a growing interest in legged mobility for complex terrain where wheels and tracks struggle. Observers should look beyond novelty to the control interfaces, cybersecurity protections, and human override measures that accompany these platforms.
Taken together, the displays at AUSA will make two linked arguments: first, that autonomy and robotics materially alter tactical calculations by dispersing sensors and effects across many low-cost nodes; second, that the human remains central as a decision-maker — at least rhetorically. The tension between those claims is the core political and moral issue of the next decade. When autonomy shortens the sensor-to-shooter loop, who accepts responsibility for mistaken targeting, for collateral damage, for mission creep born of capability rather than necessity? Exhibitors will promise reliable rules, safeguards, and human-in-the-loop options. Auditors, ethicists, and the services must press those promises for detail.
Practically speaking, what should attendees look for on the floor? Inspect the user interface and the human-machine boundary. Ask to see the failure modes: what does the system do when comms are denied or sensors are spoofed? Probe whether the autonomy is explainable and whether its logs support a reconstruction suitable for after-action review or legal scrutiny. Hardware impresses the eye; interface and accountability design decide whether hardware should ever be used in earnest.
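The logging and override questions above can be made concrete. As a purely illustrative sketch (every name here, from `EngagementLog` to `operator_authorized`, is hypothetical and not drawn from any fielded system or vendor API), an accountability-minded design treats every effect as held by default until a human authorization appears in an append-only record, so that lost comms produce inaction rather than autonomous escalation, and the record itself supports later reconstruction:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class DecisionRecord:
    # Hypothetical fields; a real system would log far richer context.
    timestamp: float
    event: str    # e.g. "target_recommended", "operator_authorized"
    actor: str    # "autonomy" or an operator identity
    detail: dict

class EngagementLog:
    """Append-only decision log meant to support after-action review."""

    def __init__(self):
        self._records = []

    def append(self, event, actor, detail):
        self._records.append(DecisionRecord(time.time(), event, actor, detail))

    def authorized(self, engagement_id):
        # An effect is permitted only if a human authorization was logged
        # for this engagement; denied comms mean no record, hence "hold".
        return any(
            r.event == "operator_authorized"
            and r.detail.get("engagement_id") == engagement_id
            for r in self._records
        )

    def export(self):
        # Serialization suitable for audit or legal scrutiny.
        return json.dumps([asdict(r) for r in self._records], indent=2)

log = EngagementLog()
log.append("target_recommended", "autonomy", {"engagement_id": "e-01"})
assert not log.authorized("e-01")   # default is hold, not fire
log.append("operator_authorized", "op-7", {"engagement_id": "e-01"})
assert log.authorized("e-01")       # effect gated on a logged human decision
```

The design choice worth pressing vendors on is exactly this inversion: whether absence of a human decision defaults to "hold" or to "proceed", and whether the log is tamper-evident enough to settle that question afterward.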
AUSA will be a useful temperature check. Expect impressive demos and glossy brochures. My plea to engineers and generals alike is to keep a clear separation between demonstration and doctrine. Demonstrated capability is not the same as the readiness of organizations, legal frameworks, or supply chains to field that capability ethically and sustainably. We should celebrate ingenuity while insisting that new tools enter the force only with commensurate changes in training, law, and command responsibility. The machines will arrive faster than the policies. Treat that speed as the problem it is, not as an excuse to admire the machines and forget the people they affect.