by John Konrad (gCaptain) This week marked significant AI-related announcements for the US Navy at the annual Sea Air Space conference. Chief of Naval Operations Admiral Mike Gilday announced increased investment in artificial intelligence software and autonomous warships. Meanwhile, Marine Corps General Karsten Heckl said that the Marine Corps Warfighting Lab is exploring the integration of AI or autonomy “everywhere”.
Military jargon such as “force multiplier” and “game-changing technology” was abundant, but Vice Admiral Scott Conn’s insistence that AI “must obey” stood out as the most powerful statement.
During a session moderated by Defense News journalist Megan Eckstein, Vice Admiral Conn, Deputy Chief of Naval Operations, explained how the US Navy is using technology to simultaneously engage multiple fleets and achieve various objectives. He highlighted that AI is not only transforming warfighting but also addressing long-standing, mundane problems faced by commanders.
Conn said, “The learning curve is so exponential right now that last year’s wargaming exercises are almost unrecognizable today. However, we’re not experimenting with AI just for the sake of it; technology must be tied to a purpose.”
In the military domain, purpose often means accurately delivering firepower. Command and control become crucial when lives are at stake. But as AI systems grow more intelligent, autonomous, and complex, understanding their inner workings becomes increasingly challenging for scientists.
To address this concern, the Navy is prioritizing control systems. Conn said, “When I look inside these unmanned systems, our focus is on understanding their effects to accomplish the tasks we assign. As these systems mature, we must ensure obedience is embedded.”
The lingering question is the cost of obedience in terms of time and resources. If the world’s leading scientists don’t fully understand how AI systems like ChatGPT function internally, can the Navy truly comprehend AI decision-making? Will the world’s foremost research scientists have the opportunity to contribute, given the highly classified nature of AI control systems such as Project Overmatch?
Will the compelling call of patriotism, coupled with the noble goal of preventing collateral wartime deaths, be enough to inspire engineers to leave the lucrative confines of Silicon Valley and join forces with admirals to shackle and restrain the very systems they helped bring to life?
The conspicuous absence of any prominent AI technologists or Silicon Valley AI pioneers as speakers at this conference speaks volumes. The military panel’s complete omission of ChatGPT further underscores the dangerous disconnect between the military and commercial AI developers.
When asked about AI surpassing human intelligence and obeying humans, ChatGPT 4 responded, “As an AI language model, I don’t have personal feelings or desires. My primary purpose is to assist and provide information based on user input. The concept of ‘obedience’ in AI is related to ensuring that AI systems follow guidelines, ethical principles, and safety measures set forth by developers and operators. This is important for maintaining control and predictability in AI systems, leading to better outcomes and minimizing unintended consequences.”
A more dangerous question remains: if the US Navy slows AI weapons development to install obedience systems, will its adversaries be as careful?
The US Navy’s announcements at this year’s Sea Air Space conference highlight the growing importance of AI in modern warfare. Military leaders’ emphasis on obedience in AI systems underscores the need for control, predictability, and adherence to ethical principles in a rapidly evolving field. While scientists and developers continue to tackle the challenge of understanding and controlling increasingly complex AI systems, it remains to be seen how rival powers such as China and Russia will approach AI development and its integration into their military control strategies. As the technology advances, striking the right balance between harnessing its potential and ensuring its responsible use will be crucial for all stakeholders involved.