Image source: https://www.defenseone.com/technology/2018/11/us-militarys-drone-swarm-strategy-just-passed-key-test/153007/
4 June 2023

Artificial Intelligence, Drone Swarming and Escalation Risks in Future Warfare: [Recommended Innovation Articles (and Commentary) #30]

by Ben Zweibelson

Original post can be found here: https://benzweibelson.medium.com/artificial-intelligence-drone-swarming-and-escalation-risks-in-future-warfare-recommended-5ae84208f2ac

Today’s article focuses on AI, drone swarming, and what could upset the traditional deterrence and mutually assured destruction (“MAD”) balance that nuclear-armed adversaries maintain. The article is titled “Artificial Intelligence, Drone Swarming and Escalation Risks in Future Warfare” and was authored by James Johnson. It appeared in the RUSI Journal (2020), Vol. 165, No. 2, pp. 26–36. The link to it is here:

https://doi.org/10.1080/03071847.2020.1752026

So, yes, there will be an academic paywall, but if you use your librarian, you should be able to secure a PDF. Additionally, I used this article and many others in my own AI and warfare articles just published in the latest issue of the USMC’s Journal of Advanced Military Studies (JAMS), which does NOT have a paywall. You can get those articles by going to this website and clicking on the latest issue (Spring 2023, Vol. 14, No. 1):

https://www.usmcu.edu/Outreach/Publishing/Marine-Corps-University-Press/MCU-Journal/

Autonomous weapon systems rely upon specific, not general, AI (specific AI does one thing much better than humans, like winning at chess or Jeopardy; general AI would compete with the smartest humans at everything humans conceptualize), and those systems remain in a human-machine teaming relationship with the human “on the loop.” Systems do what they are programmed to do, and a human operator still makes the lethal engagement decision, even if that decision is made in advance, before the system is launched.
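
To make that “on the loop” relationship concrete: with a human in the loop, nothing fires without positive approval, while with a human on the loop, the system acts by default and the operator supervises with veto power. Here is a minimal sketch of that default-behavior difference; the names and structure are illustrative assumptions for explanation only, not anything from Johnson’s article:

```python
# Illustrative sketch only: these names and the approve/veto mechanic are
# assumptions for explanation, not drawn from Johnson's article.
from enum import Enum, auto


class Oversight(Enum):
    IN_THE_LOOP = auto()  # human must positively approve each engagement
    ON_THE_LOOP = auto()  # system proceeds unless a human vetoes in time


def may_engage(mode: Oversight, human_approved: bool, human_vetoed: bool) -> bool:
    """Decide whether a lethal engagement may proceed.

    in-the-loop: silence means NO; the system waits for approval.
    on-the-loop: silence means YES; the human supervises and can veto.
    """
    if mode is Oversight.IN_THE_LOOP:
        return human_approved
    return not human_vetoed


# A decision "made in advance before a system is launched" simply fixes
# human_approved=True before the system is released.
assert may_engage(Oversight.IN_THE_LOOP, human_approved=False, human_vetoed=False) is False
assert may_engage(Oversight.ON_THE_LOOP, human_approved=False, human_vetoed=False) is True
assert may_engage(Oversight.ON_THE_LOOP, human_approved=False, human_vetoed=True) is False
```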

Swarming autonomous AI could, in the decades to come, create new, faster, more dynamic security contexts that threaten nuclear stability and deterrence in a wide number of ways that the author goes into. The piece focuses more on missiles and submarines, so for space-oriented applications we might extend the author’s work into space-related targeting cycles (see p. 32 on that), the potential for nuclear-equipped space systems in orbit, and future swarm-based autonomous space weapons. These capabilities could cause a nuclear nation with limited weaponry/deterrence to “use or lose,” as the author frames it, potentially shifting the existing nuclear coexistence into something horrific. The ethical, moral, and legal considerations of war also come under reexamination here with the rise of potentially fully autonomous AI systems working in swarms.

On p. 29, the author dives into how fully autonomous AI systems, if our militaries find ways to address the clear ethical, legal, and moral considerations, may be the next major transformation in conflict in the coming years:

“Several prominent researchers have opined that, notwithstanding the remaining technical challenges, as well as the legal and ethical feasibility, it is likely that operational AWS could be seen within a matter of years. The moral and ethical considerations related to the use of autonomous control weapons and autonomous targeting is complex and highly contested; humans creating autonomous control technology to attack a human is inherently problematic.” (Johnson, p. 29)

I recommend this article as it is well researched, with extensive citations and plenty of meat on the bone for those seeking papers on AI that go beyond the superficial “look out for Skynet” sort of stuff popular on Twitter and in junk online military journals chasing clicks. This article is valuable for any PME program with modules or semesters focused on technology and future warfare.
