mikerickson reviewed Army of None by Paul Scharre
-
4 stars
This book covers so many topics that I don't even really know where to begin. Bearing in mind that this predated the invasion of Ukraine and the real-world application of drones in combat zones by about four years, there's still a lot of relevant discussion in this book that makes it worth reading if you have even a passing interest in technology.
After a semantic discussion of what an "autonomous" weapon really is (technically a landmine counts if you want to get cheeky about it), we pivot through various topics: highly experimental DARPA projects, cyberspace attacks and computer viruses, generative AI, scenarios that require faster-than-human decision making, what non-state actors are capable of and what future terrorist attacks might be possible, the psychology and efficacy of human soldiers vs. automated systems, and a historical analysis of why some weapon bans were successful while others were not. We cover a lot of ground, to say the least. In some ways the future being painted isn't as bleak as, say, a hypothetical nuclear exchange would be, but in other ways it is: precisely because loosing a killer drone swarm in a highly populated area wouldn't be perceived as being as destructive as a literal nuclear explosion, you could argue it's more likely to be used.
While I don't feel like any specific agenda was pushed on the reader, there is a slight tendency to focus on the failures of technology as warning cases. Stock trading algorithms causing the 2010 flash crash and the Patriot missile system accidentally downing an allied jet fighter at the beginning of the Iraq War come to mind. But even extrapolating real-world errors to future potential scenarios, this doesn't come across as irrational fearmongering. If anything, I'm coming away from this feeling like nations are aware there are as-yet unknown risks to developing weapon systems that remove humans from the loop. But they also feel they have no choice but to throw caution to the wind and figure out this technology along the way, for fear of being left in the dust by their rivals, consequences be damned. The optimists say this will result in robots fighting robots and fewer human casualties, but I can't help but feel it will widen the divide between the haves and the have-nots of military capability, to the detriment of those already at the back of the pack.