Radical Instrument

IT is changing the exercise of power. Radical Instrument is picking up the signals.

Robots and the systematization of war


After a tortuous ride through blog-space (thanks to links on Opposed Systems Design and Kings of War), I came across this New Year’s Day post (featured on Entitled to an Opinion) featuring a 2005 presentation made by a former R&D head in the Israeli Ministry of Defense.

At the heart of the presentation is a “system collapse model” for Israeli anti-terrorism efforts, which is described as “mostly all maths” and a “thermodynamic model which analyzes the disorder inside the system.” It’s essentially the theory of Jenga: the more components of a system that are damaged, the greater the probability of system collapse, even if the system’s “critical point” remains undamaged.
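The Jenga intuition can be pictured with a toy curve. To be clear, this is purely illustrative: the presentation's actual mathematics, its "disorder" measure, and its parameters are not public, and the logistic shape and the `k` steepness parameter below are my own assumptions.

```python
import math

def collapse_probability(damaged: int, total: int, k: float = 5.0) -> float:
    """Toy 'system collapse' curve: the probability of collapse rises with
    the fraction of damaged components, even if no single critical node is
    hit. The logistic form and the steepness k are hypothetical choices,
    not the model from the presentation."""
    fraction = damaged / total
    # Logistic curve centered at half the components damaged: disorder
    # compounds, so each additional damaged piece matters more.
    return 1.0 / (1.0 + math.exp(-k * (fraction - 0.5)))

# Damaging more pieces raises collapse risk -- the Jenga effect.
for d in (2, 5, 8):
    print(f"{d}/10 damaged -> P(collapse) ~ {collapse_probability(d, 10):.2f}")
```

The point of the sketch is only that collapse risk is a function of aggregate damage rather than of any one critical target, which is what makes the approach attractive to planners and, as argued below, troubling.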

It’s pure speculation to draw lines from this theory to the IDF’s 2006 and 2008 campaigns in Lebanon and Gaza. Still, one can’t help but be reminded of General James N. Mattis’ memo to U.S. Joint Forces Command criticizing “effects-based operations” (which this Air Force briefing describes as an approach that models an “Enemy as a System”):  

For example, a recent analysis of the recent Israeli-Hezbollah conflict found that EBO ‘terminology used was too complicated, vain, and could not be understood by the thousands of officers needed to carry it out.’ … Although there are several factors why the IDF performed poorly during the war, various post-conflict assessments have concluded that over reliance on EBO concepts was one of the primary contributing factors in their defeat.

Here’s what’s troubling. There’s an argument to be made that the military’s growing use of robots – highlighted by P.W. Singer’s new book and debates over the implications for ethics and the laws of war – represents the continuation of a mindset that applies “enemy as a system” thinking to war. General Mattis’ memo highlights the danger inherent in this reductionism: “…all operating environments are dynamic with an infinite number of variables…” Variables, in other words, that can’t be overcome with better or even adaptive programming.

And that doesn’t even start to address the moral questions involved in systems models for warfare, or the ethical and legal questions raised by applying these models to domains like homeland security … domains which will see technological manifestations of this thinking (e.g., TSA databases) just as we’re seeing with the growth in military robots. It’s not the technology that should trouble us. It’s the theory that defines how we use it.

Post-script: The background for the presentation featured on Entitled to an Opinion is intriguing. It was delivered at a Russian think-tank headed by a political theorist with possible links to far-right European and neo-Nazi groups. This same theorist, Sergey Kurginyan, also appeared on a panel with a U.S. Department of Homeland Security official at a 2008 “World Summit on Counter-Terrorism” in Israel. No conspiracy theory here – just a suspicion about the possible attraction of these systems models to the right wing, and a reminder that we should watch whom we associate with on the counterterrorism front.

Written by Mark

February 10, 2009 at 11:21 pm
