The Next Warrior | General News & Politics | Hudson Valley | Chronogram Magazine

You say that much of what is written in human history is simply the history of warfare.
It’s creativity that has truly distinguished us as a species; it has allowed us to reach for the stars and to create art and literature. And now we’re using that creativity to build this incredibly fascinating technology. Some people even argue that one day we’ll produce a new species, or the next evolutionary step for our own. But we also must be completely honest with ourselves—the main reason we’re doing it is war. And that’s really sad.

You write about “Singularity”—a qualitative advance where prediction of what comes next becomes difficult and all the rules change, in part because we are no longer making the rules.
The idea is that every so often, something comes along in history that rewrites the rules. It becomes almost impossible for people living before that moment to have a good sense of the possibilities and dilemmas that people will face after it. The classic example is the printing press. If you were living before its creation and were shown this rickety contraption, you could not fathom that it was going to create mass literacy, the Reformation, the Thirty Years’ War that would leave half of Europe dead, democracy, or help lead to the liberation of women in society. It is simply not possible. Today, there is a belief that robotics, and most importantly AI, will reach that point. We can try to predict some things, but if we are honest with ourselves, we know that things are going to happen that we can’t even imagine yet. That’s a singularity—a break point in history. The book mentions the really interesting and active debate among very serious people who think we’re going to reach it before most of us pay off our mortgages. Take gunpowder: within the realm of war, the rules of the game were very different before gunpowder and after it. Robotics is akin to that, but perhaps greater in its historical context, because it doesn’t just change the “how” of war; it changes the “who” of war at the most fundamental level. We are living through the breakdown of humankind’s 5,000-year-old monopoly on the fighting of war. That’s a rather big deal.

You also write about advancement theory, a school of thought that explains how old paradigms are broken by people who look at the world in a fresh way; how brilliant people can do something that makes no sense to 99 percent of the population at the time, but later seems like pure genius. You make clear that at present the robotics field’s exponential growth, specifically as it pertains to war, lacks any doctrine. Add in rivalries among the Army, Navy, and Air Force, and what one commentator in the book calls the military’s “attention deficit disordered” way of purchasing these systems. The war in Iraq is on; the next front is Afghanistan. It sounds like a mess—robotics stepping out of science fiction into an archaic, bureaucratic world where no one is steering the ship of its development.
There’s a great quote in the book from an Air Force officer that encapsulates the current situation: “There’s gotta be a way to think this better. Right now it’s just ‘give me more.’” We’re not asking, “How do I do this better?” There are two layers to the doctrine issue. First, you have to create a system of thinking around it. That work is just starting in the military. There is the question, “Will you be able to pull it off while still fighting a war?” The lesson of World War I is that it’s often tough to create doctrine during a war; it’s only after the fact that militaries figure it out. Second, there’s no one set doctrine. There will be a debate over the best way to use these systems, with someone being right and someone being wrong. It’s not as simple as “Let me figure out how to do this.” It’s “Let me figure out the best way to do this.” I’m worried that today’s US military’s “bigger is better” mentality about technology, and how we develop and buy it, could turn out to be completely wrong. Add in the current state of manufacturing in the US, and of our science and mathematics education system, and you have a pretty scary brew. I certainly don’t want to see America be the loser of this worldwide robotics revolution.

Plus what you call “open source warfare”—control of robotics not necessarily regulated by a government or a military.
Exactly. Open source warfare is just like what happened in the software world with open source software. War is no longer dominated by one or two major players; it is not a space that a couple of superpowers, or even governments, control. Non-state actors large and small, from organizations like Hezbollah all the way down to individual rogue terrorists, have entered it. The scary thing with robotics is that it’s not aircraft-carrier or atomic-bomb technology, where you need a huge industrial structure to build it. It uses a lot of commercial technology—you can even do it yourself. For approximately a thousand dollars you can build a drone at home that’s very much comparable to the Raven drone our soldiers use in Iraq. Scary things happen when you cross the current war on terrorism with these new technologies coming in. It means a number of actors are going to be able to access pretty dangerous technologies rather easily. We’ve already seen that: Hezbollah used four drones in its war with Israel. In a war between a state and a non-state actor, the non-state actor fielded technology just as sophisticated. How does this empower a future Unabomber, let alone an al Qaeda-type organization?
