Monday, 30 March 2009

Wired for ... Nuclear War?

I want to begin my post by congratulating Peter Singer on this book! It is indeed one of the best books I have read in years and a role model for turning a seemingly dry topic into a reading adventure. I wish this style of writing would gain currency (especially in the German-speaking academic community)!

Reading Wired for War (WfW), I became curious about the possible effects of the ongoing RMA on nuclear weapons, the persistent legacy of an earlier RMA, and vice versa. Despite fresh initiatives towards “global zero”, nuclear reductions will take a very long time (and some even argue that achieving global zero is outright impossible, if not dangerous), so the horizons of nukes and increasingly capable computers/robots will overlap. I was surprised that the book only briefly touched upon nuclear weapons, as I think the coexistence of these two very powerful technologies raises important questions regarding 1) the likelihood of war and 2) the likelihood of nuclear weapons use.

As for the first point, there appear to be two trends pulling in different directions. On the one hand, it has been argued that, under certain conditions, nuclear weapons make war less likely because decision-makers are aware of the tremendous costs that their use entails. On the other hand, the features of robotics technology as portrayed in WfW suggest that war will become more likely in the future because leaders will not have to care as much about costs as they did before. Hence, the question arises whether nuclear weapons, or the fear of escalation to the nuclear level, would (still) be a factor influencing the decision to go to war. I would argue that, as is the case today, the impact of nukes on the decision to go to war would probably still depend on whether, and how many, nuclear weapons the other side has.

As for the second point, it could be argued that the use of robots on the battlefield could also lead to a “return” of nuclear weapons to the battlefield. A decrease in the (human) costs of warfare could also lower the threshold for the use of (lower-yield) nuclear weapons, which appear to be superb anti-robot weapons anyway (EMP, heat, …).

Taking this thought one step further, could nuclear weapons, as a means to counter the technological edge of adversaries, turn into the “poor man’s weapon” of the robotic age? I am aware that a counter-argument appears in WfW: one of the characteristics of the new RMA is that the technology is relatively easy to obtain. But would the temptation to see nuclear weapons as a great equalizer not remain the same as it is today?

It is also interesting to ask whether increasingly capable computers will have an impact on the command and control of nuclear weapons. Future advances in computer technology will increase the pace of warfare and, as suggested in WfW, will eventually decrease the role of human beings in warfare. Will this development also affect nuclear command and control, or will it remain a stronghold of human control?

The increasing capability of computers could also contribute to the avoidance of nuclear war. In a recent article in the Bulletin of the Atomic Scientists, Bharath Gopalaswamy suggested that the proximity between India and Pakistan requires very fast decision-making in the case of a missile attack, since missiles with a 600 to 2,200 km range would have a flight time of only 8 to 13 minutes. Gopalaswamy goes on to argue that “(s)uch a short time period places stringent conditions on procedures for evaluating and verifying warnings. There would be no time to consult or deliberate after receiving this warning. In other words, any response would have to be predetermined, presenting a significant likelihood of accidental nuclear war from false alarms.” As it is impossible to significantly expand the time available for decision-making, it appears to be much safer to base a counterstrike or missile-defense intercept on the calculations of a highly developed computer, which does not feel the stress of the situation, can evaluate the available information faster than any human analyst could, and can carefully weigh the available options. But still, it sounds a bit like a doomsday machine…
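To make the time pressure concrete, here is a minimal back-of-envelope sketch in Python. The 8- and 13-minute flight times are the figures quoted above; the detection, assessment, communication, and execution durations are purely illustrative assumptions of mine, not figures from Gopalaswamy’s article.

```python
# Rough decision-time sketch for the short-warning scenario described above.
# Flight times (8-13 minutes) are the quoted figures; all stage durations
# below are hypothetical assumptions for illustration only.

FLIGHT_TIME_MIN = 8 * 60   # shortest quoted flight time, in seconds
FLIGHT_TIME_MAX = 13 * 60  # longest quoted flight time, in seconds

# Assumed (hypothetical) time consumed before and after any decision:
DETECTION_S = 60        # sensor detection and initial track
ASSESSMENT_S = 120      # evaluating and verifying the warning
COMMUNICATION_S = 60    # relaying the assessment to decision-makers
EXECUTION_S = 120       # transmitting and executing any order

overhead = DETECTION_S + ASSESSMENT_S + COMMUNICATION_S + EXECUTION_S

for label, flight_time in (("shortest", FLIGHT_TIME_MIN), ("longest", FLIGHT_TIME_MAX)):
    remaining = max(flight_time - overhead, 0)
    print(f"{label} flight time: {flight_time // 60} min, "
          f"time left for deliberation: {remaining // 60} min {remaining % 60} s")
```

Under these assumed stage durations, only about two minutes would remain for actual deliberation on the shortest flight path, which is the kind of margin that drives the argument for predetermined responses.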

Reader Comments (2)

We now know that the Soviet Union adopted a launch-on-warning posture during the Cold War, increasing the potential survivability of its nuclear arsenal, but at the price of a correspondingly greater reliance on (technological) warning capabilities and a greater risk of false alarms.

For those of you who don't know the story of the September 1983 false alarm, I suggest you have a look at the story of Lt. Colonel Stanislav Petrov. While the details remain much debated, the episode clearly highlighted the value of having human decision-makers in such a case.

Mar 30, 2009 at 22:22 | Registered Commenter Rex Brynen

I am not sure whether the US-Russian case is comparable to that of other nuclear adversaries. First, future nuclear dyads (comparable to India-Pakistan) may be characterized by much closer geographical proximity, and thus considerably less warning time, than the East-West conflict. With warning times of a few minutes, there is no time for consultations and much stronger pressure to act. Second, future computer systems would be much more capable and reliable than the systems that have been used by Russia and the USA.

I was therefore wondering whether humans would be removed from the loop (in the distant future), because computers would be seen as more reliable decision-makers in situations of intense stress.

Mar 31, 2009 at 13:52 | Registered Commenter Martin Senn
