Beyond Body Counts

Normal Accidents: Living with High-Risk Technologies
By Charles Perrow
Basic Books. 400 pp. $21.95.

Reviewed by Steven Kelman
Associate Professor of Public Policy, Kennedy School of Government, Harvard; author, "Regulating America, Regulating Sweden"

This is an intriguing book about the operation of risky systems: nuclear power, airplanes, dams, petrochemical plants. It is intriguing because it applies the insights of one discipline (organizational behavior) to the study of a problem typically approached through the lenses of another (engineering). When done well, such an undertaking can provide rewarding new insights into the way the world works; perhaps the best examples over the last 15 years are Graham Allison's The Essence of Decision, applying organizational behavior theory to the study of international relations, and the whole body of literature applying economic theory to the study of legal rules. Charles Perrow's book is done well. I am not always convinced by his arguments, but his work constitutes an important contribution.

Perrow argues that there are special reasons for concern about nuclear power (and genetic engineering, although here he admits to being on shakier ground, since less is known) that do not exist in the case of other risky activities like operating dams or mining coal. To use the jargon he introduces, this is because nuclear power has both high catastrophic potential and high susceptibility to system accidents.

In speaking of catastrophe, Perrow seeks to direct our attention away from an oversimplified "body count" approach to danger. Auto and mining injuries take a regular toll of lives each year, while nuclear power has yet to kill anyone. Nonetheless, its risks are potentially catastrophic, partly because the victims of a nuclear accident would be mostly third parties, rather than only the direct participants in producing the power. Mainly, however, I think the designation relates to the probable magnitude of a disaster (Perrow is less clear on this than he might be). A core meltdown in a nuclear power plant, it has been suggested, could devastate entire communities and require long-term evacuation of large regions. Such an event would simply be much worse than the highway carnage, bad as that is.

It is in his discussion of system accidents that Perrow uses organization theory to help us think about various kinds of risky operations. He analyzes different production processes along dimensions of what he calls coupling and complexity. In a tightly coupled situation, once one part of a system breaks down there is little time to prevent the failure from spreading to the rest of it. Dams are an example. Most assembly line manufacturing, by contrast, is loosely coupled: a safety problem in one section will not generally spread to the whole assembly line, for there is time to turn off the system. In a complex production situation there can be unexpected interactions among the components of the system, because they are physically close together and we do not understand the process well enough to know all the ways a difficulty in one area may trigger trouble elsewhere that will result in a shutdown. Dams, albeit tightly coupled, are not complex; nuclear power and recombinant DNA production are.

There is a high susceptibility to system accidents, Perrow maintains, if both tight coupling and complexity are present. Tight coupling tends to make accidents affect the whole system, while complexity increases the inherent riskiness of the enterprise: it increases the number of possible risks and, above all, decreases the ability to eliminate them through engineered safety devices, since it is harder to develop devices to cover every contingency. Where we are not dealing with the complex, safety features can generally be engineered to overcome risks, even if it may be very expensive or inconvenient to do so. When a system is tightly coupled and complex, and when system accidents have catastrophic potential, the world is in for trouble. Perrow's paradigm of this is nuclear power.

One standard way of confronting situations having catastrophic potential is to proliferate fail-safe and assorted other security backups. Perrow makes ingenious use of organization theory to discuss some of the reasons why one cannot completely rely on these measures. Beyond the fact that complex systems have so far defied our ability to fully understand them, let alone protect against them, there is the problem that safety devices are themselves pieces of equipment requiring testing and maintenance. Yet since they are not mission-essential, they are often the first to be ignored if maintenance manpower is short. Consequently, safety devices may very well not work the moment they suddenly are needed.

Furthermore, significant psychological evidence exists that because component or system failures are rare, people tend to reinterpret the unfamiliar as the familiar. Perrow tells of accidents where people reacted to flashing warning lights on a control panel by assuming the lights must be malfunctioning (a common occurrence) instead of interpreting them as a signal of something that had never happened before. Similarly, when a piece of information is subject to two interpretations, one suggesting a catastrophe and the other normal functioning, people will frequently choose the latter and not respond properly to the warning. Perrow cites numerous instances of this in his discussion of actual accidents.

Perrow makes another interesting observation in this regard. It has been said that atomic power plants need to be run with strict, centralized military discipline, whereby people are trained to unquestioningly and immediately follow the rules. A quick, unhesitating response is, as Perrow notes, useful with tightly coupled systems, where time is of the essence in containing a problem before it spreads to the rest of the system. But this obedience would be inimical to the initiative and independent thought necessary for coping with the unexpected. Systems that are both tightly coupled and complex, Perrow contends, may therefore be caught on the horns of an organizational dilemma.

Although he never comes right out and says it, basically the conclusion of his argument is that continued operation of nuclear power plants is bound to produce a catastrophic accident (a core meltdown), and sooner rather than later. In drawing this conclusion, he appears to disagree with defenders of nuclear power not so much in his judgment of how bad a core meltdown would be if it occurred, but in his judgment of how probable a core meltdown is. Perrow is not impressed with data on the operating experience of nuclear power plants until now. When we segregate out the different technologies that have been used over time, he notes, we find we really don't have much experience yet to suggest that his views of the risk of catastrophe are incorrect. The accident at Three Mile Island, which ultimately involved a very serious risk of a core meltdown, does seem to support Perrow's position, at least on the operation of nuclear power plants up to the point of the incident. True, no core meltdown did take place, yet this is not reassuring. For the chances of catastrophe were uncomfortably high, and escape was a lucky break rather than a near-certainty.

Perrow's argument is arresting and serious, yet I am not certain I am convinced. Here are some of the questions I ask myself: To what extent will continued experience with processes such as nuclear power or genetic engineering reduce the unexpectedness of interactions? To what extent can engineering reduce the unexpectedness of interactions? To what extent can engineered safety features be designed to deal with the effects of surprise interactions, even if their causes are ill-understood? To what extent are there, or could there be, safety systems built to provide redundancies for safety component malfunction or predictable operator misinterpretations? To what extent can the features of a nuclear power plant core that produce a meltdown be uncoupled from other features of the system where things go wrong?

These are engineering questions, and I don't know enough to have any answers. While Perrow knows considerably more than I do, I doubt he has any answer either. It will be interesting to see how engineers react to this provocative book.

That brings me to the weakness of this genre of study. When an expert in one area applies his expertise where knowledge from another area is importantly involved, the danger is that he will no more be able to produce a complete and satisfying account of an issue than those in the discipline that has traditionally dealt with it. Perrow's discussion of the methods used by economists and others to assess risk (a field about which I know something) struck me as oversimplified if at times insightful, and I fear that this might also be the case with what he writes about subjects I know less well.

Vol. 67 • May 1984 • No. 9


 