The Development of Risk Analysis:

A Personal Perspective

ABSTRACT

This article reflects on my experiences observing and participating in the development of risk analysis for environmental and health hazards since the 1970s, with emphasis on its critical role in informing decisions with potentially high consequences, even for very low probability events once ignored or simply viewed as "acts of God." I discuss how modern society wants to protect itself from hazards with limited or no immediate historical precedent, so that prediction and protective actions must depend on models that offer varying degrees of reliability. I believe that we must invest in understanding risks and risk models to ensure health in the future and to protect ourselves from large challenges, including climate change (whether anthropogenic or otherwise), terrorism, and perhaps even cosmic change.

Keywords: model, accident, low-dose, regulation, cancer, terrorism

1. INTRODUCTION

As society has developed, we have tried to understand major events in the world. Those that happen regularly, such as the rising and setting of the sun, eclipses, and the equinoxes, have been studied and used for prediction by many societies. Weather and climate are less predictable but are now being analyzed with some success. Risk analysis, however, has tended to concentrate on the effects of man's technological actions, and I will therefore start with a discussion of these.

Two hundred years ago, society's method of understanding risks and dangers was to try out the technology and modify it if any problems arose. An obvious example was the development of railroad travel. About the year 1800, tramways were installed to guide wagons of freight, often in mines, using flanged rails. These proved so unstable that, by 1830, the flanges were installed on the wheels instead. But only recently have there been systematic studies of "wheel-rail dynamics". There were other problems with the railway. In 1830, the first passenger railway in the world, between Liverpool and Manchester, opened for business. On its opening day an engine ran down a man who failed to get out of the way. Moreover, the man was a Member of Parliament! (Parenthetically, it might be regarded as a good thing if the member were from the opposite party.) Further developments in railroads and elsewhere led to accidents with tens and even hundreds of persons killed in a single accident. But still only the historical approach was used.

Mining, and particularly coal mining, was always regarded as dangerous, but people engaged in it anyway. Two distinct types of danger became apparent two millennia ago. On the one hand, miners breathed in noxious fumes of various sorts, including mercury vapor in the open pit mines of Spain and dust particles in the underground tin mines of Cornwall. On the other hand, there was the danger of collapse of the mines and entrapment of the miners. Even living in a polluted town was known to be dangerous. In the 19th century, a prayer in various northern English churches became: "from Hell, Hull and Halifax, good Lord deliver us." As late as 1945, Halifax, UK, was often covered with a penetrating fog, and my paternal grandmother used to scrub her front door step twice a day. This air pollution seemed to many people to be an inevitable sign of prosperity. As a Yorkshire mill owner once said: "Where there is muck there is brass (money)."

Although control of these risks began many years ago, it is useful to consider the end of World War II as a defining point in man's desire and ability to control these hazards. The availability of cheap oil from the Middle East and gas from Algeria, the North Sea, Siberia, and, in the US, from hydraulic fracturing (fracking) of gas-containing rock, ushered in a period of prosperity and a fuel less polluting than coal to go with it.

2.1. Major Disasters—Historical Approach

Until World War II very few people concerned themselves with major disasters that could kill more than a hundred people at once. Yet they happened. Many large disasters throughout the world have been recorded, and it is likely that many more lie unrecorded. The outbreak of plague in the 14th century killed a third of the people in Europe, including Great Britain, and perhaps as large a fraction elsewhere. No one knows why the pandemic started and no one knows why it stopped. Yet the textbooks in English schools before World War II barely mentioned it. Floods and hurricanes occurred through weather and climate extremes; earthquakes through geological ones. They were called "Acts of God". It is only since World War II that society has realized that while we cannot control "Acts of God," we can be prepared for them and can control man's reaction to them. This reaction before World War II already included tornado warnings and the construction of flood control levees. Yet only in the last couple of decades has society introduced definite disincentives, such as the unavailability of insurance, to discourage people from putting themselves in the path of these natural disasters.

The extent to which mankind has deliberately built his civilization in dangerous ways is not always appreciated. The Nobel Laureate Eugene Wigner commented to me, in 1973, that "whenever there is a lot of energy in one place and a lot of people in the same place, there is a potential for disaster." A look at the geography of the US shows that major cities have grown up around almost every estuary, because rivers seemed the easiest transportation routes. Yet this fact makes the cities liable to flood. A controllable flood can be a good thing, because it leads to fertilization of the land, but the flip side is the uncontrollable flood leading to disaster. In 1963, water in the Vaiont dam in northern Italy went over the top of the spillway as a landslide slid into the reservoir. Some two thousand people downstream were killed. Man has also brought his industries to these same estuaries. The cyclones in Bangladesh in 2007 and Myanmar in 2008 caused enormous casualties in these coastal, estuary areas. In several estuaries there are large 17 million gallon (about 64,000 cubic meters) tanks of such potentially dangerous substances as liquefied natural gas, each tank of which, on combustion, would release as much energy as five Hiroshima bombs, or of 17 million gallons of toxic chlorine, which could, on release, kill many people.

One of the earliest recognitions that Wigner's potential hazards could be minimized was the proclamation in 1848 by the citizens of London that no petroleum products could be brought up the Thames River closer than 30 miles east of London Bridge.([1]) Over the next century, this led to the construction of a complex on Canvey Island.([2]) This included one hundred and fifty tanks, each holding 17 million gallons of flammable and toxic liquids.

In the 19th and early 20th centuries, engineers developed practical solutions to disasters. This was, in part, stimulated by the insurance industry, which, for example, developed standards for locating the large tanks as early as 1907. These tended to be based on the old historical approach: "make sure that what has been known to happen before will not happen again", particularly if it happened frequently. In the process engineers began to use a formal procedure called "Fault Tree" analysis to analyze a system and discover what contributes to failures. Following the Canvey Report, which was formally requested by and presented to the British Parliament, the deputy director of the Health and Safety Executive, Dunster ([3]), gave an appeal for widespread application and discussion in 1980:

“Our use of risk assessments and our attempts to make quantitative comparisons between risks of alternative decisions such as energy sources together make it clear that we lack a great deal of necessary information. We know little about the real magnitude of many existing risks and still less about society’s attitude towards these ill-defined risks. I suggest that we need to attempt more risk assessment and that we need to publish more of the results. These results will only be one factor in the process of making decisions and indeed the existence of such studies may often make decisions more difficult to reach but eventually we should gain confidence that our decisions are being taken in a consistent and possibly even in a logical way.” ([3])

2.2. Models, Uncertainty and Appropriate Caution

It is important to realize that even at this elementary stage any prediction demands a model, even if it is only "next year will be like last year" or "next year will be like last year with a few understood improvements". Once a model is introduced, uncertainty enters. This uncertainty is not merely due to statistical sampling errors. It has been said that "all models are wrong but some models are useful." The more complex the situation and the smaller the risks one is estimating, the more uncertain the estimate becomes. Critics might argue that risk analysts are not talking about science, but that is a matter of definition.
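
Even the purely statistical part of this uncertainty grows as the risk shrinks. The following minimal sketch (in Python, with hypothetical numbers) illustrates the point: for a rare event, even a large number of trials yields only a handful of occurrences, so the relative error of the estimated frequency balloons. Model error, as the quotation stresses, comes on top of this.

```python
# Illustration of why rarer events are harder to estimate: the relative
# statistical error of an observed frequency grows as the event becomes
# rarer. Hypothetical numbers; binomial sampling error only.
import math

n_trials = 10_000
for p_true in (1e-1, 1e-2, 1e-3):
    expected_events = n_trials * p_true
    # Standard error of the estimated frequency, relative to p_true:
    rel_std_error = math.sqrt((1 - p_true) / (n_trials * p_true))
    print(f"p = {p_true:.0e}: ~{expected_events:.0f} events expected, "
          f"relative statistical error ~{rel_std_error:.0%}")
```

Under these assumed numbers, a one-in-ten risk is pinned down to a few percent, while a one-in-a-thousand risk estimated from the same ten thousand trials already carries roughly a 30% statistical error before any question of model validity arises.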

2.3. Formal Study of Large Accidents in New Technologies

But it was not until the development of nuclear electric power after World War II that the old paradigm, "try it and if it gives trouble, fix it", was deemed inadequate for a new technology. Society now demands evidence, in advance, that a technology is safe. A number of reasons have been suggested for the fact that this fundamental change first occurred for nuclear power, and probably all are in some part responsible.

(1) The new technology was in the hands of fundamental scientists from the start.

(2) The new technology used new physical principles.

(3) The new technology arose simultaneously with a new deadly form of war.

(4) The new technology posed unprecedented hazards.

The change was particularly apparent in the United States. The US Atomic Energy Commission (AEC) was set up in 1946 as a civilian agency to encourage and control all uses of nuclear fission; military use was subordinate to civilian control. The first nuclear reactors were for military purposes, but as early as 1946 William Webster, later head of the New England Electric System, was asked to report on the potential for nuclear electric power, which he did orally. The first nuclear-generated electricity came about 15 years later. Outstanding scientists were either on the Atomic Energy Commission or consultants to it: the names of Glenn Seaborg, John von Neumann, Robert Bacher, Edward Teller, Eugene Wigner, and Richard Feynman come to mind as influencing the safety procedures. The commission established an Advisory Committee on Reactor Safeguards (ACRS) to advise on safety, and its advice has always been respected. Right from the beginning the ACRS set up a procedure called by a name borrowed from the military, "Defense in Depth".([4]) One must imagine the worst thing that can reasonably go wrong in the reactor, the "Maximum Credible Accident", and then devise an engineered safeguard to prevent it from happening. Large reactors, particularly the first in a series, were to be in unpopulated areas, following Wigner's principle.

There is little doubt that the "Defense in Depth" approach already led to a dramatic improvement in safety, but by 1970, when many large nuclear reactors were under construction, criticism arose. This was also a period of public questioning of government over the Vietnam War. The next step came from Professor Manson Benedict of the Massachusetts Institute of Technology: to consider what happens if something goes wrong beyond the Maximum Credible Accident, and how often that might come about. Professor Norman Rasmussen was selected for this task, and within two years he and his study group had produced a landmark study, "The Reactor Safety Study".([5]) In this study he was the first to use an "Event Tree". Starting from an initiating event, he followed the course of a potential accident, taking account of the engineered safety devices mandated by the "Defense in Depth" approach, while his team calculated the availability of those devices using a "fault tree". In principle the initiating events can lead to hundreds of thousands of possible results, but for a nuclear reactor only those that can lead to melting of the reactor core are important.
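
The logic of combining an event tree with fault-tree results can be sketched in a few lines. The following is a minimal illustration in Python with purely hypothetical numbers, not figures from the Reactor Safety Study; in a real PRA the initiating-event frequency and each safeguard's unavailability would come from data and from detailed fault trees.

```python
# Minimal event tree sketch with hypothetical numbers.
# An initiating event starts the tree; each engineered safeguard
# ("Defense in Depth") then either works or fails. Fault-tree analysis
# would supply the failure probabilities; here they are simply assumed.

initiator_per_year = 1e-1          # hypothetical initiating-event frequency per reactor-year
safeguards = {
    "emergency core cooling": 1e-2,   # hypothetical fault-tree unavailability
    "backup power":           1e-2,   # hypothetical fault-tree unavailability
}

def core_melt_frequency(initiator: float, failure_probs: dict) -> float:
    """Frequency of the branch in which every safeguard fails.

    Of the many branches of the tree, only those ending in core melt
    matter; with independent safeguards that is the product below."""
    freq = initiator
    for p_fail in failure_probs.values():
        freq *= p_fail
    return freq

print(f"core melt: {core_melt_frequency(initiator_per_year, safeguards):.0e} per reactor-year")
```

The value of the exercise lies less in the final number than in the structure, which shows which safeguard failures dominate the result.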

Although the technique changed the whole understanding of the risks of large facilities, people were slow to grasp its significance. Too much attention was paid to the final number describing the effect of an accident on public health, and too little to the calculation procedure itself. Nuclear power operators, looking at the numbers and the comparison with other risks ([6]), merely said that "this proves that reactors are safe." This was an inadequate response. At its best, the event tree analysis procedure helps operators to understand their reactors and to make safety and operational improvements. In this a large fraction of the leaders in the industry failed, although a few understood the power of these methods early on. Anti-nuclear groups criticized the numbers, in many instances correctly, but failed to acknowledge that even if the more pessimistic numbers were correct, nuclear power was one of the safer energy technologies. This was in spite of a positive report by a review committee: Lewis et al. ([7]) correctly criticized details but praised the procedure. The US Nuclear Regulatory Commission failed to adopt the event tree analysis procedure in its licensing of new reactors, but this began to change after the Three Mile Island accident in 1979.

Rasmussen's study group had studied a GE Boiling Water Reactor and a Westinghouse Pressurized Water Reactor, but not a Babcock and Wilcox Pressurized Water Reactor. In an ideal world, Babcock and Wilcox would have followed up with a study of their own reactor design using Rasmussen's method; so would the owner and operator of each Babcock and Wilcox reactor, and the Nuclear Regulatory Commission would have requested it. If any one of them had done so, they could not have failed to realize that, due to a peculiarity of the design, a core melt in a Babcock and Wilcox reactor was about an order of magnitude more probable than in a Westinghouse one. After Three Mile Island more scientists became convinced of the importance and power of the event tree method. The practitioners developed a jargon, calling the procedure Probabilistic Risk Analysis (PRA), and a society, Probabilistic Safety Analysis and Management (PSAM), was created to discuss these procedures.([8])

But engineering covered only a part of the whole issue of understanding risks. If the core melts, and the containment vessel fails to hold, then the effect of the release on people involves a very different type of calculation, which I now describe. Although this had been discussed by the Atomic Energy Commission and the UN Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) in well over 100,000 papers and reports, it was not an engineering calculation.

The important feature of the engineering analysis is that, if the design and operation are done correctly, the various steps in an event can be statistically independent, so that the probability of ultimate disaster is the product of a number of (hopefully) small probabilities. It follows that both engineering design and analysis must focus on ensuring this statistical independence. Analysts must also focus on those actions or hazards that destroy the statistical independence, such as fire, flood, earthquake, sabotage, or terrorism, and combinations of these. Fukushima made it abundantly clear that earthquakes and floods are not statistically independent. By contrast, it is hard to envisage a coupling between an engineering disaster and the calculated (or measured) effect of a released pollutant on health.
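
A short numerical illustration of this point, again with hypothetical probabilities: with genuine independence the disaster probability is the product of the individual failure probabilities, but a single common-cause event that defeats every safeguard at once puts a floor under it.

```python
# Hypothetical illustration of why statistical independence matters.
p_fail = [1e-2, 1e-2, 1e-2]        # three safeguards, each failing 1 time in 100

p_all_fail = 1.0
for p in p_fail:
    p_all_fail *= p                # product of small probabilities: 1e-06

# A common-cause hazard (fire, flood, earthquake...) that disables all
# three safeguards at once destroys the independence:
p_flood = 1e-4                     # hypothetical probability of such a flood
p_disaster = p_flood + (1.0 - p_flood) * p_all_fail

print(f"independent safeguards:  {p_all_fail:.0e}")
print(f"with common-cause flood: {p_disaster:.1e}")   # dominated by the flood
```

Under these assumed numbers, the common-cause flood raises the disaster probability by two orders of magnitude, which is why design and analysis must hunt for such couplings rather than simply multiply small numbers.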

2.4. Risks of Toxic Chemicals and Substances

The effect of pollutants on health is not normally considered by the same people as the engineers who discuss failures of an engineered system. But a systematic approach began to appear about 1970, starting with the effects of radiation and radioactive materials. The major event was the establishment of the US Environmental Protection Agency (EPA), followed soon thereafter by counterparts in other countries. For much of the preceding century the USA had been concerned about the safety of food and drugs, and the Food and Drug Administration had been established to regulate these. There were two concerns: acute toxicity, where a person might be poisoned in short order by an unwanted high dose, and chronic effects, such as cancer, that arise after prolonged exposure to a lower dose. The first concern was 100 years old. Toxic levels had been established for many chemicals, some by bitter experience with people and some from animal data, mostly on rats and mice. The Environmental Protection Agency was encouraged and authorized to modify this procedure for the regulation of environmental pollution.

Obviously the most important information we can get is direct information on the effect a pollutant has upon people. In almost all cases this information comes from the unfortunate situations in which a group of people have been heavily exposed, with disastrous adverse results. Over the years a lot of information has been gained about acute doses, but the chronic effect of lower repeated doses has been a serious concern only for the last 40 or 50 years. As noted earlier, the British Health and Safety Executive were very clear-thinking in discussing these matters ([3]). A distinguished physician, Sir Edward Pochin, working at the time for the British Radiological Protection Board, located in the UK Department of Agriculture, also wrote a very clear paper covering both chemical pollutants and radiation ([9]). In the following ten years, until his death, he further developed his concept of the "index of harm". Unfortunately, the Radiological Protection Board's office and laboratory in Lowestoft, England, has since been emasculated.