On August 14, 2003 Modern Technology in New York City Stopped

Stability and Fragility of the Electric Power Grid in the U.S.

Harin J. Contractor

Scott Duncombe

Troid O. Edwards

Chris Whaley


Executive Summary

Blackouts are mainly caused by two physical features of the electrical grid. The first is the failure of electrical lines, caused by loading those lines above their rated capacity. The second is the grid's reliability, or its ability to maintain a synchronized AC frequency by adequately managing load and supply. As the market becomes deregulated, managing both of these features has become increasingly difficult without better information on the electrical grid's current load.

Electricity consumers currently pay average rather than marginal prices. By raising prices during peak periods, marginal pricing will decrease electricity demand during those periods. Such a mechanism allows consumers to purchase cheaper electricity during off-peak periods and reduces capacity strains at peak times. If correctly implemented, marginal prices can be set so that the electrical grid never reaches full capacity, which will greatly reduce the likelihood of catastrophic blackouts.

The current, confusing market structure for electricity, which consists of three distinct parts (generation, transmission, and distribution), is the result of efforts to move from a regulated natural monopoly to a deregulated product chain. Smart Grid technologies, which encompass SCADA (supervisory control and data acquisition) and AMI (advanced metering infrastructure), offer a practical means to modernize the national grid and increase reliability and energy efficiency. A revolutionary change in customer participation in energy markets is realized by combining AMI with personal energy management (PEM) systems. Together, these two technologies deliver both cost and consumption information to the end user in real time, ensuring the customer participation that produces reliability and energy efficiency gains.

There are steps and policy recommendations that utilities, states, and the federal government can take to move toward a more effective, efficient, and environmentally conscious way to transmit power. Smart Grids are a sensible solution that will bring not only competitive, demand-responsive rates but also reliability to a traditionally unstable system. Pushing all states to adopt the provisions of the Energy Policy Act of 2005 will help clear many of the hurdles to a single grid system and the implementation of smart meters. Moving toward a cap-and-trade market will give utilities and consumers strong incentives to adopt AMI systems alongside renewable alternatives. Establishing real-time pricing (RTP) is critical to moving toward a Smart Grid system; giving providers a guaranteed rate of return and decoupling rates are essential to RTP. This is a long and expensive transition, but its benefits are already being seen around the world. Once integrated grids with two-way responses are established, the energy system will be stable, saving money and reducing wasteful consumption.


Background

On August 14, 2003, modern technology in New York City stopped. On the tails of a summer heat wave, the electricity used to power air conditioners, light bulbs, and traffic signals came to a halt. New York was not alone--all across the Northeast, from Detroit to Toronto, the electrical power grid failed for three days. The 2003 Northeast Blackout affected 50 million people in the United States and Canada and caused over $10 billion in economic losses. As a direct result of the Blackout, 11 people in New York City died, mainly from the unrelenting heat or from an inability to contact emergency personnel. Although looting marred past blackouts, during the 2003 Blackout many New York restaurants served food for free, since their perishable food would have spoiled anyway.

The 2003 Blackout was not a random event. Instead, several technical problems and mistakes led to the Blackout. The days preceding the Blackout were among the hottest of the year. On the day of the blackout, the temperature in New York City was 93 degrees. On top of their normal uses of electricity, people across the Northeast had their air conditioners on full blast. As a result, there was an exceptionally high demand for electricity in the eastern corridor. As an abnormal amount of electricity flowed through the power lines, they began to heat up as they reached their full capacity. The increased heat caused the lines to elongate and sag. Eventually, a power line operated by FirstEnergy in rural Ohio stretched just low enough to hit an underlying tree, causing the line to go down. In normal circumstances, backup generators would kick in and power from other sources would be diverted to replace the inoperable line. However, because demand was so high, there were no other power sources with the capacity to increase their production. The backup generators also quickly failed to meet such a large electricity demand. As other lines carried increased loads, they also began to fail, and downed power lines cascaded throughout the entire Northeast grid, bringing all electricity distribution to a standstill.[1]

Many people blamed the 2003 Blackout on technical failures and the need for more investment in the electrical grid. Although on the surface these supply-side and technical failures are the basic cause of the Blackout, they are not the underlying or most important cause. Mechanical failures are outweighed by the absence of consumer incentives, such as charging more for electricity when demand is high, that would reduce electricity consumption during peak periods. However, policy officials discarded methods to operate the current electrical grid more efficiently because electricity would cost more when people demand it most. Of the rejected incentives, the easiest to establish is marginal-cost pricing, a real-time price based on the contemporaneous cost of production. Instead, policy officials advocated massive spending to “modernize” the failing power grid. However, regardless of how many new power lines regulators build, until new technology allows consumers to face such incentives, particularly marginal prices for power, blackouts like the 2003 Northeast Blackout are inevitable.

The Physics of Failure

There are two fundamental material causes of the 2003 Blackout. First, there is the physical capacity of electrical power lines, which is a function of the heat generated in those wires by electrical current. Second, there is the problem of the reliability of the interconnected grid. Short-term failures, such as the overloading of electrical lines, are compounded by the grid's poor reliability. To better understand how a smart grid could prevent these types of failures, we first must understand how these breakdowns occur for individual electrical power lines and, more importantly, how the structure of the grid propagates these errors.

The North American electrical grid is a system of generators and consumers interconnected by thousands of miles of copper wiring. It is the largest single machine on earth, and its complexity increases exponentially as it becomes more interconnected and less centralized. Originally, the grid was built by utilities with vertical monopolies, meaning they owned the power plant, the high-voltage long-distance transmission lines, the lower-voltage sub-transmission lines, and the final distribution network that ultimately delivered power to the user, as illustrated in Figure 1. In the last half century, and especially in the last decade, the grid has been opened up to deregulate the electrical market. Yet these market changes have not been supplemented by upgrades to the transmission facilities, a fact which has dire consequences as the act of delivering power becomes increasingly complex.

Figure 1: A simplified version of the grid. Originally, all of these components were owned and operated by the same entity. Since deregulation, power can be generated, transmitted, and distributed by different entities.

Deregulation increased congestion on power lines, leading to overloading like that which caused the 2003 Blackout. Prior to deregulation, one power company's transmission lines would only transmit electrical power owned by that company. That is, a line would carry electrical power that the company was either actively generating or buying from another utility. Deregulation changed this practice, opening electrical lines for other utilities to buy and sell power through them. This created an incentive for power companies to sell the extra capacity in their electrical lines, causing congestion. This congestion meant that power companies had less spare capacity to handle sudden emergencies, leading to overloading and line failures.

A line failure occurs when an electrical line surpasses its capacity, causing the wire to hit a ground and trip a circuit breaker. Lines are rated depending on temperature, current, and the site's conditions. As current increases, a power line heats up according to the physical relationship P = I²R (I is current, R is the wire's resistance), so heating grows with the square of the current. As the line heats up, the copper expands and the wire sags between electrical poles. If the sagging wire hits something that grounds it, like a tree, it will draw even more current, triggering a circuit breaker to shut the line down. According to Kirchhoff's laws, current in a parallel system divides among all available paths, so disconnecting one path increases the current along all the other paths, pushing the rest of the grid closer to capacity and possibly triggering more lines to go down.
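The sketch below illustrates these two effects with made-up numbers; the three-line layout, resistances, and currents are illustrative assumptions, not figures from the 2003 events. It shows that Joule heating grows with the square of the current, and that when one of several parallel lines trips, the surviving lines pick up its share and heat up substantially more.

```python
# A minimal sketch (illustrative values only) of Joule heating (P = I^2 * R)
# and current redistribution when a parallel path is lost.

def joule_heating(current_amps: float, resistance_ohms: float) -> float:
    """Power dissipated as heat in a conductor, in watts (P = I^2 * R)."""
    return current_amps ** 2 * resistance_ohms

def parallel_currents(total_current: float, resistances: list[float]) -> list[float]:
    """Split a fixed total current across parallel paths (current divider):
    each path carries a share inversely proportional to its resistance."""
    conductances = [1.0 / r for r in resistances]
    total_g = sum(conductances)
    return [total_current * g / total_g for g in conductances]

# Three identical parallel lines carrying 3,000 A in total (assumed values).
total_i = 3000.0
lines = [0.5, 0.5, 0.5]                       # ohms per line, illustrative
before = parallel_currents(total_i, lines)    # ~1,000 A per line

# One line sags into a tree and trips; the same load now splits two ways.
after = parallel_currents(total_i, lines[:2]) # ~1,500 A per line

print("heating per line before trip: %.0f W" % joule_heating(before[0], 0.5))  # ~0.50 MW
print("heating per line after trip:  %.0f W" % joule_heating(after[0], 0.5))   # ~1.13 MW (2.25x)
```

A 50% increase in current thus more than doubles the heat in each remaining line, which is why the loss of a single overloaded line can push its neighbors past their ratings.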

In the case of the 2003 Blackout, the failure of a power line in Ohio created a cascading effect in which electrical lines already running at high capacity were overloaded and major sources of generation were disconnected from major sources of consumer demand.[1] The failure in Ohio came during the summer of 2003, when power usage was at its highest due to air conditioning, and caused a power company to draw heavily from other generating sources. This increased the current on several electrical lines which had not been well maintained and which, coupled with the summer's high temperatures, caused a wire to sag into a tree and fail. More lines failed as load was further distributed onto other overstressed lines, cutting generators off of the grid. Severe voltage dips in the State of Ohio, as generators were cut out of the grid, caused Ohio to draw heavily from Michigan, tripping more lines. Faults in Michigan and Ohio took down major transmission lines which were moving power toward the Eastern Seaboard; this caused an enormous imbalance in load and supply, causing more generators to fail and blacking out cities from Cleveland to Ontario to New York City. Running power lines near capacity meant the system was unable to deal with the increased stress, causing the blackout to cascade across the entire power grid.

This narrative of the 2003 Blackout shows the importance of proper line maintenance and of making sure that lines are not run at full capacity, so that they can handle sudden changes. But it also reveals the importance of grid reliability, which is maintained by balancing load and supply. As load, or demand, in the eastern corridor surpassed supply from western power sources in Michigan and Ohio, the frequency of the AC power supply slowed down. The grid's frequency, 60 Hz, is an important constant of the AC power grid. The grid's structure works to keep this feature constant and is subdivided into several ‘interconnects’ which maintain a highly synchronized frequency. Inside these interconnects there can be multiple power companies and utilities, all of which must stay within their specific interconnection's frequency range, which for the Eastern Interconnect is 20 mHz.[2] Figure 2 illustrates the frequency boundaries for the Eastern Interconnect. The importance of a stable AC frequency, or the reliability of electrical grids, was demonstrated by the 2003 Blackout, when a sudden spike in load threw off the AC frequency and caused vital generators and transformers to shut down, creating further disruptions to the AC frequency.
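A rough sense of why a load/supply imbalance moves the frequency can be had from the aggregate swing equation. The sketch below uses assumed values for the system's inertia, size, and the size of the imbalance; none of these are figures from the 2003 events or from this report.

```python
# A minimal sketch (assumed parameters) of how a sustained generation/load
# imbalance pulls the grid frequency away from 60 Hz, using the aggregate
# swing equation  df/dt = f0 * (P_gen - P_load) / (2 * H * S_base).

F0 = 60.0           # nominal frequency, Hz
H = 5.0             # aggregate inertia constant, seconds (illustrative)
S_BASE = 100_000.0  # system base, MW (illustrative)

def simulate_frequency(p_gen_mw: float, p_load_mw: float,
                       seconds: float = 5.0, dt: float = 0.01) -> float:
    """Integrate the swing equation for a constant generation/load imbalance."""
    f = F0
    for _ in range(int(seconds / dt)):
        dfdt = F0 * (p_gen_mw - p_load_mw) / (2.0 * H * S_BASE)
        f += dfdt * dt
    return f

# Balanced system: frequency holds at 60 Hz.
print(simulate_frequency(80_000, 80_000))   # 60.0 Hz

# A 2,000 MW generation shortfall: the frequency sags to roughly 59.4 Hz
# within five seconds, far outside the tight band an interconnect tries to hold.
print(simulate_frequency(78_000, 80_000))
```

Even a few tenths of a hertz of deviation is enough to start disconnecting equipment, which is why the narrow frequency band shown in Figure 2 matters so much.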

Figure 2: Frequency Range for the Eastern Interconnection. At a lower frequency, the maximum voltage is delivered at higher rates, whereas at a higher frequency the maximum voltage is delivered at a lower rate.

A key part of maintaining an electrical network's frequency is balancing load and supply; we will investigate supply first. Supply is managed by using different types of power sources, which generally fit into one of three categories based on their operating cost and responsiveness: base load, intermediate, and peaking. Base load generators are power sources with very low operating costs; they generate 60-70% of the annual energy requirements, run constantly, and shut down only for maintenance or emergencies because of their long restart times. Intermediate generators tend to be less efficient than base load units, but require less capital and can easily be ramped up or cooled off to change power output. Intermediate sources account for 20-30% of the annual energy output. Peaking generators are used only to cover times of maximum load and can be brought online in less than ten minutes to meet a sudden demand. Peaking generation generally accounts for 5% of the total energy requirements of a system. As these three categories indicate, electrical supply is created from a variety of sources which can be called upon at a variety of speeds. Power companies must balance the perceived load by creating supply from base load, intermediate, and peaking units.[3]
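The toy sketch below shows the basic logic of stacking these three categories: cheap, slow base load units are committed first, and expensive, fast peaking units are dispatched only when demand exceeds what the cheaper units can cover. The unit names, capacities, and costs are illustrative assumptions, not data from this report.

```python
# A minimal merit-order dispatch sketch with assumed unit data:
# (name, capacity in MW, marginal cost in $/MWh).
UNITS = [
    ("base_load",    6000, 20.0),   # cheap, always on, slow to restart
    ("intermediate", 3000, 45.0),   # ramps over hours
    ("peaking",      1000, 120.0),  # expensive, online in minutes
]

def dispatch(load_mw: float):
    """Fill the load from the cheapest unit to the most expensive one."""
    remaining = load_mw
    schedule = []
    for name, capacity, cost in UNITS:
        output = min(capacity, max(remaining, 0.0))
        schedule.append((name, output, cost))
        remaining -= output
    if remaining > 0:
        raise RuntimeError(f"{remaining:.0f} MW of load cannot be served")
    return schedule

# Off-peak night: base load alone covers demand.
print(dispatch(5500))
# Hot summer afternoon: peaking units must be brought online.
print(dispatch(9800))
```

The same ordering explains why peak demand is so expensive to serve: every additional megawatt at the top of the stack comes from the costliest, least efficient units.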

Table 1: Possible Load Management Techniques[4]

  Utility Controlled                                   Customer Controlled
  Supply Side       Demand Side                        Backup or Storage                            End-Use Modification
  Energy Storage    Interruptible Power                Distributed Energy (or On-Peak Generation)   Load Deferral
  Power Pooling     Remote Control of Customer Load    Customer Energy Storage                      Load Curtailment

  - Voluntary in response to incentives, or as part of a contractual obligation.

In the last paragraph, we described how supply is created from a variety of generating sources, each with a different ability to respond to change. But there is a second component to the AC frequency, and that is load: excess supply creates a higher frequency, whereas insufficient supply (high load) creates a lower frequency. Power companies lack up-to-the-minute information on the existing load; instead, it is estimated with models which track consumers' patterns. This lack of information severely limits the actions of both power companies and consumers. Power companies act based on modeling or when they see sudden disturbances at the high-voltage transmission level, which severely limits their possible courses of action. See Table 1 for possible steps companies can take. Most of these solutions, especially on the consumer side, are impossible because consumers have very little knowledge of the grid's current load conditions and are generally charged for electricity based on the average load over a longer time period. Even action by the utility is limited to what can be done with supply, as its demand-side options are limited to cutting off customers. The information lag on consumer load limits preventative action on the part of both the power company and the consumer, leading to more breakdowns.[5]
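The small example below makes this information gap concrete: a utility scheduling against a historical load profile plus a fixed reserve can find itself short on an unusually hot afternoon, and without real-time meter data it learns of the shortfall only once the transmission system is already stressed. All numbers are invented for illustration.

```python
# A minimal illustration (made-up numbers) of scheduling supply from a
# historical load profile instead of real-time metered load.

# Typical hourly load for a summer afternoon, in MW (assumed values).
historical_profile = {14: 8200, 15: 8500, 16: 8700, 17: 8600}

# Actual load during a heat wave, driven by air conditioning (assumed values).
heat_wave_load = {14: 9100, 15: 9600, 16: 9900, 17: 9700}

scheduled_reserve_mw = 600   # spare capacity held back, illustrative

for hour, forecast in historical_profile.items():
    actual = heat_wave_load[hour]
    shortfall = actual - (forecast + scheduled_reserve_mw)
    status = "OK" if shortfall <= 0 else f"SHORT by {shortfall} MW"
    print(f"{hour}:00  forecast {forecast} MW  actual {actual} MW  -> {status}")
```

With real-time consumption data, the same utility could see the gap building hour by hour and either dispatch more peaking capacity or signal customers to curtail load before lines are overstressed.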