New York Review of Books, Volume 54, Number 13 · August 16, 2007
They're Micromanaging Your Every Move
By Simon Head
The Social Life of Information
by John Seely Brown and Paul Duguid
Harvard Business School Press, 320 pp., $25.95
Bait and Switch: The (Futile) Pursuit of the American Dream
by Barbara Ehrenreich
Owl Books, 248 pp., $13.00 (paper)
The Culture of the New Capitalism
by Richard Sennett
Yale University Press, 214 pp., $25.00
The digital revolution of the 1990s seemed to mark a definitive break with the manufacturing economy that had thrived in the United States since the late nineteenth century. With the pervasive use of information technology (IT) by banks, insurance companies, hospitals, clinics, even warehouses and retail stores, the era of industrial mass production in the United States faded into the past. Also redundant were the blue-collar workers who had manned the old assembly lines. With 80 percent of the American workforce now employed in white-collar service industries, economists assumed that there was no longer any need for a large industrial proletariat with limited skills, passively taking orders from above.
In the years since the long economic boom of the 1990s came to an end in 2000–2001, there has been growing evidence that this view of recent economic history is flawed. In fact, the findings of the three books under review here, along with much recent research, suggest that methods of production based on top-down standardization and tight control of work are as influential in the digital economy as they were in the industrial economy. Drawing upon the virtually unlimited powers of computers to monitor the activities of employees and their use of information, these methods have simply been readapted for the white-collar workplace.
What is striking is how they have been used in ways that put skilled workers in many professions at a disadvantage. In an economy more and more populated by "knowledge workers"—people who work primarily with information, for which they develop special skills and expertise—one would expect the productivity, or output per person, and real income of employees to move upward together, as an increasingly skilled workforce benefits from its own improved efficiency. But since 1995, the year when the "new economy" based on information technology began to take off, incomes have not kept up with productivity, and during the past five years the two have spectacularly diverged. Between 1995 and 2006, the growth of employee productivity exceeded the growth of employee real wages by 340 percent. Between 2001 and 2006, the first six years of George W. Bush's presidency, this gap widened alarmingly to 779 percent.[1]
The gap helps explain why Wal-Mart casts such a long shadow over the US economy. Wal-Mart has demonstrated the effectiveness of applying industrial principles to the retail economy. It does so by combining an intensive use of information technology, a rapid growth of employee productivity, and a harsh, often punitive work regime that keeps even the most productive workers off balance and their wages at poverty levels. Studies have shown, for example, that the productivity of Wal-Mart employees has been as much as 41 percent higher than that of the company's competitors, yet shop floor workers are paid far less than at other discount stores. According to researchers at the United Food and Commercial Workers in Washington, the average hourly wage of UFCW members at the unionized Safeway, Albertsons, and Kroger supermarkets in California is $12.71; the comparable figure for Wal-Mart "associates" nationwide is around $9.[2]
The Wal-Mart approach is being driven by technologies known as "enterprise systems," or ES, which bring together computer hardware and software to standardize and then monitor the entire range of tasks performed by a company's workforce. The industry leaders in these systems are the Big Three of corporate computing: IBM, SAP, and Oracle. Also prominent are management consultants such as Gartner and Accenture, which advise corporations and public bureaucracies on which system to buy.
Corporate reliance on ES technology grew throughout the 1990s; by the year 2000 the Boston consultants AMR Research could write that most companies considered its use "as part of the cost of doing business, a necessary part of the organization's infrastructure."[3] Among manufacturers, wholesalers, and retailers like Wal-Mart, ES offers obvious economic advantages. It relies on electronic tags, sensors, and "smart" chips to identify goods and components at different stages of the production and distribution chain, a practice that has brought enormous gains in productivity. Such innovations allow managers to find out immediately not only that production and distribution are falling behind schedule, but also why. Equipped with a flow of detailed, up-to-the-minute information about the status of a particular person or object in the supply chain, managers can "drill down"—a key phrase in the ES world—and immediately find the source of error: a bottleneck at a warehouse in Kansas City; a dysfunctional work team on the line in Detroit; a parcel sent to Portland, Oregon, when it should have been sent to Portland, Maine.
But during the last ten years ES has increasingly been applied to complex white-collar businesses, public bureaucracies, and even universities. How can an automated regime designed to control the production and distribution of automobiles or VCRs be used to regulate the treatment of sick patients, the teaching of students in schools, the conversations between sales agents and customers, and the decisions to hire and fire employees? The answer is that ES technologies are able to reduce these complex human activities and judgments to a series of processes and outcomes that can be mapped out and programmed by a computer.
Nowhere have these technologies been more rigorously applied to the white-collar workplace than in the health care industry. The practices of managed care organizations (MCOs) have provided a chilling demonstration of how enterprise systems can affect the work of even the most skilled professionals, in this case the physician. The goal is to standardize and speed up medical care so that insurance companies can benefit from the efficiencies of mass production: faster treatment of patients at reduced cost, with increased profits earned on increased market share.
In the mid-1990s MCOs relied heavily on a procedure known as "utilization review" to contain costs and standardize treatments. Case managers without medical training, relying on guidelines often derived from proprietary databases, ruled on whether a requested treatment would or would not be paid for. This micromanagement of doctors' diagnostic reasoning provoked such an outcry from patients and physicians alike that in 2000 leading MCOs such as United Healthcare and Aetna announced that they were giving up such reviews and freeing doctors from administrative control.[4]
But there has been less to this liberation of physicians than meets the eye. Doctors I first interviewed ten years ago now say that MCO case managers simply interfere with their decision-making after, rather than before, treatments are carried out. The same case managers, armed with the same guidelines, contest whether procedures such as MRIs and CAT scans were performed according to strict and detailed criteria, then deny or reduce payments by alleging that the tests fall short of their standards. Physicians must still employ full-time assistants whose sole task is to wrangle with MCOs over the minutiae of payments and treatments.
A second and cruder method used by MCOs to enforce medical industrialization is to keep payments to doctors for patient visits so low that doctors must dramatically increase the number of patients they see to cover their overhead. In his 1999 book on the history of medical education, Time to Heal, Dr. Kenneth Ludmerer chronicles the acceleration of "patient throughput." During the late 1980s most physicians felt that examining thirty patients a day was "pushing the limit." But by the mid-1990s many MCOs required doctors to see twenty-five to thirty patients a day, and some primary care physicians reported having to treat "as many as seventy patients a day." Doctors spent an average of eight minutes talking to each patient, less than half as much time as a decade earlier. Ludmerer told me recently that he believes the pressure to increase patient "throughput" has continued to grow over the past six years.[5]
The managed care approach has been poorly suited to the realities of illness. For example, research published in 1999 in the Journal of the American Medical Association showed that for-profit health care providers that relied on this kind of standardization, such as Aetna and Humana, performed significantly worse than their nonprofit counterparts in the treatment or prevention of cancer, diabetes, and heart disease.[6] But many of these health care companies believe that ES technologies have made them profitable, and it seems unlikely that these practices will be discarded anytime soon.
In The Culture of the New Capitalism, a book based on a series of lectures given at Yale in 2004, Richard Sennett describes how the widespread use of enterprise systems has given top managers much greater latitude to direct and control corporate workforces, while at the same time making the jobs of everyday workers and professionals more rigid and bleak. The call centers of the "customer service" industry, where up to six million Americans work, provide an egregious example of how these workplace rigidities can make life miserable for employees.[7]
At companies such as AmTech and TeleTech, to which many corporations outsource their "customer relations management," agents must follow a script displayed on their computer screens that spells out, word for word, the conversation they must have with customers. Monitoring devices track every facet of their work: minutes spent per call, minutes spent between calls, minutes spent going to the bathroom. At the same time managers can speed up or reconfigure this digital assembly line simply by throwing a switch and reprogramming the software—specifying less time per call and between calls—much as Henry Ford controlled the line at his Detroit plants in the 1920s.
The most powerful passages in Sennett's book describe how these unnerving changes are destroying aspects of white-collar employment that he believes are essential to the well-being of workers, whether they are nurses, call center agents, bank officers, or mid-level managers at Con Edison. He describes how the spread of ES has brought a declining emphasis on workers' creativity and ingenuity, and how the ceaseless reengineering of the way businesses operate has destroyed any sense of community in the workplace. The concept of a career has become increasingly meaningless in a setting in which employees have neither skills of which they might be proud nor an audience of independently minded fellow workers that might recognize their value.
"An organization in which the contents are constantly shifting," Sennett writes of the new-model corporation,
requires the mobile capacity to solve problems; getting deeply involved in any one problem would be dysfunctional, since projects end as abruptly as they begin.... "I can work with anyone" is the social formula for potential ability. It won't matter who the other person is; in fast-changing firms it can't matter. Your skill lies in cooperating, whatever the circumstances....
As we have seen, in the workplace [these changes] produce social deficits of loyalty and informal trust, they erode the value of accumulated experience. To which we should now add the hollowing out of ability.
Of course this is not true of the new elite of senior managers, consultants, and software engineers who design, install, and maintain the technologies that are now used to manage workforces. Their skills are in heavy demand, and they are well paid. But Sennett is concerned about the people who work within the systems they devise.
Sennett does not pose the question of whether companies run according to enterprise systems will be innovative, productive, fast-growing, and profitable. The evidence is not reassuring. The McKinsey Global Institute's authoritative study of US productivity growth between 1995 and 2000, the golden years of the digital revolution and the IT boom on Wall Street, found that US productivity gains were very narrowly focused within six of the economy's sixty sectors. Gains were heavily concentrated in the Wal-Mart economy of wholesale, retail, and distribution, and within the IT industries themselves. But service industries such as health care and banking experienced what McKinsey tactfully describes as "small productivity growth decelerations."[8]
In their book The Social Life of Information John Seely Brown and Paul Duguid explore the limitations of ES in the service economy, and give a fascinating account of how the technologies of ES might be used there more effectively. At Xerox's Palo Alto Research Center (PARC) Brown and a team of researchers developed a technology to support the technicians who repair and service Xerox machines. Like many other large US corporations, Xerox tried in the early 1990s to transfer the work practices of the assembly line to the very different world of the service economy. Technicians were supplied with an automated system specifying the various ways in which a Xerox machine could break down, along with the remedy to be followed in each case. The problem was that the machines kept going wrong in ways the automated system hadn't anticipated, and they often developed several problems simultaneously. The technicians responded by throwing away the rule book and relying on their own knowledge. Brown and Duguid's description of the technicians' thought processes resembles the accounts of diagnostic reasoning given in medical textbooks: first examine the patient's symptoms; then consider all the possible causes in order of probability; treat each of these until the right one is discovered and the correct remedy prescribed.
Brown and his team at PARC persuaded Xerox management that these ad hoc practices should be incorporated into a new system called Eureka. Any technician who had solved a difficult case would simply write up his findings, and a committee of technicians would decide whether the material merited inclusion in the Eureka database. Since Xerox adopted it, Eureka has achieved spectacular results: in its first three years it logged 30,000 case histories from employees and is estimated to have saved Xerox $100 million.[9] Eureka is a striking example of how the old workplace values described by Sennett can be preserved and even enhanced by the use of information technology. For the technicians, Xerox became a more desirable place to work because it recognized and rewarded their creative contributions, while Xerox's managers found that employees were more loyal and productive.