ADDING VALUE THROUGH CORPORATE GOVERNANCE

Robert A.G. Monks

May 2010

The Vision

The modern business corporation emerged as the first institutional claimant of significant unregulated power since the nation state established its title in the sixteenth and seventeenth centuries.

—Abe Chayes[i]

Abe Chayes, a former Kennedy administration official and long-time Harvard Law professor, wrote those words at the outset of what might be thought of as America’s own “Thirty Glorious Years” — that three-decade span from the late seventies through 2008 when it seemed possible that private enterprise could operate on a global stage, free from the constraints of governmental regulation and oversight. The vision was simple and stirring, and in many ways irresistible: corporate efficiency could co-exist with democracy; value would be maximized in a self-regulatory regime.

Writing in the Stanford Law Review, another professor, David Engel,[ii] precisely articulated the standards to which corporations would need to subscribe in order to legitimate this unregulated power within a democratic society:

• Disclose fully the impact of their operations on society

• Obey the law

• Restrain their impact on government – both elections and administration

Today, we are surrounded by the wreckage of this seemingly noble experiment. “Self-restraint” proved largely to be no restraint. Rather than legitimate the power handed them, corporations have ensured the ultimate need for government involvement and the end of the dream.

There is, however, a silver lining in all this. The financial crisis of 2007-2010 has created the rare political climate in which one can realistically suggest pre-emptive federal action, particularly to redress the unintended consequences of earlier actions. And the Supreme Court’s recent decision in the Citizens United case has created not only a compelling need to do so but an inherent logic in how to proceed. First, though, some background.

The History

Government and the Corporation

The Constitution does not mention the word corporation, but suspicion of centralized corporate power was an early part of the American political landscape, culminating in President Jackson’s refusal to recharter the Second Bank of the United States. By the middle of the 19th century, corporate charters were available as of right, but corporations were limited as to size, purpose, and tenure. Power, though, was beginning to drift the corporation’s way. By ensuring that states would always compete for charters through ever-more-diluted restrictions, the federal system resulted in unrestricted chartering of corporations by the end of the nineteenth century. In 1886, in an important aside, the Supreme Court concluded that corporations were “persons” entitled to the protections of the Fourteenth Amendment to the Constitution and, therefore, legal participants in the political life of the country.

Corporate power continued to be a critical element of disagreement between the political parties in the presidential elections of the early 20th century. In general, Republicans backed Wall Street and big business, while Democrats sought to outlaw monopolies and unfair practices. But Theodore Roosevelt consistently battled the “trusts,” and in his 1912 campaign (running now under the banner of the Progressive Party, rather than as a Republican), Roosevelt recommended strong federal regulation to offset corporate power. Louis Brandeis, who would join the Supreme Court four years later, concurred. Large corporate power cannot be controlled from without, Brandeis argued: “We believe that no methods of regulation ever have been or can be devised to remove the menace inherent in private monopoly and overweening commercial power.”[iii]

As ownership of the great enterprises passed from the “robber barons” to their heirs and the general public, Wall Street and financialization became the critical factors in corporate governance. Transactions affecting corporate capital and control generated fees, which became the informing energy of the American capitalist system, a situation that has prevailed into the 21st century.

In the earliest days of financialization, control over corporations was largely exercised by the respected leaders of the banking houses, epitomized by J.P. Morgan, but this soon became “control” only in the loosest sense. In 1932, Adolf Berle and Gardiner Means vivisected the modern corporation and found a virtually omnipotent management and an impotent shareholdership. A quarter century of unparalleled corporate law reform followed, and the American business corporation emerged from World War II still bearing the stains of fifteen years of Depression and disgrace.

For corporations, what might have been the worst of times proved instead to be the beginning of the best of them. Industrial production, which had built up to historically high levels in order to “win the war,” barely missed a beat as the “military-industrial complex” expanded to win the peace, consumer appetites swelled after years of price controls and rationing, and foreign competition lay either destroyed or dormant. Thus, US global hegemony coupled with domestic oligopoly allowed the rarity of shared prosperity, to the extent that corporate CEOs could be thought of as “philosopher kings.” This dimmed any serious analysis of the governance of corporations.

Who Owns Corporations?

In 1958, Joseph Livingston surveyed the lot of the shareholder in this reformed world — a world of SEC regulation, extensive disclosure requirements, elaborate proxy machinery, Stock Exchange self-discipline, corporate Good Citizenship, People's Capitalism, and Corporate Democracy. His finding? Exactly what Berle and Means had discovered a quarter century earlier: A virtually omnipotent management and an impotent shareholdership.

These findings, of course, will not surprise today’s readers. Over the succeeding decades, corporate failures – both to learn and to conform to societal standards – have repeated themselves with almost metronomic regularity. Every decade, it seems, government predictably considers and often passes legislation: in the late seventies, the Foreign Corrupt Practices Act; in the eighties, the Corporate Democracy Act (which, in fact, did not pass); in the early years of this century, Sarbanes-Oxley; and most recently, Barney Frank and Chris Dodd’s almost quixotic struggle to produce a pre-election reform bill.

Whatever the supposed cure-of-the-moment, the result is highly predictable: public concern diminishes, the lobbies flourish, and the cycle starts again. Criminal malefactions dot almost every decade – General Electric and Westinghouse in the electrical equipment price-fixing conspiracies; Armand Hammer and George Steinbrenner for violating election contribution laws; Charles Keating and the S&L crisis; Ivan Boesky and Michael Milken in the eighties; WorldCom and Enron in the early years of this decade; and the finance sector as a whole in the new century. Outrage invariably follows. The talking heads become screaming ones, but in the end, human nature appears to triumph over all manner of controls.

A Fundamental Flaw

The various reform efforts chronicled above were informed by a belief that shareholders elect directors, that directors have a fiduciary duty to protect the shareholders’ interest, and that management is accountable to the board. It has long been clear — and now it is demonstrably so — that each of these premises is probably false.

By extension, then, it follows that the American-style “trente glorieuses” — as the 30-year, post-war rise of the French economy became known — was also fundamentally flawed. It took as its assumption, as we have seen, that a corporation unfettered by any restraint — whether from government or from owners — was the best corporation. The reality, as we have also seen, is that self-restraint in the corporate frame of mind yields no restraint. So how did we get there? Incrementally, it turns out, and often with the best of intentions. To cite only a few of the highlights:

• In 1971, while Lewis Powell was still a prominent lawyer advising the US Chamber of Commerce on the need to organize a business response to its diminished state, he was appointed and confirmed as a Justice of the Supreme Court of the United States. From that “bully pulpit,” he effectively and discreetly continued to provide the informing core for the emerging corporate hegemony.

• Inspired by Powell’s appointment and advocacy, conservatives organized the Federalist Society to screen, educate, and ultimately approve appointments to the federal bench. The new corporate energy also created the Heritage Foundation and revived the American Enterprise Institute, assuring a consistent and continuing conservative voice.

• This wasn’t just a conservative revolution, however. A pervasive sense of frustration with federal involvement in business — a belief that government is the problem, not the solution — led Democratic President Jimmy Carter to deregulate the airlines, and more deregulation quickly followed.

• In the UK, Margaret Thatcher denationalized every business she could get her hands on and sent a message to continental Europe.

• Back in the U.S. and with momentum on his side, new president Ronald Reagan fired the striking air traffic controllers, confirming the irrelevance of organized labor, and the rout was on. Reagan handed the ball off to George H.W. Bush, who had served as the deregulation point man while Vice President. Bush passed it on (reluctantly) to Bill Clinton, who had no stomach for a populist crusade against Big Business; and Clinton yielded to George W. Bush, who had not fallen far from his father’s tree, at least with respect to corporate power.

Regulation didn’t cease and desist during these years. The administration of Richard Nixon, for example, imposed multiple new standards affecting particular aspects of corporate impact on society – the environment and product safety. Under Reagan and others, Big Business even showed that it was not against all regulation. Indeed, many of the most expensive and daunting licensing practices have been quietly supported by the major companies, since these regulations effectively block the emergence of new competition. And in fact the single consistent element of US government involvement in business over the entire past has been just that: the enforcement of antitrust laws against anti-competitive behavior.

During Reagan’s administration, enforcement was so slender that major private law firms specializing in anti-trust shut their doors. Major anti-trust cases – those against IBM and Microsoft – were ultimately resolved in the defendants’ favor thanks to deep pockets, skillful counsel, and the patience to await a more forgiving presidential administration.

Even when the financial crisis of 2007-2010 appeared to require aggressive entry by government into failing sectors of the economy, the intrusion was done largely on corporate terms. Both political parties participated in “bailouts” of financial institutions, automobile companies, and GSEs (government sponsored enterprises), especially Fannie Mae and Freddie Mac. Rather than simply “nationalizing” target companies, the government selectively chose tools deemed appropriate for expressing the public interest in the operation of these companies without, in many instances, making any serious effort to tether the corporate leadership that had helped bring on the crisis.

The government’s “cuckoo egg”-like presence in various companies might have humbled some of them, but on the whole, it did little to alter the situation. Ownership was still effectively internal in most publicly held corporations, with the greatest share in the hands of the corporation’s own reigning monarch.

The CEO as Philosopher King

There never has been a strong intellectual justification for the notion of the corporate Chief Executive Officer as a dispenser of public good. Berle, for example, engaged in a spirited public dialogue with Professor Merrick Dodd in the later 1940s and concluded that the CEO as public fiduciary probably was the prevailing reality of the time, although he continued to believe that a chief executive’s focus should be on optimizing the corporation’s value.

In recent decades, the debate over the role of CEOs has broadened to include executive pay, but here again, we are without any persuasive explanation as to why top executives should suddenly earn ten or twenty times more – by all measures of comparison – than they ever did before. I confronted the formidable Lee Raymond at the 2006 ExxonMobil Annual Meeting with this simple question: “What is the justification for paying yourself 16 times more than your predecessor received?” No answer was forthcoming, in large part because no answer was required. Like all annual meetings, at this one all the cards were held by management, and thus shareholder-owners such as myself were supplicants at best and, in reality, mostly stage props.

Executive pay, in short, is the symptom. The disease is an executive power that is accountable to no one. No legitimate governance system can be based on the unaccountable power of senior executives, and in the U.S., no existing governance system seems capable of reining them in, least of all the ones appointed to do the task.

Boards of Directors

Peter Drucker has long raised the question as to whether the current standard of board functioning is so unsatisfactory as to require structural change. Nearly 30 years ago, in The Bored Board, he wrote: “Whenever an institution malfunctions as consistently as boards of directors have in nearly every major fiasco of the last forty or fifty years, it is futile to blame men. It is the institution that malfunctions.”[iv] In the years since, the inability of any portion of the corporate governance structure to deal effectively with holding top management to account — see the discussion of executive compensation, just above — compels the conclusion of continuing systemic board failure.

Simply put, if shareholders cannot hold the CEO accountable for his compensation, how can they assume they exercise effective accountability in any other area? One might ask essentially the same question about the board nostrums commonly put forward: ever larger “majorities,” ever more extensive definitions of “independent.” Why have faith in either solution when the problems persist unchecked?

One definition of madness is repeating the same action and expecting a different result, yet this is madness seemingly without any ready cure. Corporate America has opposed even a scintilla of suggestion that shareholders participate in the nomination of board candidates. Former SEC Chairman Donaldson’s proposal for token shareholder involvement was so plainly unlikely to lead to the election of even a single truly “independent” director that one is bewildered by the violent rhetoric of opposition which persists to this day. The usual reason given is the need to maintain the confidentiality and collegiality of board functioning. Sherlock Holmes would look further, but we need not do so here.

Squaring a Circle

Perhaps, then, it is time to recognize that a fundamental and irreconcilable conflict exists in the perception of what boards should do and how they should be constituted. Our efforts to achieve functionality within the context of the traditional single board can be understood as the metaphoric inability to square a circle. We cannot hope to make progress until — once and for all — we face up to the reality that a self-selecting board cannot ever meet the very real needs for independence at critical points in the governance structure.