Corporate Governance in America: A Brief History
Entrepreneurial, Managerial, and Fiduciary Capitalism
In the first part of the twentieth century, large U.S. corporations were controlled by a small number of wealthy entrepreneurs—Morgan, Rockefeller, Carnegie, Ford, and DuPont, to name a few. These “captains of industry” not only owned the majority of the stock in companies such as Standard Oil and U.S. Steel, but they also exercised their rights to run these companies.
By the 1930s, however, the ownership of U.S. corporations had become much more widespread. Capitalism in the United States had made a transition from entrepreneurial capitalism, the model in which ownership and control had been synonymous, to managerial capitalism, a model in which ownership and control were effectively separated—that is, in which effective control of the corporation was no longer exercised by the legal owners of equity (the shareholders) but by hired, professional managers.
With the rise of institutional investing in the 1970s, primarily through private and public pension funds, the responsibility of ownership became once again concentrated in the hands of a relatively small number of institutional investors who act as fiduciaries on behalf of individuals. This large-scale institutionalization of equity brought further changes to the corporate governance landscape. Because of their size, institutional investors effectively own a major fraction of many large companies. And because this can restrict their liquidity, they may de facto have to rely on active monitoring (often alongside other, smaller activist investors) rather than on trading. This model of corporate governance, in which monitoring has become as important as, or more important than, trading, is sometimes referred to as fiduciary capitalism.[1]
The 1980s: Takeovers and Restructuring
As the ownership of American companies changed, so did the board-management relationship. For the greater part of the 20th century, when managerial capitalism prevailed, executives had a relatively free rein in interpreting their responsibilities toward the various corporate stakeholders and, as long as the corporation made money and its operations were conducted within the confines of the law, they enjoyed great autonomy. Boards of directors, mostly selected and controlled by management, intervened only infrequently, if at all. Indeed, for the first half of the last century, corporate executives of many publicly held companies managed with little or no outside control.
Example 2.9 – Modern Takeovers
In a recent takeover case, Walt Disney Co. announced it would take over 21st Century Fox at a price of $71.3 billion. The deal unites the two companies’ Marvel franchises, adding Fox’s Deadpool to a Disney stable that already includes Star Wars, along with Fox television, FX Networks, and National Geographic. The now smaller Fox Corp. will operate as a stand-alone company and retain ownership of a broadcast network and its affiliates, as well as Fox News, Fox Business, and Fox Sports. Disney hopes to position itself against increasing competition from Netflix, Amazon, and possibly Apple. Unfortunately, as Disney consolidates its properties, thousands of people are likely to lose their jobs.
Source: NPR.org, Disney Officially Owns 21st Century Fox, Zeqing Liu, Spring 2019
In the 1970s and 1980s, however, serious problems began to surface, such as exorbitant executive payouts, disappointing corporate earnings, and ill-considered acquisitions that amounted to little more than empire building and depressed shareholder value. Led by a small number of wealthy, activist shareholders seeking to take advantage of the opportunity to capture underutilized assets, takeovers surged in popularity. Terms such as leveraged buyouts, dawn raids, poison pills, and junk bonds became household words, and individual corporate raiders, including Carl Icahn, Irwin Jacobs, and T. Boone Pickens, became well known. The resulting takeover boom exposed underperforming companies and demonstrated the power of unlocking shareholder value.
Of lasting importance from this era was the emergence of institutional investors who knew the value of ownership rights, had fiduciary responsibilities to use them, and were big enough to make a difference.[2] And with the implicit assent of institutional investors, boards substantially increased the use of stock option plans that allowed managers to share in the value created by restructuring their own companies. Shareholder value, therefore, became an ally rather than a threat.[3]
The Meltdown of 2001
The year 2001 will be remembered as the year of corporate scandals. The most dramatic of these occurred in the United States, at companies such as Enron, WorldCom, and Tyco, but Europe also had its share, with debacles at France’s Vivendi, the Netherlands’ Ahold, Italy’s Parmalat, and ABB, a Swiss-Swedish multinational company. Even before these events fully unfolded, a rising number of complaints about executive pay, concerns about the displacement of private-sector jobs to other countries through offshoring, and issues of corporate social responsibility had begun to fuel emotional and political reactions to corporate news in the United States and abroad.
Most of these scandals involved deliberately inflating financial results, either by overstating revenues or understating costs, or by diverting company funds into the private pockets of managers. Two of the most prominent examples of fraudulent “earnings management” are Enron’s creation of off-balance-sheet partnerships to hide the company’s deteriorating financial position and to enrich its executives, and WorldCom’s intentional misclassification of as much as $11 billion in expenses as capital investments, perhaps the largest accounting fraud in history.
The Enron scandal came to symbolize the excesses of corporations during the long economic boom of the 1990s.[4] Hailed by Fortune magazine as “America’s Most Innovative Company” for six straight years, from 1996 to 2001, Enron went on to file one of the largest bankruptcies in U.S. history. Its collapse in December 2001 followed the disclosure that it had reported false profits, using accounting methods that failed to follow generally accepted procedures. Both internal and external controls had failed to detect the financial losses disguised as profits for a number of years. At first, Enron’s senior executives, whose activities had brought the company to the brink of ruin, escaped with millions of dollars as they retired or sold their company stock before its price plummeted. Enron employees were not so lucky: many lost their jobs along with the hefty portion of their retirement savings invested in Enron stock. Because the company had been able to hide its losses for nearly five years, the Enron scandal shook investor confidence in American corporate governance around the world.
Outside agencies such as accounting firms, credit rating agencies, and stock market analysts failed to warn the public about Enron’s business losses until they were obvious to all. Internal controls had not functioned either. And Enron’s board of directors, especially its audit committee, apparently did not understand the full extent of the financial activities undertaken by the firm and consequently failed to provide adequate oversight. Some experts believed that the federal government also bore some responsibility. Politicians in both the legislative and executive branches received millions of dollars in campaign donations from Enron during the period when the federal government decided to deregulate the energy industry, removing virtually all government controls. Deregulation was the critical act that made Enron’s rise as a $100 billion company possible.
In June 2002, shortly after the Enron debacle, WorldCom admitted that it had improperly booked $3.85 billion in expenses over five quarters to make the company appear profitable when it had actually lost $1.2 billion during that period.[5] Experts called it one of the biggest accounting frauds ever. In its aftermath, the company was forced to lay off about 17,000 workers, more than 20% of its workforce. Its stock price plummeted from a high of $64.50 in 1999 to 9 cents in late July 2002, when it filed for bankruptcy protection. In March 2004, in a formal filing with the SEC, the company detailed the full extent of its fraudulent accounting. The new statement showed that the fraud actually amounted to $11 billion and was accomplished mainly by artificially reducing expenses to make earnings appear larger. After restructuring its debt and meeting other requirements imposed by a federal court, the company emerged from bankruptcy protection in April 2004 and formally changed its name to MCI Inc.
Even as it emerged from bankruptcy, industry observers anticipated that MCI would need to merge with another telecommunications firm to compete against larger companies that offered a broader range of telecommunications services. The merger materialized less than a year later, in February 2005, when Verizon Communications Inc. announced its acquisition of MCI for about $6.7 billion in cash, stocks, and dividend payments. MCI ceased to exist as an independent company under the terms of the merger, which was completed in 2006.
As Edwards (2003) notes, these scandals raised fundamental questions about the motivations and incentives of executives and about the effectiveness of existing corporate governance practices, not just in the United States but globally. What motivated executives to engage in fraud and earnings mismanagement? Why did boards either condone or fail to recognize and stop managerial misconduct, allowing managers to deceive shareholders and investors? Why did external gatekeepers, for example, auditors, credit rating agencies, and securities analysts, fail to uncover the financial fraud and earnings manipulation and alert investors to potential discrepancies and problems? Why were shareholders themselves, especially large institutional investors, not more vigilant in protecting their interests? What does this say about the motivations and incentives of money managers?[6]
Because of the significance of these questions and their influence on the welfare of the U.S. economy, the government, regulatory authorities, stock exchanges, investors, ordinary citizens, and the press all started to scrutinize the behavior of corporate boards much more carefully than they had before. The result was a wave of structural and procedural reforms aimed at making boards more responsive, more proactive, and more accountable, and at restoring public confidence in our business institutions. The major stock exchanges adopted new standards to strengthen corporate governance requirements for listed companies; then Congress passed the Sarbanes-Oxley Act of 2002, which imposes significant new disclosure and corporate governance requirements for public companies, and also provides for substantially increased liability under the federal securities laws for public companies and their executives and directors; and the SEC adopted a number of significant reforms.
The Financial Crisis of 2008
Just as investor confidence had been (somewhat) restored and the avalanche of regulatory reform that followed the 2001 meltdown had been digested, a new and even more damaging crisis emerged, global in scale and scope. While it has not (yet) been labeled a “corporate governance” crisis, the financial crisis of 2008 once again raises important questions about the efficacy of our economic and financial systems, board oversight, and executive behavior.
Specifically, as the economic news worsened (rising inflation and unemployment, falling house prices, record bank losses, a ballooning federal deficit culminating in a $10 trillion national debt, millions of Americans losing their homes, and a growing number of failures of banks and other financial institutions), CEOs, investors, and creditors walked away with billions of dollars while American taxpayers were asked to pick up the tab (Freddie Mac’s chairman earned $14.5 million in 2007; Fannie Mae’s CEO earned $14.2 million that same year). Ordinary citizens, meanwhile, saw the value of their 401(k) investment plans shrink by 40% or more.
Example 2.10 – Financial Regulation
Since 2002, companies in the United States have had to comply with the Sarbanes-Oxley Act, which requires that executives personally certify their company’s accounts. Executives who misrepresent their company’s finances are subject to criminal penalties. External auditors (firms other than those that maintain the company’s records of account) must certify the validity of the books.
Source: The Balance, Do Regulations Keep Your Money Safer?, Winter 2019
Certainly, the 2008 crisis was complicated, and even ten years later, answers remain elusive. Many believe bankers, central banks, and regulators mishandled the crisis, failing both to foresee the coming disaster and to impose checks and balances on financial institutions. Following the repeal of Glass-Steagall (a rule that had separated commercial and investment banking) in 1999, banks sought ever riskier vehicles for growth. Many made risky mortgage loans to people who could not afford them. The central bank ignored warning signs as the disaster built up. Regulators were understaffed and overwhelmed to the point where they could not process the information available to them.
The causes of the crisis will be debated for some time. Did we rely too much on free markets, or not enough? Did special interests shape public policy? Did greed rule once again? Where were the boards of Bear Stearns, Lehman Brothers, and AIG? Were regulators asleep at the wheel? Incompetent? One thing is for sure: another wave of regulatory reform, this time possibly global in reach, is around the corner. And once again we will be asking those familiar questions: What will be the impact on investor confidence? On corporate behavior? On boards of directors? On society?