Corporate Social Responsibility has become a hot topic. You can hardly talk about corporate governance these days without touching on it. In some areas, you might as well assume that CSR is in the background of every conversation about governance.
Being a history buff, I began to notice that a lot of “what corporations should be doing” sounded a little like reinventing the wheel. Corporations have, in fact, tried many things in the two centuries since they rose to prominence. Not all of them have been run by greedy, rapacious bastards. Many, in fact, have been run by far-sighted, generous spirits intent on doing good while doing well. I thought it might be worthwhile to review some of the boldest experiments in business history to see if we could learn any lessons from them.
Here are some early conclusions I have come up with.
The last time that CEOs were routinely kept awake at night was during the merger wave of the mid-1960s.
The frothy ’50s turned out to be high tide for American industrial dominance, a time when we were rebuilding the world after a devastating war. CEOs had it pretty cozy then. As the tide began to recede, investors began to notice the accumulated waste made possible by a decade of easy growth. A few of them saw advantage in taking over the worst governed companies in order to restructure them. At that time, they could do so without warning, which is what made this environment so frightening to CEOs. Imagine never knowing when you might get a phone call telling you that you are out. This is like going trick-or-treating and fearing the “tricks” all year long.
Corporate executives of that period had grown up in a world where being a leader meant getting along with everybody, and knowing how to use the corporate treasury to buy allegiances, including labor, business partners, and politicians. These new people on the scene—called “raiders”—were after the whole treasury, in part to prevent it from being used as the CEO’s relationship kitty. The governance mechanisms of the day gave them access to it by simply taking advantage of the stock being cheap after years of neglect.
Worried incumbent CEOs reacted by contacting their congressmen, who also knew a thing or two about incumbency. This unholy partnership took control of the narrative. Instead of investors identifying bloated companies in order to restructure them and return excess funds to remaining shareholders, the incumbents claimed that:
“In recent years we have seen proud old companies reduced to corporate shells after white-collar pirates have seized control with funds from sources which are unknown in many cases, then sold or traded away the best assets, later to split up most of the loot among themselves.” (Sen. Harrison Williams, 1965)
The media bought it. The legend of the “corporate raider” was born. The off-hand mention of “unknown” funding sources added a hint of nefariousness. (Who did they think provided the funds? Why did it matter?) The media didn’t consider that any mechanism that made the sum of the parts worth so much more than the whole might actually be socially useful. Instead, they played on the conservative discomfort of seeing old-line industrial firms disappearing at the hands of destabilizing (and generally non-WASP) upstarts, and the liberal discomfort of “money men” involved in unregulated financial activities.
Thus, in the fall of 1968, Congress passed the Williams Act. This law prevented investors from making a tender offer for shares without giving incumbent boards and management a chance to “present their case” for continued control of the company—as if they hadn’t already had years to make their case.
Now comes the weird part.
"More lucky than good"
Consider that you are on the board of a company of adventurers, and one of your captains, Chris Columbus, comes to you with a project.
“The learned people all think that the world is flat. Our R&D indicates that the world is round. If you fund our voyage, we will be able to obtain the wealth of the Indies spice trade via a shorter route from the west instead of from the east.”
One director chimes in, “But Chris, all the experts say that the world is flat, and that the western route is terra incognita. Why do you think you won’t simply fall off the edge of the world?”
Chris replies: “The experts are wrong. Here is my data. Fund me, and glory will be ours.”
According to the story we were taught in school, Columbus was not able to get money from the usual channels that might fund a sea voyage, and it was the bold bet by Queen Isabella that launched him toward America and legend. In this story, Isabella was the perspicacious investor behind a brilliant, if misunderstood, CEO. Her good bet paid off in the bounty of a New World, and all those investors who wouldn’t back Columbus were fools who lost out.
What really happened looked like this.
Chris Columbus tells the board: “The learned people say that the world is a very large sphere, about 20,000 to 30,000 miles around. Our R&D indicates that the world is only about a third of that size around, meaning we can more easily obtain the wealth of the Indies spice trade via a shorter route from the west instead of from the east.”
One director chimes in, “But Chris, if the experts are just a little more correct than you are, you will simply run out of supplies on your voyage, and die.”
Chris replies: “The experts are wrong. Here is my data. Fund me, and glory will be ours.”
The real Columbus was not able to convince anyone with any learning that a western route to the Indies was shorter than the eastern route that was already established. He was able to convince a Queen with more wealth than knowledge. She enabled him to set out on what, by all rights, would have been a voyage to oblivion. Just as his supplies were running out, he stumbled upon the New World.
Boards of directors have basically one job—to make good bets and avoid bad ones with their shareholders’ funds. This is a difficult job under the best of circumstances, which necessarily include incomplete information and a limited range of capabilities, along with the normal biases and dynamics of even well-functioning groups. On top of that, they must deal with the bane of businessmen everywhere—luck.
In the mythical Columbus story, the adventurer and the Queen made a good bet, and won. In the real story, they made a bad bet and won. The problem for directors is that the world only sees results; it cannot see the quality of the bets that led to them. Columbus’s bad bet was redeemed by an accident of incredible fortune. The Queen appeared to disprove the adage about a fool and her money, and the investors that wisely turned down Columbus’s bet were ridiculed for a missed opportunity.
Obviously, each of us would prefer a board that makes winning bets rather than losing ones on our behalf. But no one can dictate the outcomes of our bets. The only thing we control is the quality of our bets. By definition, good bets are more likely to pay out than bad ones. A board that reliably distinguishes good bets from bad ones should be more effective. But it can still lose. The world isn’t fair.
I have seen well-meaning boards make reasonable bets and end up lambasted on the front page of the Wall Street Journal for looking foolish. I have seen boards that were in over their heads nonetheless feted when the wind found their backs. The business world looks very different to those of us who are on the inside versus those reading the stories. We know that the world is not fair, and deal accordingly.
My Columbus Day message is this: We should always strive to be good, and hope we are also lucky. In life, if not always in business, we are generally blessed with many chances to succeed or fail. Over time, good luck and bad will tend to even out, and the quality of our decisions should show through. Even then, though, luck has a say.
The birth of a new British heir once again causes us governance geeks to scratch our heads at the succession mechanism formally known as primogeniture, the winner-take-all system whereby the first-born (generally male, but not always) becomes heir to substantially all of the parents’ titles and property. In the context of a monarchy, has anyone ever believed that such a mechanism would consistently yield good leaders?
The answer, of course, is “No,” but the question assumes the wrong purpose. In fact, primogeniture did not evolve as a way to select a certain quality of leader; it evolved as a way to enable society to accumulate capital.
For most of history, it was extremely difficult to preserve and grow capital from one generation to the next. Before the 19th Century, the lives of ordinary people–how they labored and what they had in their homes–were virtually indistinguishable from those of their grandparents. Things were hardly better among the aristocracy. For them, accumulated property was basically an invitation to plunder. Consequently, from the Fall of Rome to the Industrial Revolution, the vast majority of capital created by the upper classes was in the form of weaponry, and most of that was consumed in battle. It was in this neo-Hobbesian war of all against all that primogeniture evolved as a way to select kings.
The customary transfer of allegiance of powerful nobles from their king to a royal heir greatly reduced the odds of a civil war. Societies that tended to avoid civil war tended to accumulate far more capital. More capital made them more powerful, economically and militarily, creating a dynamic that eventually led to the institution of monarchical succession via primogeniture spreading throughout most of the world.
As we celebrate the birth of our country this week, I think it’s worth reflecting on the United States as history’s most daring experiment in governance.
Most of us were taught the Constitution in middle or high school as a series of clauses defining the various workings of our federal government. Some concepts such as “checks and balances” managed to penetrate our pubescent fog because the idea of constraints on authority is innately appealing to young people. But few of us were left with a sense of how bold an innovation our Constitution was at the time of its adoption, or how fragile was the republic that it created. Understanding those things greatly enhances one’s appreciation of the American civilization that would emerge from that experiment.
Numbers don’t tell the whole story. In fact, they don’t tell any story, which is why people are generally less interested in numbers than they are in the narratives that form around them. Steve Jobs’s life is full of amazing stories about form+function, mechanical beauty, and inspiration. An alien could add up the deluge of accolades in the 24 hours since his passing and infer, with good reason, that Steve Jobs was important to humanity.
But someone could also reach that conclusion by crunching a few numbers so far invisible among the accolades. Those numbers net out to $364 billion. That would be the increase in market capitalization of Apple over the times Jobs led that firm, plus the market values of Pixar and NeXT at the times he sold them, net of the relatively negligible amounts he put into them. In a narrative form, we would refer to those as the market value added of Steve Jobs.
To be sure, Jobs didn’t create that value all by himself. He had a lot of co-investors, work colleagues, and not a few Chinese factory workers helping him. Still, his vision, creativity, and energy put it all together, creating those immense opportunities for all those he led, as well as for the millions of consumers who looked at his company’s creations and went “Wow.”
Even net of the contribution of all his partners in enterprise, at an estimated net worth of $7 billion, Jobs probably captured only a tiny fraction of the value he created for society. I mention that not to suggest he was under-compensated, but to suggest that the same is probably true for innumerable other entrepreneurs with far less fame than Jobs.
But Jobs has one unique distinction among all entrepreneurs, or, for that matter, among any builder, mogul, or monarch: The enterprises under his leadership created more value than any others in history.
The fact that he did this in 56 years is today’s tragedy; what has the future lost by the random loss of an extra 20 years of his extraordinary life–a period when many business leaders are hitting their stride?
Reading about governance in the first villages of New England, I come across lessons that keep getting repeated, down to our time. In 1641 the English Civil War triggered an economic crisis across the ocean. It threatened to disrupt relationships and supplies from the mother country upon which the colony depended, causing a number of the colonists to either sail back to England or move south where they could create a better subsistence for themselves.
The resulting turmoil caused a sharp drop in the price of land and commodities. At the same time, with labor getting scarce, workmen were able to ask for much higher wages. Relatively larger landholders found themselves in an economic vise. So, the town fathers, made up principally of these larger landholders, decided to pass wage regulations, limiting how much workmen could charge for their labor. Here’s a sample of those rules:
Every cart, with four oxen, and a man, for a day’s work 5s.
All carpenters, bricklayers, thatchers 21d./day
All common laborers 18d./day
All sawyers, for sawing up boards 3s./4d. per 100
All sawyers for slit work 4s./8d. per 100
The grandees who argued in favor of this price list no doubt justified it by arguing (a) it was for the public good, (b) no worker should profit from economic turmoil, (c) no one was worth 10 s. per day, (d) it’s good to spread the pain, and (e) given that these limits would be imposed on relatively poor people living at the edge of civilization between an inhospitable wilderness and a gaping ocean, “where else could they go?”
For you economics majors out there, what was the predictable outcome of these wage controls?
The IMF is pushing for a bank tax:
[T]o pay for the costs of winding down troubled financial institutions, the IMF proposed what it called a “Financial Stability Contribution”—a tax on balance sheets, including “possibly” off-balance-sheet items, but excluding capital and insured liabilities. That tax would seek to raise between about 2% and 4% of GDP over time—roughly $1 trillion to $2 trillion if all G-20 countries adopted the tax.
On top of that, the IMF proposed that nations adopt what it called a Financial Activities Tax, levied on the sum of profits and compensation of financial institutions. That would be paid to a nation’s treasury to help finance the broader costs of a financial crisis…
The IMF said that a nation didn’t need to put in place a specific resolution authority. Instead, the tax money could go to general revenues and be used in case of financial crisis. But the IMF warned that the money would be spent by the time a problem arose.
OK, so let’s see how this would work. Congress levies massive new taxes on every major bank. Congress would then spend that money on…stuff. A financial crisis hits, and certain TBTF banks get into trouble. Congress bails them out, having to borrow gobs of money to do so because the tax revenues that were nominally for “Financial Stability” were in fact spent on…stuff.
So, how is this different from what happened last time? Hard to see. Does it do anything to reduce the systemic risks that regulators insist were at the root of the last crisis? No. Does it strengthen the banks to make them better able to weather such a crisis? Not likely when so much of their capital–between 2% and 4% of GDP–is being sucked out of their coffers. At least if the money were being held in a trust fund instead of dumped into general revenues, it would be there for frenzied politicians to disburse based on the rational workings of the government. But, of course, the money will not be there. It will have been spent not to support the financial system, but to support the reelection of incumbent politicians–the most short-term actors on the planet.
Oh. Yeah. THAT would be the difference.
So the lesson from all this appears to be: When it comes to justifying a tax increase, any excuse will do.
The top marginal tax rate in post-war America on income over $400K was so high that anyone making large but lumpy income would have a strong incentive to ensure that the lumps were spread out across tax years:
The 1950s was the era of the 90 percent top marginal tax rate, and by the end of that decade live gate receipts for top championship fights were supplemented by the proceeds from closed circuit telecasts to movie theaters. A second fight in one tax year would yield very little additional income, hardly worth the risk of losing the title. And so, the three fights between Floyd Patterson and Ingemar Johansson stretched over three years (1959-1961); the two between Patterson and Sonny Liston over two years (1962-1963), as was also true for the two bouts between Liston and Cassius Clay (Muhammad Ali) (1964-1965). Then, the Tax Reform Act of 1964 cut the top marginal tax rate to 70 percent effective in 1965. The result: two heavyweight title fights in 1965, and five in 1966. You can look it up.
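The incentive described above is simple bracket arithmetic. Here is a hedged sketch; the $1 million purse sizes and the single-bracket simplification are my own illustrative assumptions, not figures from the passage:

```python
# Hypothetical illustration: with a 90% top marginal rate on income above
# $400K, a champion who has already banked one big purse this year keeps
# almost nothing from a second one. Lower brackets are ignored for clarity.

THRESHOLD = 400_000   # where the top bracket began (simplified assumption)
TOP_RATE_PCT = 90     # 1950s-era top marginal rate, as an integer percent

def marginal_keep(income_so_far, purse):
    """After-tax value of one additional purse, taxing only the portion
    that falls above THRESHOLD at the top rate."""
    taxed_at_top = max(0, income_so_far + purse - max(THRESHOLD, income_so_far))
    return purse - taxed_at_top * TOP_RATE_PCT // 100

# First $1,000,000 purse of the year: only $600,000 sits in the top bracket.
first = marginal_keep(0, 1_000_000)           # keeps $460,000
# A second $1,000,000 purse the same year is entirely in the top bracket.
second = marginal_keep(1_000_000, 1_000_000)  # keeps only $100,000
```

On these assumed numbers, the second fight is worth less than a quarter of the first after tax—hardly worth the risk of losing the title, exactly as the boxers behaved.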
Of course, tax-driven behavior continues to create unintended consequences. In a lecture I gave today in Switzerland, I pointed out how the U.S. government’s elimination of the tax deductibility of salaries over $1 million created a growing shift in the mix of executive pay from salary toward bonuses and equity. The mix went from about 70/30 (salary versus bonuses/equity) before the tax law to about 10/90. This change in the mix of pay contributed significantly to the huge growth in total CEO pay we saw in the ensuing ten years. And that is how American tax policy intended to reduce CEO pay actually led to its increase.
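The mix-shift mechanism can be sketched in a few lines. All dollar figures and the equity appreciation multiple below are hypothetical assumptions of mine, not data from the lecture; the point is only that the same grant-date package pays out very differently once most of it rides a rising stock market:

```python
# Hedged sketch: shift a fixed grant-date package from mostly salary to
# mostly equity, then let the equity portion appreciate in a bull market.

def total_realized_pay(package, salary_share, equity_multiple):
    """Realized pay when the non-salary portion of a grant-date package
    is delivered as equity that appreciates by equity_multiple."""
    salary = package * salary_share
    equity = package * (1 - salary_share) * equity_multiple
    return salary + equity

# Same hypothetical $2M package, same market tripling equity values.
before = total_realized_pay(2_000_000, 0.70, 3.0)  # ~70/30 mix: $3.2M
after = total_realized_pay(2_000_000, 0.10, 3.0)   # ~10/90 mix: $5.6M
```

Under these assumptions, the post-law mix pays out 75% more than the old one in the same market—illustrating how a rule meant to restrain pay ended up amplifying it.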
Something about other people’s high pay just drives congressmen a little nuts.
Hat tip: Marginal Revolution
California politicians and the UAW are loudly berating Toyota for its decision to close its NUMMI plant in Fremont, California. Most of the media is piling on, with the liberal commentariat pronouncing “treachery” and “ingratitude” for all that California customers and workers have given to Toyota over the years, as if there were something other than a commercial exchange between them. The simple reality is that Toyota is making a business decision. Its Fremont plant is not making money anymore, especially after GM pulled out of its part of it. The commentariat insists it’s never that simple, and is spinning all manner of anti-business narratives out of this decision.
OK, let’s go with “it’s not that simple.” Only, my narrative won’t assume that Toyota is just there paying workers and suppliers and tax authorities, as if the plant’s existence were a given. My narrative will begin at the beginning, before Toyota moved into Fremont.