Manage to the rules (only) and you’ll tiptoe right up to the hot red line
The Wall Street Journal recently reported on the splash made by BoE Director of Financial Stability Andrew Haldane at the Federal Reserve’s annual policy conference in Jackson Hole, Wyoming. Haldane argued that regulations become less effective as they become more complex, and likened regulation to a dog catching a Frisbee: despite the complexity of the physics involved, an average dog can master the catch because it keeps things simple:
The answer, as in many other areas of complex decision-making, is simple. Or rather, it is to keep it simple. For studies have shown that the Frisbee-catching dog follows the simplest of rules of thumb: run at a speed so that the angle of gaze to the Frisbee remains roughly constant. Humans follow an identical rule of thumb. Catching a crisis, like catching a Frisbee, is difficult. Doing so requires the regulator to weigh a complex array of financial and psychological factors, among them innovation and risk appetite. Were an economist to write down crisis-catching as an optimal control problem, they would probably have to ask a physicist for help. Yet despite this complexity, efforts to catch the crisis Frisbee have continued to escalate. Casual empiricism reveals an ever-growing number of regulators, some with a Doctorate in physics. Ever-larger litters have not, however, obviously improved watchdogs’ Frisbee-catching abilities. No regulator had the foresight to predict the financial crisis, although some have since exhibited supernatural powers of hindsight. So what is the secret of the watchdogs’ failure? The answer is simple. Or rather, it is complexity. For what this paper explores is why the type of complex regulation developed over recent decades might not just be costly and cumbersome but sub-optimal for crisis control. In financial regulation, less may be more.
Haldane warned that “fundamental limitations of the human mind” thwart increasingly complex (and sometimes frivolous) attempts at regulation. Most of these limitations involve the limits of data, of modelling, or of the nature of knowledge itself:
The belief that such limits can be modelled away is relatively new, and not helpful. As the authors note, “Many of the dominant figures in 20th century economics—from Keynes to Hayek, from Simon to Friedman—placed imperfections in information and knowledge centre-stage. Uncertainty was for them the normal state of decision-making affairs.”
A deadly flaw in financial regulation is the assumption that a few years or even a few decades of market data can allow models to accurately predict worst-case scenarios. The authors suggest that hundreds or even a thousand years of data might be needed before we could trust the Basel machinery.
Despite its failures, that machinery becomes larger and larger. As Messrs. Haldane and Madouros note, “Einstein wrote that: ‘The problems that exist in the world today cannot be solved by the level of thinking that created them.’ Yet the regulatory response to the crisis has largely been based on the level of thinking that created it. The Tower of Basel, like its near-namesake the Tower of Babel, continues to rise.”
We once made the same point in the context of what makes great boards great: boards that overemphasize the process of good governance, including measures implemented in the wake of previous meltdowns, often fail to foresee the next crisis:
Presumably, those companies and regulatory bodies have boards comprised of accomplished and highly intelligent members with personal wealth at stake. [They were] paying attention and paying consultants; [they had] ethics codes, audit and compensation committees, Independent Directors, regular meetings, well constructed board packages… It’s conceivable that a board member here or there could be corrupt or asleep – but entire boards? Across multiple companies and regulatory agencies? Unlikely. It’s more likely that they were following the current and best practices for strong and effective board oversight. So, if *all* boards have similar formal systems in place, something else must be at work.
A strong board relies on the ‘robust social systems’ among its members – the informal ways in which they trust and challenge each other – to look beyond formal legal and fiduciary responsibilities and proactively assess the shifting regulatory risk environment.
The end of every boom-bust cycle includes a fin de siècle scandal: insider trading punctuated the ’87 crash, accounting irregularities (Enron, WorldCom) helped pop the tech bubble of the ’90s, and “rolling the dice” at Fannie & Freddie inflated the housing market with disastrous consequences. Each scandal led to an avalanche of new regulations atop the snowpack, which never entirely melts away and – more importantly – doesn’t prevent the next crisis.
Haldane summed it up nicely: “complex and detailed rules lead regulators and financial institutions alike to manage to the rules, tiptoeing right up to the hot red line at which a crisis can be triggered.”