Ed Note: Bill sent the following note under the subject line ‘Uh oh.’
‘I’m on the Trans Canada railroad for the next four days. Alas, they tell me that the Wi-Fi isn’t working. And the 3G or 4G only works sometimes. So…no Diary unless I can figure something out.’
Today’s essay is taken from his latest book, Hormegeddon: Why Too Much of a Good Thing Leads to Disaster.
Prepare for hormegeddon
This book has a modest ambition: to catch a faint glimmer of truth, perhaps out of the corner of our eye.
It is a phenomenon I call ‘hormegeddon.’
German pharmacologist Hugo Schulz first described its scientific antecedent in 1888. He put small doses of lethal poison onto yeast and found that it actually stimulated growth. Various researchers and biochemical tinkerers experimented with it in the years that followed and reached similar conclusions.
Finally, in 1943, two scientists published a journal article about this phenomenon and gave it a name: ‘hormesis.’ It is what happens when a small dose of something produces a favourable result, but a larger dose produces a disaster.
Giving credit where it is due, Nassim Taleb suggested applying the term beyond pharmacology in his 2012 book, Antifragile.
Disasters come in many forms. Epidemic disease is a disaster. A fire can be a disaster. A hurricane, an earthquake, a tornado. All these natural phenomena are the disastrous versions of normal, healthy environmental processes.
But this book is about another kind of natural disaster: the public-policy disaster.
Generally speaking, public-policy disasters are what you get when you apply rational, small-scale problem-solving logic to an inappropriately broad situation.
First, you get a declining rate of return on your investment (of time or resources). Then, if you keep going — and you always keep going — you get a disaster.
The problem is that these disasters cannot be stopped by well-informed, smart people with good intentions, because those are exactly the people who cause them in the first place.
Beyond our control
Hormegeddon is my shorthand way of describing what happens when you have too much of a good thing in a public-policy context.
Economists describe the ‘too much of a good thing’ phenomenon as ‘declining marginal utility.’
The idea is well known and understood: you invest money. The first money you invest produces a good return. Then, the rate of return goes down…eventually to zero.
When your return falls below that of a ‘risk-free’ Treasury bond, for instance, you’re no longer earning anything for the risk you take; you’re losing money.
If you keep investing at this point, your losses will increase. What was just a bad investment becomes a disastrous investment.
Economics has no special term for this stage — where marginal returns sink below zero, and you begin to get negative returns that, eventually, lead to hormegeddon.
Despite its prevalence in this world, hormegeddon trudges on in anonymity, ignored by just about everyone on the planet.
The reason is simple: Our intellectual traditions give us no purchase on it. Western thought is largely dominated by rational problem solvers.
They presume that individual human beings can consciously determine where they want to go and how to get there.
I will pass over the fact that not a single human being on the planet actually got where he is by rational thought alone. Instead, we are all products of forces we can barely begin to fathom, let alone control.
Errors of a special sort
Few people can stomach the idea that public life is out of the conscious control of the authorities in whom they have placed so much faith.
They lack what Nietzsche referred to as an amor fati…a faith in, and an affection for, fate.
People don’t like fate. Fate is the bad stuff that happens when no one is in charge, when chaos reigns.
Instead, they believe in the ability of right-thinking experts to ‘do something’ to bring about a better outcome than fate had in store for them.
They want a leader who will slay their enemies and bring the home team to victory. They want officials to deliver full employment, someone else’s money, the America’s Cup, and free beer on tap 24/7. They want someone in the driver’s seat who will take them where they want to go.
But where do they want to go?
They don’t know. And history is largely a record of fender benders, sideswipes and pileups on the way there — a place, it turns out, they really shouldn’t have been going in the first place.
History ignores the trillions of very good decisions made by private citizens in their private lives. We don’t see the calculation of the boatmen, bringing their barks to shore just before the tide turns.
We hardly notice the bowman, who sends his arrow to a spot just a few feet in front of a racing rabbit. Nor does history spend much time on the brakeman, who carefully brings the 11:07am from New York to a halt directly in front of travellers standing on the platform at Pennsylvania Station in Baltimore.
But the competence of the brakeman, the boatman, and the bowman makes us overconfident. If we can bring a train to rest at exactly the right spot, why not an economy? If we can impose our will, by force, on a rabbit, why not on Alabama? If we can drive a car, why not a whole society?
It seems reasonable enough. And it agrees with our core intellectual bias — well established since the time of Aristotle and re-established during the Renaissance — that we are able to see, understand, and direct our future.
But if that were true, history would be a lot less colourful than it is. What actually happens is that people take on big projects. And fail miserably.
For Markets and Money, Australia