Experts vs Trial and Error

Writing in the Wall Street Journal, Nina Teicholz casts doubt on the ‘conventional wisdom’ that saturated fat causes heart disease (thanks to The Pretense of Knowledge for the pointer).

Of course, Gary Taubes laid out much of the same story line in his book, Good Calories, Bad Calories. I mention it here and here.

Teicholz mentions President Eisenhower’s heart attack. She doesn’t mention the additional detail that Taubes provided: Eisenhower’s doctor cut his cholesterol intake, and his cholesterol levels went up.

Teicholz summarizes what was perhaps the beginning of the Type II diabetes and obesity trends, when unreliable health studies were used to guide the American diet:

As Harvard nutrition professor Mark Hegsted said in 1977, after successfully persuading the U.S. Senate to recommend Dr. Keys’s diet for the entire nation, the question wasn’t whether Americans should change their diets, but why not? Important benefits could be expected, he argued. And the risks? “None can be identified,” he said.

This is where I’ve gained much appreciation for what Nassim Taleb identified as the expert problem, as he describes here.


One reason bottom-up does better: Try, try, try again

In this post, I wrote that bottom-up systems tend to do better than top-down systems. Below, I expand on the #1 reason why I think that is.

I used to think feedbacks were the most important difference between productive and unproductive organizations or systems.

I don’t anymore.

I still think feedbacks are important and much can be gained from addressing feedback problems. But, I think something else makes the difference between good and bad: The number of trials.

I credit Nassim Taleb for planting this thought in my head. In one of his books he said something like: capitalism doesn’t work because of profits and losses [feedbacks]; it works because it induces a lot of trials, and some of those happen to work.

I think Jeff Bezos understands this idea about trials. But, I don’t think many others do.

We see success stories after the fact and credit visionary leadership, brilliant insights and clever innovations, but this is too simple of a view of what really happened.

Since Taleb planted this idea, I started noticing other things about success stories.

I noticed that for every success, there were dozens or more failures. Successful people are often brilliant. We conclude that to be the cause of their success, but that doesn’t account for the failures that are often led by equally bright people. Why did one succeed while the others didn’t? That gets tougher to distinguish.

A story Taleb tells about dolphins illustrates this well. We hear about dolphins pushing people who are stranded at sea to shore and conclude that dolphins know what they’re doing and are kind and gentle. Maybe that’s true. But, we don’t hear from the folks the dolphins pushed the other way. Maybe the dolphins that pushed people to shore just got lucky.

I also started noticing that success stories often include information about how the discoveries were not planned, but were stumbled upon while the great leaders were trying to do something else. Sometimes these leaders even resisted going the direction the discovery led them because it didn’t fit their original vision.

Finally, I started seeing the dumb luck that is a part of almost every success.

When I look at success, I try to remember its causes can be elusive and that the easy explanations are usually incomplete.

Capitalism works well for us because it encourages a lot of trials.

I believe organizations can improve (not guarantee) chances of long-term success by employing the same incentives as capitalism. Encourage a lot of trials. Let failures fail and reward success.
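To make the arithmetic behind ‘a lot of trials’ concrete, here is a minimal sketch (my own illustration with made-up numbers, not anything from Taleb): even when each individual trial is a long shot, the chance that at least one succeeds climbs quickly as the number of independent trials grows.

    # Illustrative assumption: each independent trial has the same small
    # chance of success, p. The chance of at least one success in n
    # trials is then 1 - (1 - p)^n.
    p = 0.02  # assumed 2% chance that any single trial works out

    for n in (1, 10, 50, 200):
        at_least_one = 1 - (1 - p) ** n
        print(f"{n:>4} trials -> {at_least_one:.0%} chance of at least one success")

A 2 percent long shot tried once almost always fails; tried 200 times, at least one success is close to a sure thing. That, roughly, is the case for encouraging lots of trials.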

We even have evidence of this respect for trials embedded in common, inspirational phrases like:

  • If at first you don’t succeed, try, try, try again
  • Fail often to succeed sooner
  • If you fall off the horse, get back on

But, perhaps one quality of the successful that I overlooked is that they aren’t afraid to fail.

You miss 100 percent of the shots you never take. – Wayne Gretzky

I didn’t fail the test, I just found 100 ways to do it wrong. – Ben Franklin

I’ve never been afraid to fail. – Michael Jordan

Fragile society

Bad stuff happens. It’s how you respond and adapt to that bad stuff that matters.

Nassim Taleb coined the term ‘antifragile’ in his latest book, Antifragile: Things That Gain from Disorder. It’s a concept worth remembering.

I recognized anti-fragility around the time of Enron. While people were wringing their hands about how something like Enron could happen, I pointed out that corruption, deceit and failure happen all the time in every form of society.

The results are not pretty. But, under capitalism, we were lucky that the damage was contained to a small segment of the economy and didn’t have much impact on society overall. In fact, the economy was resilient enough that it was hardly a blip.

Not only that, but we learned from it.

People learned the important anti-fragile lesson of not putting all your eggs in one basket, as many Enron employees had done by investing all of their 401(k)s in Enron stock.

We also learned to be even more skeptical of things that seem too good to be true.

Those are good lessons in any form of society.

Contrast that with the Soviet Union. When it went down, the whole ship sank.

As we make government more central in our lives, we should recognize that we also make society more fragile, less anti-fragile.

Localism

Something Nassim Taleb said in his EconTalk podcast reminded me of a point Daniel Hannan made in his book, The New Road to Serfdom, which I wrote about here.

Taleb makes the point that local government is more effective than national government and one reason is that at the local level there is some skin in the game in the form of shame. That is, if you take advantage of your neighbors, they’ll frown upon it. But, at the national level, government is more about taking skin out of the game through bail-outs and insurance. Here is Taleb:

And government can be a local, neighborhood union. And then let’s figure it out from the history of countries that have been very successful, like Switzerland or Sweden, places like that. That people making the decisions are usually embedded in a community. And their skin in the game is typically shame. Because they are socialized by the community. Their skin in the game is shame. Whatever government official in Washington can make a mistake, and it’s a spreadsheet looking at him. It’s not someone in church on Sunday looking at him and making him feel shame. And that’s where the main difference is.

So let me go back to the point about government. It’s true, at the local, local level, there are some natural incentives. But at the national level, say in the United States, a lot of what government does is to remove skin in the game–bailouts, insurance policies, do-overs, ad hoc interventions.

Here’s Hannan’s version:

…localism under-girds the notion of responsibility: our responsibility to support ourselves if we can, and our responsibility to those around us–not an abstract category of “the underprivileged,” but the visible neighbors–who, for whatever reason, cannot support themselves. No longer is this obligation discharged when we have paid our taxes. Localism, in short, makes us better citizens.

Update: The title of the post made me think of an inconsistency. The folks who advocate ‘buying local’ rarely seem to advocate ‘governing local’. Perhaps they should.

‘…losses encourage prudence.’

As I mentioned at the end of this post, last week’s EconTalk with Nassim Taleb, Skin in the Game, is worth listening to. He describes some history of how having skin in the game is a simple and effective risk management rule and how removing it causes problems.

In ancient Babylonia, architects who built houses that fell down and killed people could themselves be killed. As Taleb explained, ‘that simple rule outperformed any inspector.’ And, yet, there were still architects there. Apparently good and/or confident ones.

Here is more of what Taleb had to say about the Golden Rule:

And of course we have the Golden Rule that we see in the Old Testament, which is a positive–up till then it was a negative rule: ‘Don’t do unto others what you don’t want them to do to you.’ And then the Golden Rule: ‘Do to others what you want them to do to you’ and so on. Up to then we had a civil rule. What you see behind this is the foundation of moral philosophy, as a foundation of ethics and a foundation of civil society. But in it we saw something much more potent–we saw the foundation of risk management.

I thought this was interesting, too, regarding parenting and letting kids grow up:

The expression in Lebanon, that the first 7 years you play with them (and protect them), the second 7 years you let them get in trouble and the third 7 years you advise them on how they got in trouble.

Like a candle in the wind

[Photo: a candle in a glass. A fragile system (Photo credit: Wikipedia)]

Nassim Taleb leads off his new book with the perfect sentence to demonstrate his concept of antifragility:

Wind extinguishes a candle and energizes fire.

Antifragility is the word Taleb coined to describe resilient, adaptive and complex dynamic systems like the economy and our bodies.

Such systems adapt and get stronger from stressors — to a degree.

Attempts to remove stressors from such systems make those systems more fragile, which causes them to break under the slightest stress — the flame on a candle. Too big to fail.

When we expose our bodies to stress — like the pounding they take from sprinting — they adapt by making our muscles and bones stronger. The result is a body less prone to injury. Laze on the couch, and you lose muscle and bone mass and break under modest stress.

When I read Taleb’s sentence, I immediately thought about innovation efforts at companies and in the economy. Innovation efforts are often made fragile because managers or politicians want to control them from the top, only supporting the ideas they deem fit — e.g., clean energy.

This unnecessarily limits innovation trials to a small group of experiments, each with a very low chance of success (all experiments have a low chance of success). Such innovation efforts are like a lit candle that managers hope will ignite a larger fire, but the wind of chance and non-adaptability blows the candle out before it can adapt and spread. Then the flame has to be continually relit.

Some companies and some areas of the economy get this right. Those innovation efforts resemble a fire — many small flames together that adapt and rage when a wind blows on them. Lots of things are tried. Innovation here is bottom-up. It happens at the front line of the company and at the entrepreneur level of the economy, and there’s a lot of it. The things that suit customers win out. When a wind blows on the fire, it spreads instead of going out.

Why I may throw my vote away: Part II

I wrote about why I may throw my vote away here.  On his blog, Zombiehero posted the video below of Nassim Taleb, author of The Black Swan, on CNBC’s Squawk Box agreeing with me.

In the video, Taleb explains why he supports Ron Paul. The key point comes at the 6:40 mark, when the host asks Taleb what kind of chances he gives Ron Paul. Taleb responds:

I don’t think in terms of chances. I’m supporting him, regardless of the chances. Whether he has 1% or 99%, I’m supporting him, because we have no other solution…it’s my duty as a citizen, as a person who lives here, as a taxpayer who doesn’t want to be hoodwinked….in the long run, by bureaucrats.