Cost Benefit Analyses Suck

Rick Santorum wrote in the Wall Street Journal this past weekend about his Economic Freedom Agenda.  Most of it sounds good.  This part bugged me, though:

I’ll review all regulations, making sure they use sound science and cost benefit analysis.

One reason it bugs me is that Candidate Obama made a similar promise four years ago. The Obama/Biden campaign promised to go through the budget and “keep the stuff that works and stop the stuff that doesn’t.”

Sounds good, but it didn’t happen.  And there’s a good reason it didn’t happen.  That brings us to the second reason Santorum’s comment bugged me: It’s impossible.

Believing it is possible is a true marker of naiveté or willful deceit.

Cost benefit analysis sounds good to a lot of people. They teach you cost benefit analysis in college, after all, no?

But cost benefit analyses have a major weakness — they’re usually wrong.  They’re wrong for a couple of reasons.

First, a cost benefit analysis can be made to say whatever you want it to say, so it is subject to the same political wills as if it didn’t exist. In that case, the cost benefit analysis serves one purpose: to fool those who don’t understand this about cost benefit analyses.

Publicly funded stadium, museum and public transit projects are notorious for such analyses.

Even when a cost benefit analysis is conducted with the best of intentions by supposedly competent folks, it is usually wrong. As I mentioned here, Richard Feynman said:

The first principle is that you must not fool yourself…and you are the easiest person to fool.

He said this because he specialized in finding where experimental physicists had fooled themselves while conducting their experiments.

Unfortunately, the only true cost benefit crucible is the real world — and even that can be misinterpreted, and often is.  Economists still debate whether FDR’s actions in the Great Depression helped or hurt.

The truth is that government programs face incentives different from those in the free market. There are always plenty of reasons to keep a government program around, even when it clearly does not have desirable outcomes (e.g. minimum wage).

But the main reason to keep a program in the free market is that it’s doing its job.  If it’s not doing the job, the market (or “us”) finds ways to use those resources better.

The restaurant where I had my first W2 job went out of business long ago. It was bulldozed and replaced with a convenience store that does brisk business and has for years.

Had it been a government program, the old restaurant would still be open, supported by your always forgiving tax dollars.  It wouldn’t have many customers, but that’s okay. Politicians would claim they were keeping folks employed, voters could feel good about voting for them for that reason, and the cooks in the restaurant would feel great about getting paid to not work that hard.

Ask folks back then if they’d like to keep the restaurant and they might be inclined to say yes, you know, to keep the workers employed.  Ask them if they eat at the restaurant and they would tell the truth: no.

Nobody would be able to see the true opportunity cost of all that. They would never have guessed that had the restaurant been allowed to go out of business, it would have been replaced by a convenience store that produces much more value for the community and more jobs. And higher paying jobs.

Ask folks in the community now if they’d like to put the restaurant back in place of the convenience store and most would laugh at you.

Why an infomercial may be better than a TV doctor

“Do you mind if I ask, what do you base that on?”

Learning to ask this twelve-word question can improve your reasoning ability.

Many people put too much stock in what certain others say because “he’s a medical doctor” or “a scientist” or “an economist” or a “Harvard grad”.

You cannot determine if something is correct simply by establishing who said it.  That’s called the appeal to authority fallacy.  It is easily defeated by considering that other medical doctors, scientists, economists or Harvard grads say something different.  Discussions on such matters usually turn into an unproductive he said/she said.

One of physicist Richard Feynman’s specialties was reviewing the experiments of other people to find the holes in their experimental design and logic.  He once said:

The first principle is that you must not fool yourself…and you are the easiest person to fool.

Even experts can fool themselves.

A family member recently mentioned diet advice that the TV doctor, Dr. Oz, had shared.  It didn’t sound right to me.  I asked if Dr. Oz had provided any reasoning or evidence to support his claim.  Did he explain why that particular diet would work or if anybody had followed that diet and had success?  No.  

So, I asked, Why do you believe him?  

She replied, He’s a medical doctor.

I researched Dr. Oz’s advice on the internet and I couldn’t find any evidence to support it.  I looked for studies and individual stories from folks who claimed success following that advice.  I found one study that did not support his advice.  And I found no individuals claiming to follow the advice with success.

I thought this was a good example to illustrate that just because someone who appears to be an expert says something, it doesn’t mean it’s true.  Dr. Oz may be correct, but I’d like to know what he based that advice on.

I always put more weight on results than on opinions.

Which brings me to the infomercial.  I recently saw a part of an infomercial for a set of exercise DVDs.  The promise was that if you follow these DVDs for 45 minutes a day, six days a week, you too could look like some of the people in the infomercial.

I applaud the creators of the infomercial for showing me actual results.  That’s why an infomercial might be better than a TV doctor.  The infomercial sells you on results.  The TV doctor relies on his credentials.

An Honor

Russ Roberts, economist, blogger and author (among other things), added a new category to Cafe Hayek.  The new category is Dinner Table Economics.  He wrote:

I want to start a new category of posts here called “dinner table economics,” questions involving economics for talking about over the dinner table. I want to tip my hat to Seth’s blog, Our Dinner Table that gave me the idea.

I am honored and humbled that my blog gave him the idea to start this new category.  Naturally I think that’s a great idea and I look forward to reading and learning from his posts.

The first topic of that category: a study linking vaccinations to autism.  One of the key studies showing this link may be corrupt.  Roberts writes:

What we talked about at dinner was whether it was a good idea to vaccinate and how would you know whether vaccination had side effects such as autism. This got us into a discussion of  what an experiment is, how reliable is an experiment, the ideas of causation and correlation, sample size, spurious correlation and so on.

Great topic.  Call something a ‘study’ carried out by ‘experts’ and it gains instant credibility with many people.  News anchors seem to love how that rolls off their tongues.  A new study out today in the Journal of such-and-such says that this causes an X% greater chance of that.

Tell people to be skeptical of studies and do some due diligence before drawing a conclusion and they look at you like you must be thick.  It’s a study.  It was carried out by experts.  It’s peer reviewed.  All good stuff, but none of it means it’s right.  Believing it’s right without looking into it is faith.
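One of those dinner table topics, spurious correlation and sample size, lends itself to a quick demonstration. Below is a minimal sketch of my own (it is not from Roberts’ post; the sample of 10, the 1,000 pairs and the 0.6 cutoff are arbitrary numbers chosen for illustration) showing how often a “strong” correlation shows up between two variables that are nothing but unrelated noise when the sample is small:

    # Illustrative sketch (hypothetical numbers): with small samples and enough
    # comparisons, "strong" correlations appear between completely unrelated variables.
    import numpy as np

    rng = np.random.default_rng(0)

    sample_size = 10    # e.g., a handful of anecdotes
    num_pairs = 1000    # how many unrelated variable pairs we look at

    strong = 0
    for _ in range(num_pairs):
        x = rng.normal(size=sample_size)  # pure noise
        y = rng.normal(size=sample_size)  # pure noise, independent of x
        if abs(np.corrcoef(x, y)[0, 1]) > 0.6:  # a correlation most would call "strong"
            strong += 1

    print(f"{strong} of {num_pairs} unrelated pairs had |r| > 0.6 with n = {sample_size}")
    # Rerun with sample_size = 1000 and the spurious "strong" correlations all but disappear.

Run the same loop with a much larger sample and those coincidences essentially vanish, which is the sense in which sample size protects against being fooled by noise.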

I got early exposure to being skeptical of conclusions drawn from experiments and studies from physicist Richard Feynman’s book, Surely You’re Joking, Mr. Feynman! (Adventures of a Curious Character), which I highly recommend.  One of Feynman’s specialties was poking holes in others’ experiments.  If I recall correctly, even in a discipline like physics, conclusions were often polluted by the experimenters’ biases and mistakes.

Back to the vaccination study.  A local TV news story a few years ago featured a child who developed autism soon after receiving his vaccinations.  I’d find such news stories better if the reporter consulted with folks like Mr. Feynman or Mr. Roberts to provide a more complete picture and remind us that one story does not establish cause and effect.

Even the skeptical me can be swayed by a personal story.  Such stories are powerful.  That’s why politicians love to have mascots (thanks for that one, Sowell) to call on during speeches.  But what we don’t realize is that we are often swayed by the exception (HT: Don Boudreaux of Cafe Hayek), not the rule. And we might be wrong.