We have far too much faith in “science”

Over at Cafe Hayek, Russ Roberts points to an article from Reuters, In cancer science, many ‘discoveries’ don’t hold up.  This is from the article:

 A former researcher at Amgen Inc has found that many basic studies on cancer — a high proportion of them from university labs — are unreliable, with grim consequences for producing new medicines in the future.

During a decade as head of global cancer research at Amgen, C. Glenn Begley identified 53 “landmark” publications — papers in top journals, from reputable labs — for his team to reproduce. Begley sought to double-check the findings before trying to build on them for drug development.

Result: 47 of the 53 could not be replicated. He described his findings in a commentary piece published on Wednesday in the journal Nature.

That’s not science, folks. That’s randomness or noise.

No surprise. Scientists, after all, are humans and like all of us they respond to incentives. They want to get their studies published, they want to get their names out, they want to prove their theories. This satisfies their egos and keeps them employed. They aren’t monks. They’ll report noise if it gets them attention.

This was confirmed later in the article:

“If you can write it up and get it published you’re not even thinking of reproducibility,” said Ken Kaitin, director of the Tufts Center for the Study of Drug Development. “You make an observation and move on. There is no incentive to find out it was wrong.”

Begley’s experience reminds me of what Gary Taubes, author of Good Calories, Bad Calories, discovered as he reviewed landmark diet and health studies from the past century. He found that the “scientific foundation” on which much of the conventional wisdom and government dietary recommendations rest is shaky.

Here’s one of the best stories from the article that demonstrates the deception that we often call science:

“We went through the paper line by line, figure by figure,” said Begley. “I explained that we re-did their experiment 50 times and never got their result. He said they’d done it six times and got this result once, but put it in the paper because it made the best story. It’s very disillusioning.”

That should have been an important fact to mention before publishing the results.

I deal with this often in the business world too. Many folks play fast and loose with the facts. I often hear people support their position by saying, “Research shows that this is the best way” or “this is what customers prefer.”

Mention “research” and most people will accept on blind faith that the research must be correct and correctly interpreted. Much to the chagrin of some of my business partners, I don’t.

I ask, “Would you mind if I took a look at that research?”

So far, only about 10% of the folks I’ve asked have actually shown me the research they referred to. The other 90% backed off their point quickly. And in every case where someone did show me the research, I was able to point out issues with the method or interpretation that dramatically lowered confidence in their conclusions.

I call that “pushing on the putty.” Backing your conclusion with flimsy research is like building a wall with putty. Push on it just a little bit and the wall crumbles.


Moneyball

I watched the movie Moneyball this week and enjoyed it. I’ve heard of the book and avoided reading it due to my own biases.

I’m a skeptic of statistical analysis, which often (even in the movie) gets confused with science.  My statistics-loving friends raved about how the book showed just how valid and effective the use of statistics is, even in a sport.

I see this confusion of statistics with science in everything from how we run our schools to climatology, economics, fitness and diet, and how we run our businesses and other organizations.  I’ve observed enough attempts at “scientific management” in my career to know that using it does not guarantee success, and sometimes it makes things worse (it certainly didn’t help in the housing crisis).

But, based on what I saw in the movie, this isn’t quite the triumph of baseball science that many people believe it is.  I realize movies simplify the story, but what I saw in the movie is more in line with what I have seen be effective in real life:  focusing on meaningful facts over biases.

It wasn’t the use of statistics that improved the performance of the team.  Rather, it was the use of meaningful performance and output measures to overcome deep-seated biases in the coaching and recruiting staffs.

This is demonstrated in one scene of the movie when the team’s talent scouts are discussing potential new players to add to the team: “This is a good-looking kid.  He has a nice swing.  Other teams like him.  I like this kid.”

I’ve seen this in real life all too often.  I’ve seen business programs die and good talent passed over for promotion because of similar biases.  Usually the bias is as simple as: That’s not what I’d do or That person doesn’t do the things the way I do them.  Very rarely is the true performance of the project or person even discussed.

The key insight of the statistician in Moneyball wasn’t the use of statistics, or sabermetrics, per se, but in using meaningful output measures to trump the biases. Baseball fans want wins. Wins come from scoring and scoring comes from getting on base. Good defense is a must, but not quite as important as scoring. So, instead of worrying about whether this was a good-looking kid, they’d worry more about whether he could get on base.  Facts trumped biases.

Thinking out loud

If we were microscopic and inside a live brain, we’d see a large network of neurons.  From that vantage point, it would be tough for us to understand how what was going on in that network related to the outside world.

Biological neuron schema: “This thing helped me balance the checkbook today” (Image via Wikipedia)

I wonder if the universe we see from our tiny speck is similar.  Not that we’re inside a big brain, but that the universe isn’t really what it seems to be from where we sit.

(Sorry, too many “How the Universe Works” shows over the holidays).

Let’s go

There’s a new Earth.

It’s only 600 light years away.

Considering our galaxy is about 100,000 light years in diameter, that’s pretty darn close.

But, still large by our technological standards.


If this planet happens to have spawned or attracted intelligent life that uses radio frequencies, as we do, they might start picking up some of our broadcast signals in about 550 years.

The good news is, if that planet’s intelligent life started using radio frequencies anywhere near 600 years ago, we might start to receive some of their signals soon.

If we decided to send a probe, with current technology we might be able to get there in about 10 million years (if my calculations are right).
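That estimate is easy to sanity-check. A minimal back-of-the-envelope sketch in Python, assuming a Voyager 1-class cruise speed of roughly 17 km/s as the stand-in for “current technology” (the 600 light-year distance is from above; the speed is my assumption):

```python
# Rough check of the ~10 million year probe estimate:
# time = distance / speed, for a 600 light-year trip at ~17 km/s.

LIGHT_YEAR_KM = 9.461e12    # kilometers in one light year
PROBE_SPEED_KM_S = 17.0     # roughly Voyager 1's speed leaving the solar system
SECONDS_PER_YEAR = 3.156e7  # seconds in one year

def probe_travel_years(distance_ly, speed_km_s=PROBE_SPEED_KM_S):
    """Years for a probe at a constant speed to cover a distance in light years."""
    distance_km = distance_ly * LIGHT_YEAR_KM
    return distance_km / speed_km_s / SECONDS_PER_YEAR

print(round(probe_travel_years(600) / 1e6, 1), "million years")  # prints: 10.6 million years
```

So the 10-million-year figure holds up, give or take a bit depending on the speed you assume.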

Better start brushing up on how to tesseract.

Good news and bad news

Good news…

A super-Earth has been found.  It’s about 3.6 times bigger than Earth and only 35 light years away.

Bad news…

No reason for real estate markets to react.  It’s 35 light years away (that’s small on a universal scale, but a very long way at the speeds we can travel).

More bad news…

If it supports intelligent life that learned to communicate with radio, as we have, more than about 35 years ago, we’d probably be hearing something from them by now.  I think.
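The timing logic boils down to one rule: radio travels one light year per year, so a broadcast from a planet d light years away, first sent t years ago, has reached us only if t is at least d. A minimal sketch (the 35 light-year distance is from above; the sample broadcast ages are illustrative):

```python
# A broadcast from `distance_ly` light years away, sent `years_ago`,
# has arrived only if it has had time to cover the distance.

def signal_arrived(distance_ly, years_ago):
    """True if a radio broadcast sent years_ago from distance_ly away has reached us."""
    return years_ago >= distance_ly

# For the 35 light-year super-Earth:
print(signal_arrived(35, 30))  # False: a 30-year-old broadcast is still in transit
print(signal_arrived(35, 40))  # True: a 40-year-old broadcast arrived 5 years ago
```

Which is why silence only rules out civilizations that started broadcasting at least 35 years ago; anyone who flipped on their transmitters more recently is still out of earshot.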