# Intuition and logic

Yesterday, Bryan Caplan of the EconLog blog wrote about an excerpt from Daniel Kahneman’s book, Thinking, Fast and Slow.

Kahneman suggests that when we confront a difficult-to-answer question, we substitute an easier question and go with the substitute answer, whether or not it applies to the original question.

For example, you may be asked, How happy are you with your life these days?  You may substitute the easier question, What is my mood right now?  Then you go with the answer to that substitute question, even if it isn’t a very good answer to the original question.

Bill Frezza, writing on Forbes.com, covers Kahneman’s work in additional detail:

Kahneman’s Nobel Prize-winning research ranges across a lifetime of psychological experiments, all of which point to the inescapable conclusion that we have two systems of thought that are at best loosely connected. Bridging that gap is the key to keeping democracy from destroying itself. Let me explain.

The first bit of mental machinery, which Kahneman blandly calls System One, works 24/7 to keep us out of trouble, while alerting us to fleeting opportunities. Appropriate for a species that is both predator and prey, System One lives in a world of snap judgments, extensible metaphors, ill-informed biases, and loosely constructed rules of thumb. We sometimes call this decision making apparatus intuition. Man’s intuition is sophisticated enough that it has helped us thrive across a variety of ever changing environments.

Despite its utility, System One is often wrong, especially if numbers are involved. For a trivial example, answer quickly: If the sum of the cost of a ball and bat is \$1.10 and the bat cost a dollar more than the ball, what does the ball cost? Your System One answer (most likely wrong) is good enough to avoid mistaking a hungry lion for a tasty chicken. But it’s not good enough to build an airplane or design an effective income tax code. (The answer is a nickel, not a dime.)
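The bat-and-ball arithmetic is worth making explicit: if the ball costs x, the bat costs x + 1.00, so x + (x + 1.00) = 1.10, which gives x = 0.05. A minimal sketch in Python (the variable names are mine, just to contrast the snap answer with the solved one):

```python
# Bat-and-ball problem: ball + bat = 1.10, and bat = ball + 1.00.
# Substituting: ball + (ball + 1.00) = 1.10  ->  2 * ball = 0.10.

intuitive_ball = 0.10               # the typical snap System One answer
correct_ball = (1.10 - 1.00) / 2    # solve 2 * ball + 1.00 = 1.10

# Check: the correct answer actually satisfies both conditions.
bat = correct_ball + 1.00
assert abs((correct_ball + bat) - 1.10) < 1e-9

# The intuitive answer does not: a 0.10 ball implies a 1.10 bat,
# for a total of 1.20, not 1.10.
print(f"correct ball price: ${correct_ball:.2f}")
```

The intuitive answer fails the very constraint the question states, which is exactly the kind of error System One never stops to check.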

System Two is associated with enumeration, computation, objective analysis, and complex chains of logic. It is our rational brain. Kahneman’s work shows that even scientists like himself use System Two very sparingly, calling on it only when System One asks for help. In addition, in order to function at the highest levels, System Two requires training, discipline, concentration, skeptical and impartial evaluation of purported facts, and the ruthless elimination of contradictions.

…when it comes to electing increasingly all-powerful political leaders whose policies can unravel entire industries at a whim, it’s all System One all the time. Our political discourse has no room for System Two thinking. In fact, the opposite is true. Every ounce of campaign energy and the lion’s share of media attention are devoted to manipulating our snap judgments, extensible metaphors, ill-informed biases, and loosely constructed rules of thumb.

It sounds a bit like the familiar right-brain/left-brain distinction: intuition and logic.  But it appears well articulated and helpful.

Allowing our System One and System Two selves to get to know each other better may help improve the weaknesses in our thinking.

Maybe we’ll better learn to say things like, “well, my intuition is telling me…, but I need to think about it some more,” instead of adamantly going with our intuitive answer and not giving it another thought.

Kahneman may also help answer a great question posed by a Freakonomics podcast listener last week:

Why do people feel compelled to answer questions that they do not know the answer to?

It could be System One kicking in.  Intuitively, we feel that giving some answer is better than giving no answer, even though that’s not true.

I base that on my own experience in giving presentations.  Early in my presentation career, I tried to answer every question.  I believe that was my instinctive, fight-or-flight response based on System One.

Somewhere along the way, my System Two found a more effective approach.  When asked a question, instead of rushing into the answer, I would pause and repeat the question to make sure I understood it.   This also helped others in the audience hear the question.

At the same time, I was thinking about whether I knew enough to answer the question.  If I didn’t, I said something like, “That’s a good question.  I haven’t been asked that before and I’m not sure I can give you a good answer.  Let me check that out and find the answer for you and get back with you.”

It seems to work better than making up stuff or thinking out loud.

# Markets in Everything: Souls

In this post, the Freakonomics blogger Stephen Dubner posts an offer from Caleb B. who is looking to buy souls for \$10 to \$50.

Here’s his offer:

…what is it about the idea of a soul that even people who confess to not have one are hesitant to sell it? I have been trying, for the better part of ten years, to buy a soul. I’ve offered a dollar amount, between \$10 and \$50, for someone to sign a sheet of paper that says that I own their soul. Despite multiple debates with confessed atheists, no one has signed the contract. I have been able to buy several people’s Sense of Humor and one guy’s Dignity, but no souls. Additionally, will any Freakonomics reader take me up on this? I’m willing to spend \$50 on souls.

One fellow, Jared, says he’ll take the offer:

Caleb B., I will absolutely sell you my soul. To be fair, this won’t preclude me from selling it again to other suckers who (a) believe in souls and (b) believe they can be readily transferred on purchase. To be clear I’m offering because I don’t believe (a).

Or does he?

In my opinion, Jared’s conditions make this a non-sale, since he’s reserving the right to sell his soul again to someone else.  In other words, he’s just trying to take Caleb’s money.

With these conditions, it would be easy for Jared to get his soul back if he ever decided he’d like to.  He’d just need to find a willing person to sell it to…again… and then buy it back from them.

I love Caleb’s approach.  It taps into what economist Paul Samuelson called revealed preference or, as non-economists say, putting your money where your mouth is.

That is, we say we will behave one way and then we behave differently when we face the actual consequences.

# Changing one’s mind

I listened to two podcasts in a row that happened to touch on the subject that inspired me to start this blog: changing one’s mind.

I’ve always been fascinated by events, arguments, rhetoric, conversations and observations that get people to reconsider their previous ideas on how things work and possibly change their minds.

The first podcast was a Dennis Miller Show podcast of his interview with David Horowitz from September 28.  Miller and Horowitz were discussing the liberal mindset, and Miller made what I think is an apt description of folks who have difficulty changing their minds (and not just liberals).

I don’t view it as a craziness. People are too easy on them when they say they’re crazy. I view it as an obstinacy, a non-curious obstinacy that infects their lives.

I can’t tell you how many people have said to me, ‘how could you work with O’Reilly?’  I go, ‘You ever watch the show?’  They go, ‘No! I wouldn’t watch that show!’

Non-curious obstinacy.   I like that description and I like Miller’s example.

When you encounter folks with a non-curious obstinacy, it’s not worth discussing whatever they are going on about beyond asking whether they’ve ever given it a try or whether they can describe specific examples on which they’ve based their conclusions.

More often than not, the answer is ‘no.’  If so, you can respond, ‘Let me know when you can provide specific examples and I’ll be happy to discuss it then.  Now, let’s talk about something else.’

The second was a Freakonomics podcast, The Folly of Prediction.  The host, Stephen Dubner, was speaking with guest Phil Tetlock, a psychologist at Penn.  Tetlock has extensively studied folks, especially ‘experts’, who make predictions about everything from the economy and the stock market to politics and sports outcomes.  Dubner asks:

…we’re getting into the nitty gritty of what makes people predict well or predict poorly.  What are the characteristics of a poor predictor?

Tetlock answers (after a brief pause):

Dogmatism.

I think an unwillingness to change one’s mind in a reasonably timely way in response to new evidence.  A tendency when asked to explain one’s predictions, to generate only reasons that favor your preferred prediction and not to generate reasons opposed to it.

Dogmatism (def.: the tendency to lay down principles as incontrovertibly true, without consideration of evidence or the opinions of others) is another word for non-curious obstinacy.

Folks don’t like to be wrong.  I didn’t like being wrong, though I have gotten better in this respect.  I still don’t like it sometimes, but I get over it.  We’re trained from a very young age that being wrong is not a good thing.

But I’m not sure why.  Being wrong shouldn’t be a bad thing.  Just about all learning in life is based on trial and error.  That is, we try something based on how we think it should work, and then we find out that we were wrong.  We then try it differently, until we find something that does work.

We tend to learn these lessons best where we pay the costs or consequences for being wrong.  I’ve only turned the wrong way down a one way street once or twice in my life.  Now I’m pretty good at checking the signs and flow of traffic before I turn.

We tend not to learn the lessons as well when we don’t pay the direct costs or consequences for being wrong.  Politics is a good example.  I can vote for someone because I like the way he dresses or speaks, or because he’s ‘better than the last guy’, or because he’s cool, or because I identify better with the others who are voting for him.  And, if I’m wrong, I don’t really care.  We were all wrong.  But, we don’t even really have to admit that.  There are plenty of ways for us to explain it away…”things were worse than we thought,” or “at least we had good intentions, we were trying to help, unlike the other side, who is only for the fat cats.”