I listened to two podcasts in a row that happened to touch on the subject that inspired me to start this blog — changing one’s mind.
I’ve always been fascinated by events, arguments, rhetoric, conversations and observations that get people to reconsider their previous ideas on how things work and possibly change their minds.
The first was a Dennis Miller Show podcast of his September 28 interview with David Horowitz. Miller and Horowitz were discussing the liberal mindset when Miller offered what I think is an apt description of folks who have difficulty changing their minds (and not just liberals).
Miller says (emphasis added):
I don’t view it as a craziness. People are too easy on them when they say they’re crazy. I view it as an obstinacy, a non-curious obstinacy that infects their lives.
I can’t tell you how many people have said to me, ‘how could you work with O’Reilly?’ I go, ‘You ever watch the show?’ They go, ‘No! I wouldn’t watch that show!’
Non-curious obstinacy. I like that description and I like Miller’s example.
When you encounter folks with a non-curious obstinacy, it’s not worth discussing whatever it is they are going on about beyond asking if they’ve ever given it a try or if they can describe specific examples on which they’ve based their conclusions.
More often than not, the answer is ‘no.’ If so, you can respond, ‘let me know when you can provide specific examples and I will be happy to discuss it then. Now, let’s talk about something else.’
The second was a Freakonomics podcast, The Folly of Prediction. The host, Stephen Dubner, was speaking with guest Phil Tetlock, a psychologist at Penn. Tetlock has extensively studied folks, especially ‘experts’, who make predictions about everything from the economy and the stock market to politics and sports outcomes. Dubner asks:
…we’re getting into the nitty gritty of what makes people predict well or predict poorly. What are the characteristics of a poor predictor?
Tetlock answers (after a brief pause):
I think an unwillingness to change one’s mind in a reasonably timely way in response to new evidence. A tendency when asked to explain one’s predictions, to generate only reasons that favor your preferred prediction and not to generate reasons opposed to it.
Dogmatism (def.: the tendency to lay down principles as incontrovertibly true, without consideration of evidence or the opinions of others). That’s another word for non-curious obstinacy.
Folks don’t like to be wrong. I never liked being wrong either, though I’ve gotten better in this respect. I still don’t like it sometimes, but I get over it. We’re trained from a very young age that being wrong is not a good thing.
But, I’m not sure why. Being wrong shouldn’t be a bad thing. Just about all learning in life is done based on trial and error. That is, we try something based on how we think it should work, and then we find out that we were wrong. We then try it differently, until we find something that does work.
We tend to learn these lessons best where we pay the costs or consequences for being wrong. I’ve only turned the wrong way down a one way street once or twice in my life. Now I’m pretty good at checking the signs and flow of traffic before I turn.
We tend not to learn the lessons as well when we don’t pay the direct costs or consequences for being wrong. Politics is a good example. I can vote for someone because I like the way he dresses or speaks, or because he’s ‘better than the last guy’, or because he’s cool, or because I identify more with the others voting for him. And, if I’m wrong, I don’t really care. We were all wrong. But, we don’t even really have to admit that. There are plenty of ways for us to explain it away… “things were worse than we thought,” or “at least we had good intentions, we were trying to help, unlike the other side, who is only for the fat cats.”