Critical thinking skills are in the tank. Rarely are the merits of an issue debated. Discussions go down several common rabbit holes. I wrote about one in a previous post, where you or your opponent will not be honest about the other’s position.
Another common rabbit hole is the appeal to authority, or expert fallacy. In a discussion on how to fix health care, a medical doctor chimed in with an opinion that agreed with my opponent. The opponent declared, “This guy works in the profession. I trust his opinion.” The discussion went south as we debated why being a doctor doesn’t make your opinion on the organization of health care necessarily true. When a medical doctor who agreed with my view chimed in, the same opponent proceeded with an ad hominem attack to discredit him, not bothered in the least by his blatant inconsistency. I no longer argue with this guy. I use him as practice for identifying fallacies.
We don’t commit the expert fallacy only when the so-called expert agrees with our position. In Fooled by Randomness, N. N. Taleb writes about a different variation of the expert fallacy. He distinguishes between fields where being an expert helps and fields where it doesn’t, and argues that few people make that distinction. Dentists are expert enough in dental care to cure a toothache. We incorrectly confuse the type of skill the dentist has with the type of skill economists or investment managers have. Outcomes in those fields are far more random, and success depends more on luck.
Why do we fall victim to these expert fallacies? Because critical thinking skills are in the tank. We accept the pronouncements of “experts” without question. Does it matter that they are proven wrong all the time? Does it occur to us that they are human and susceptible to the same biases as the rest of us?
I’ve been told you’ve found a good doctor if she strongly encourages you to seek a second opinion because she could be wrong. That’s a good litmus test for other “experts” as well. Think twice if they don’t readily admit they might be wrong. I’m not wrong about this.