Friday, November 23, 2012

Confirmation Bias as a form of Lying

Dear KC,

We lie to each other all the time. I do not mean conscious untruths, which I am sure happen more often than we would care to admit, or the white lies we tell to make others feel comfortable. I mean something else, something I am not sure anyone else would define as lying, but which bears the same relationship to lying that involuntary manslaughter bears to murder. The problem is not the intent, but the ability to know better.

It's interesting watching or participating in those idea exercises that all seem to come from the same manual in business school. You know what I mean. A facilitator or teacher asks some questions or has some prompts presented on flip charts or poster board. Everyone is then asked to write down their thoughts on sticky notes or posters, and then group and analyze the results. Which is fine, and is often useful enough in class to generate ideas. It's not something I've managed to use yet to teach stoichiometry, for example, but perhaps it's just a sign of my own lack of creativity.

What I am not sure of is how useful the process is in identifying problems or giving non-obvious feedback. I've seen the process a few times now, and each time it generates a list of ideas or solutions and clearly identifies which ideas are in the majority. I'm not sure how different it is from a poll, which is a lot quicker, or how identifying what everyone already knows is useful when everybody already knows it. I presume it's useful if you have management or administration that's completely out of touch with its employees, which likely means you have bigger problems with regard to communication. It is a problem in the same vein as XKCD's heat map: determining the obvious and making it seem like new information.

At some point confirmation bias and cherry-picking have to take over. In any data-gathering exercise the majority of people are going to state the obvious. The obvious will then confirm what everyone expected; there will be great rejoicing. Any new datum, anything unusual that might be important or that runs counter to everyone's expectations, stands a chance of being drowned in the sea of familiar and comfortable ideas.

The attitude prior to the Great Recession seems a clear example of this. In The Signal and the Noise, Nate Silver suggests that the signs of the coming recession were not hidden or inaccessible - what was missing was the desire and will to evaluate the evidence without a bias toward what we wished to be true. The failure was not in failing to see the recession coming. The failure was in paying attention only to data confirming that everything was fine and would continue to be fine. I wonder: if a group of financial professionals had gotten together before 2008 and done the 'poster/stickynote' exercise, would they have managed to see the evidence without bias, or would the exercise have reinforced the biases they already held? I suspect reinforcement. (And if you haven't read Nate Silver's book, do so if you have any interest in the topic. It's a well-written explanation of a complex issue that, as far as I can tell, doesn't dumb down any of the important concepts.)

More prosaic examples would be protective parents who only pay attention to anti-vaccination information, or the creationist who can read a hundred articles on evolutionary theory and see only contradictions. Seeing beyond what we want to see requires reminding ourselves that our own bias is larger than we believe, and that the bias in others is probably smaller than we think.

This is not a critique of the process, or a damning of flip-chart idea generation. Predicting developing problems or opportunities is difficult; it may be that this is the best method available, and all methods have their problems. Where the lying comes in is in how often we seem to deliberately pretend that confirmation bias happens only to other people. In today's world, with its ubiquitous access to information, ignorance can no longer be an excuse. If a group is generating ideas, then not mentioning such pitfalls amounts to tacitly allowing them to occur.

I'm not sure this type of lying has a word, but it needs one. We have white lies and (presumably) black lies, lies of omission, and bluffing. Lying by allowing our natural human tendency toward self-delusion needs its own label. Accusing someone of lying sounds far too strong for what this is. In our culture, an accusation of lying can derail a discussion on the nature of truth rather forcefully as egos and insulted feelings take over, but calling the phenomenon something would allow us to highlight it when it occurs.

Sadly, I will not be the one to name it. Like Leonard of Quirm, if I were to name something it would be "the-lie-that-results-from-inadequately-accounting-for-psychological-processes-common-to-all-humans-in-a-world-where-such-processes-are-easily-knowable". Not very catchy. Psych-lie? Probably not.

It did get me thinking, though. What do I currently hold to be true that is clearly false? What confirmation biases have been at play in my own teaching career? What scientific 'facts' have I never questioned? I'm happiest criticizing anti-vaxxers or Holocaust deniers, areas where ample evidence shows the other side is wrong. In what must be a fairly obvious beam in my own eye, I have never really looked at teaching in the same way.

So I started investigating.

Regards,

Ron
