Bias
I read a pretty interesting article in the SMH today. Michael Duffy writes about a new book by Nassim Nicholas Taleb called "The Black Swan". It describes how "experts" in their field are often worse at predicting outcomes than a lay person, or even random choice. (I always thought that Phil Gould was a lousy tipper.) Of course, I've not read the book yet, having only heard about it today. I'm intrigued, however, and for many reasons.
Duffy drops a G.K. Chesterton quote into his article quite clumsily. "A person who no longer believes in God will believe in anything". I'll reserve judgement on that. Perhaps I'll post another time about my belief system, however it certainly doesn't involve the traditional happily-ever-after.
One piece that resonated with me was a point made about "narrative fallacy". This is where the cause of an event is explained through a description of the events leading up to it. The premise is that this cause is only seen after the fact, and therefore it is often false. This is effectively hindsight bias - and that is why this interests me so much. After something happens, the fact that it did happen makes it appear much more likely that it should have happened that way. I.e. I-knew-it-all-along. Or more importantly, you-should-have-known-it-all-along-and-done-something-about-it.
Connected to this issue is the fact that we humans tend to have a pretty poor ability to estimate risk at the high-consequence, low-probability end. We consistently overestimate how likely such events are to occur.
Now why is all this important? Even if you have not understood much of what I've been saying here, there is still a point to be had. These cognitive "problems" (hindsight bias, expert false prediction, poor risk estimation) cause chaos for me every day of the week. That is, every week day.
In my work, I have to deal with management of risks (to the business and to people), and I have to solve problems. The cognitive problems I listed effectively cause conflict in the workplace and limit my productivity. I'll explain...
Say a person is injured quite badly in a sister factory on the other side of the world. Every other plant then has to implement improvements to ensure that this injury does not occur in their plant. Seems reasonable? Yes, but also no. Prior to the accident, we had not considered that this type of hazard existed. In many ways, we still don't. So is the problem that we are reacting with hindsight bias, or that we were not good enough at recognising the hazard and should be thankful we can learn from this injury? I think more than likely the former. This is the opinion of everyone else in the plant too, and that causes conflict. Not only is time and money invested in the wrong way, but the message is sent that local issues and knowledge are less important and less trusted than upper management's directives. This reinforces the barriers of mistrust between management and workers.
This type of thing happens all the time. I pull my hair out. I'll have to get a copy of The Black Swan. Sounds like it's right up my alley. Of course, that would simply feed my confirmation bias.