For years my discipline to run regularly had been slipping. As a last-ditch effort, I joined the running group I used to run with some 15 years back. Well, not the whole group per se — it is a few of the old runner friends who still get together to run.
That has made all the difference. Not only am I running a lot more regularly now (today was my fourth successive day of running), the peer pressure has also improved my speed. Valerie pushed me to go the extra mile and make it a 4-mile run today, and we did it at a sub-10-minute-mile pace.
My solo runs are far slower.
by: Chris Clearfield and Andras Tilcsik
I got to know about this book from the book club group in our office. I was actually in the middle of a different book, but the synopsis looked interesting enough that I paused the other one and started reading this instead.
This book deals with catastrophic failures — the Challenger explosion, the Deepwater Horizon blowout, the wrong movie announced as the winner at the Oscars — and tries to identify their common traits and how to avoid such failures in the future.
The key takeaway is that a system needs two different traits to give rise to catastrophic failure. One is that the system has to be “complex”. By that the authors do not necessarily mean scale — they mean systems whose parts are likely to interact in hidden and unexpected ways. When something goes wrong, multiple components fail and it is difficult to understand the root cause.
The other trait is “tight coupling”. Loose coupling means there is enough slack that if one component fails, the failure won’t cascade to others. Tight coupling means that if one component fails, others start failing too, and the cascade cannot be stopped. Ironically, safety systems are the single biggest source of catastrophic failure in complex, tightly coupled systems.
In “wicked” environments, where feedback is scarce (as opposed to “kind” environments with frequent feedback), this becomes even more problematic.
A couple of techniques the authors suggest:
(*) Subjective Probability Interval Estimates: instead of predicting a flat yes or no, or 99% vs 1%, assign probabilities across a range of intervals.
(*) Premortem: assume things have already gone wrong. Now look back and predict what the reasons might have been.
A quote I liked a lot: “We construct an expected world because we can’t handle the complexity of the present one, and then process the information that fits the expected world, and find reasons to exclude the information that might contradict it. Unexpected or unlikely interactions are ignored when we make our construction.”
Some watch-outs that are good pointers for corporate leaders too:
More often than not, we don’t account for how often luck is the reason a system has not broken down. We take the absence of failure as evidence that the system is fine (outcome bias).
As a leader, support dissenting opinions by speaking last.
Our tendency toward conformity can literally change what we see. Diversity in a team feels less familiar and less comfortable — but with that discomfort, we tend to be more objective and less likely to simply go along.
Homogeneous groups create a comfortable feeling of familiarity. Unfortunately, this leads to doing less well in complex situations AND to feeling confident about the same wrong decisions.
Some other interesting findings: the most frequently used diversity programs didn’t increase diversity. In fact, they made firms less diverse. Voluntary diversity training is what yielded results; managers need to feel it was their own decision to participate.
Anyway, it is a very good read. Anybody will find some aha! moments from their own life and work in it.