
Failing to Plan

I’ve been reading through the letters of Nick Sleep (of Nomad Investment Partners), and they are incredible.

In one letter, from 2008, he quoted Nassim Taleb: “Capitalism does not teach slack, it teaches optimization.”

Sleep goes on: “That is, capitalism teaches that assets must be worked hard, outputs maximised, returns as high as can be.” (Sleep was praising the virtues of ‘slack’ and the value of keeping some spare cash on hand, like an emergency fund.)

We’re all optimization experts in our modern market economies. We all pray at the altar of efficiency.

I had a front-row seat at that “church”.

As young students of industrial engineering, we explored how to build things faster, with less waste and at higher quality. We studied the science of doing exactly that.

But while the bulk of our studies focused on maximizing efficiency, there was a whole body of research within industrial engineering devoted to how systems fail. Enter Human Factors Engineering (HFE).

Systems are fallible, period. Especially when we include individuals, groups and how they interact with machines. Properly deciding which tasks machines do best and which ones humans excel at (i.e. function allocation) is just one of many considerations in the HFE discipline.

I believe the main reason HFE resonated with me so much was that it matched more closely what I understood the real world to be: imperfect, full of risk and far from resilient. HFE wasn’t preaching about how to maximize output like the rest of my engineering studies; it was teaching how to think about maximizing potential (optimizing the journey as well as the destination). (I don’t think I was able to articulate this so succinctly until now.)

The fact that things inevitably fail made complete sense to me, as did the idea that placing blame on one “part” of a system over another was rarely accurate. The root cause was always more complex and largely systemic.

This was a shift in mindset: parts are inseparable from the holistic system, and systems are fallible no matter how resilient they seem on the surface. This one mental model, ingrained in me, has made me a better business thinker and investor. (Coincidentally, the term ‘mental model’ comes in large part from HFE.)

I believe we can assess risks ahead of time, identify and isolate them, and prevent them from happening in the first place. I still believe that holds in investing, in design and in life.

But what makes it even more fun and challenging is a point highlighted by Carl Richards of Behavior Gap: “Risk is what’s left over when you think you’ve thought of everything.”

It’s a never-ending game of another think coming.