Complexity is a great cauldron of failure, and this month Harvard Business Review devotes its cover story to embracing it.
Gokce Sargut and Rita Gunther McGrath collaborated on the first feature, ‘Learning to Live With Complexity’, which explores the essence of ‘Black Boxes’ – complex systems that are a curious and increasingly common feature of modern living, where the failure to comprehend, predict and control is a core quality.
“Complex systems have always existed, of course – and business life has always featured the unpredictable, the surprising, and the unexpected. But complexity has gone from something found mainly in large systems, such as cities, to something that affects almost everything we touch: the products we design, the jobs we do every day, and the organisations we oversee. Most of this increase has resulted from the information technology revolution of the past few decades. Systems that used to be separate are now interconnected and interdependent, which means that they are, by definition, more complex.”
“Complex organisations are far more difficult to manage than merely complicated ones. It’s harder to predict what will happen, because complex systems interact in unexpected ways. It’s harder to make sense of things, because the degree of complexity may lie beyond our cognitive limits. And it’s harder to place bets, because the past behaviour of a complex system may not predict its future behaviour. In a complex system the outlier is often more significant than the average.”
The distinction between ‘complicated’ and ‘complex’ is a powerful and useful one. “Complicated systems have many moving parts, but they operate in patterned ways.” For example, air traffic control. “Complex systems, by contrast, are imbued with features that may operate in patterned ways but whose interactions are continually changing.” A critical difference is the ability to predict the outcomes of the system: in complicated systems you can; in complex systems you cannot.
The article goes on to explore a number of problems of complex systems that need to be appreciated…
- Unintended consequences – “events interact without anyone meaning them to.”
- Aggregation of individual elements – the whole is greater than the sum of its parts.
- Legacy protocol – “policies and procedures remain in place long after the reason for their creation becomes obsolete.”
- Vantage point problem – can’t see the forest for the trees.
- Cognitive limits – “most executives believe they can take in and make sense of more information than research suggests they actually can.”
- Distraction – “focusing on one thing can prevent us from seeing others.”
A series of prescriptions is offered that embraces the failures such systems will inevitably throw up, through more sophisticated risk mitigation…
- Limit or even eliminate the need for accurate predictions.
- Employ decoupling and redundancy.
- Draw on storytelling and counterfactuals – explore the stuff that didn’t happen (e.g. near misses).
- Triangulate – draw from a number of angles rather than looking for the ‘right’ one.
- Exploit a real-options approach – “making relatively small investments that give you the right, but not the obligation, to make further investments later on.”