Second article in the series on Cynefin, focusing on complexity:
- differences from systems thinking,
- characteristics of complex adaptive systems,
- risk management,
- the difference between simple and simplistic.
Complexity
Systems Thinking and Complexity
Some people think complexity is a subset of systems thinking, when it is in fact radically different.
All systems thinking methods assume that one can define an ideal future state and then organize to close the gap. Even at the end of Peter Senge's book on learning organizations, "The Fifth Discipline", you define the values you want everyone to share: you define an end state and you close the gap. That is what agilists do most of the time too.
Complexity, by contrast, focuses on describing the present: identifying what we can change, identifying where we can monitor the impact of that change (because we cannot change what we cannot monitor), and, among the things we can monitor, spotting where beneficial results might occur or what else might be learned. It is therefore a much more economical approach. We manage the evolutionary potential of the present, which means discovering futures more sustainable and resilient than any we could have anticipated; trying to anticipate them would have made things worse rather than better. Complexity is thus a completely different way of thinking.
Characteristics of Complex Adaptive Systems
1. Sensitivity to small changes
Dave Snowden brings up the butterfly effect: the fact that small things, combined with other small things, can produce big effects.
This does not mean, however, that the same little things will produce the same effect elsewhere.
The term became famous through the image of a butterfly's wing beats eventually setting off a tornado in Texas.
2. Retrospective coherence
It ain't what you don't know that gets you into trouble.
It's what you know for sure that just ain't so.
–Mark Twain
Just because something worked in the past doesn't mean it will work in the future. Indeed, it is interesting to see that, after the fact, it is always obvious to everyone what should have been done (this is what happens when we do a project review), but since we are dealing with complex adaptive systems, the data changes all the time.
Dave Snowden mentions several examples on this subject:
- The cobra effect: faced with a growing number of cobras during the British colonial period, the government set up a bounty program for every cobra killed. Very effective at first, the strategy backfired when people started breeding cobras to collect more money. When the government got wind of the scheme, it shut the program down, and the now-worthless cobras were simply released into the wild, which ultimately increased their population. The same pattern is sometimes called the rat effect.
- "Anything new will always produce good results the first 2 or 3 times it is used." Human beings respond to novelty, which doesn't mean it will scale. There is a big phenomenon in the Agile community right now where people set up very structured methods very quickly in one or two companies and use that to generate revenue through certifications and training, without waiting to see whether it lasts. It is therefore important to be wary of methods that claim success from a limited number of cases.
Hence Dave's approach which is:
Discover what is known about systems and human decision-making from the perspective of the natural sciences, and build methods and tools on grounded scientific theories rather than on past observations and a limited number of cases. This also makes it possible to reject some techniques a priori: they are wrong because they make fundamentally bad assumptions about the nature of systems or the nature of cognition.
In reality, the more we guard against past failures, the more we increase the probability of future failures. That is not to say you don't learn from them, but you don't learn rigidly from an understanding of the past; it is a different process: we tend to talk about learning lessons rather than lessons learned. We need active, real-time feedback mechanisms fed by multiple human sensors, because that is what gives us control; it has to be real-time, not process-based.
3. Proximity and Connectivity
The people we are close to fundamentally influence who we are. Proximity is therefore a key control mechanism in complex adaptive systems: you can only manage a limited number of things, and close interaction is one of them.
4. Dangerous confusions
Correlation vs causation
The common pattern is to compare a large number of successful companies, identify what they do, and tell yourself that doing the same will get you the same result. The problem is that this does not take into account the context. Dave Snowden's example is as follows:
If a hundred agile teams each have a project manager with regular bowel movements, basing your recruitment of project managers on their toilet habits will not guarantee the success of your projects.
Even if regular bowel movements are associated with low stress levels, a correlation is not a causal link.
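To make the distinction concrete, here is a minimal sketch (my own illustration, not from Dave Snowden's talk; the series names are hypothetical): two series that share a common driver, time, correlate strongly even though neither causes the other.

```python
import random

random.seed(42)
n = 100

# Hypothetical data: both series simply trend upward over time
# (e.g. both are driven by "summer"), with independent noise.
ice_cream_sales = [t + random.gauss(0, 5) for t in range(n)]
drownings = [t + random.gauss(0, 5) for t in range(n)]

def pearson(x, y):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson(ice_cream_sales, drownings)
print(f"correlation: {r:.2f}")  # strong, yet neither series causes the other
```

The shared trend dominates the noise, so the correlation comes out very high; a "recruit on this signal" strategy built on such a correlation would be exactly the mistake described above.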
Simulation vs prediction
Nowadays, computer tools let us run all kinds of simulations: for example, the movement of a flock of birds in the sky.
However, simulating the flock does not let us actually predict whether it will move one way or another, and this difference can be significant.
It is this desire for determinism that is both common and dangerous.
5. Keep your options open
Premature convergence is, at bottom, the main danger in complexity: converging too quickly on a solution without leaving options open.
This is the shift from a single, safe design to multiple contradictory safe-to-fail experiments, precisely to prevent premature convergence.
We can draw a connection here with the concept of real options (not to be confused with financial options), a real-world decision-making approach stating that: options have (intrinsic) value, so having multiple options has value; options expire; and one should never commit early in a direction without knowing why.
Risk management
The strategic shift being made around the world right now is to go from robustness to resilience.
Robustness is "surviving unchanged", while resilience is "surviving changed": we go through something, come out of it different, but keep a coherent identity.
Distributed cognition
Dave Snowden's example here is a well-known roundabout. Around it are a football pitch, Intel's European headquarters, three banks, a large shopping mall, a regional hospital and a major road exit. It never jams, has the highest traffic flow of any European traffic junction and the lowest accident rate.
It does have constraints, but constraints that allow emergent behavior. As a driver entering the roundabout, I can choose which path to take based on the flow of traffic, which distributes much of the cognitive process to the drivers themselves. They even slow down as they enter and are much more careful.
From a cognitive-science perspective, creating a system where people slow down and pay more attention is absolutely critical. Human beings are very good at acting adaptively, but if you don't put them in a heightened state of alertness, they will just carry on acting in their usual way.
Centralized cognition
When we set up traffic lights, we are implementing queuing theory: cars pile up, and if the lights ever stop working, no one can cope.
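The contrast between the two junctions can be sketched as a toy discrete-time queue simulation (my own illustration, not from the talk; the parameters are arbitrary). The "roundabout" serves continuously, while the "traffic light" only serves during its green phase:

```python
import random

random.seed(1)

def simulate(steps, arrival_p, service_p, cycle=None):
    """Discrete-time queue: each step, a car arrives with probability
    arrival_p; the junction lets one car through with probability
    service_p, but only when it is 'green'.
    cycle=None models a continuously flowing roundabout;
    cycle=(green, red) models a light alternating between phases.
    Returns the time-averaged queue length."""
    queue, total = 0, 0
    for t in range(steps):
        if random.random() < arrival_p:
            queue += 1
        green = cycle is None or (t % (cycle[0] + cycle[1])) < cycle[0]
        if green and queue > 0 and random.random() < service_p:
            queue -= 1
        total += queue
    return total / steps

flowing = simulate(100_000, arrival_p=0.3, service_p=0.6)
lights = simulate(100_000, arrival_p=0.3, service_p=0.6, cycle=(30, 30))
print(f"roundabout-style average queue: {flowing:.1f}")
print(f"traffic-light average queue:    {lights:.1f}")
```

With the same arrival and service rates, halving the serving time pushes the light-controlled junction to its capacity limit and the queue balloons, while the continuously flowing junction stays short; this is the "pile up" behavior the paragraph describes.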
Bjarte Bogsnes used the same comparison in his presentation of Beyond Budgeting, which I described here on the Good! blog.
Simple but not Simplistic
There is a huge phenomenon going on right now: the desire for something simple, popular and easy to understand, which gives us simplistic formulas. This is a fad.
Six Sigma is to innovation what SharePoint is to knowledge management and what SAFe is to Agile: a desire to create something very simple, very familiar and very structured, so that people don't have to deal with the reality of life or the reality of change.
– Dave Snowden
Just because something is popular and works today doesn't mean it will last.
Making things structured for senior executives when they need to live in an unstructured world doesn't help them in any way; it only increases the possibility of considerable disasters downstream. We have to keep the ambiguity and change the way we work. (cf. Correlation vs causation)
Conclusion
Complexity theory broadens our perspective by giving us this different view of the world. With the technological acceleration we are experiencing, uncertainty is inevitable, and we must learn to manage that doubt rather than trying to eliminate it at all costs. The nuance that complexity brings to systems thinking can thus help us drive the change in mental models needed for deep innovation in companies and constant adaptation by teams.
The other articles in the series:
- Part 1: The Cynefin framework
- Part 2: Let's talk a bit about complexity
- Part 3: Some additional notions
References
- Making Sense of complexity: https://www.youtube.com/watch?v=y6RfqmTZejU
- Simple but not simplistic: https://www.infoq.com/fr/presentations/scrumday-dave-snowden-simple-but-not-simplistic
- An excellent article from SOAT: http://blog.soat.fr/2014/01/le-framework-cynefin-et-la-gestion-des-connaissances/
- Wisdom of Crowds: https://fr.wikipedia.org/wiki/La_Sagesse_des_foules
- Strategic Understanding with Prof. Dave Snowden: https://www.youtube.com/watch?v=PFi9mIlp2NY
- Think new, Act new: https://www.youtube.com/watch?v=s8SayvnfQi0