More on Understanding Safety / SMS

Systems Thinking & Practical Drift

The models used in investigation and analysis are based on assumptions and simplifications. Rarely does a model capture the complexity of the organisation and the industry. Traditional models require a specific point of deviation along the timeline, a component failure, which assumes a linear model. But a system is not about parts and components; it is about the relationships between them. It is about the whole, which is more than the sum of the individual functions.

But a closer look at the data reveals that even perfectly run organisations can fail, simply because of performance drift over time. These failures are the result of a systematic migration of organisational behaviour, driven by an aggressive financial and competitive environment.

In simpler words, the model of cause and effect, a reaction for every action, is no longer universally valid. Energy containment using barriers and layers of defences does not guarantee an accident-free workplace. It is not extreme to accept that we may not even be 100% sure of our understanding of complex systems.

Technology has greatly advanced and we can now create marvels of engineering and science. When isolated, these marvels have precise properties and specific behaviours. Put them in a complex, ever-changing environment of competitive interactions and regulated societies, with exponentially multiplying interactions and interdependencies, and the models no longer represent reality.

Complexity is the result of numerous interactions between diverse components, which make the number of possible outcomes so high that we find it almost impossible to foresee the results, and we accept that failure is always probable. Complex systems are therefore called open: it is difficult to frame their boundaries, and each component is ignorant of the behaviour of the system as a whole.

A complex system is bounded by the economic boundary, the workload boundary and the safety boundary. Beyond the economic boundary lies financial bankruptcy. Beyond the workload boundary the resources are inadequate to complete the task, whilst beyond the safety boundary we have a catastrophe. Any task or operation takes place in the manoeuvring space enclosed by these three boundaries, and every attempt to extend one boundary requires a trade-off against one of the others. As such, complexity is a feature of the system itself and not of each individual component: the interactions of the components are non-linear, and input and output are not symmetrical (small inputs can cause large outputs).
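
As a rough illustration of that manoeuvring space, consider the toy sketch below (all numbers and pressures are invented for illustration; this is not a standard model). An operating point is pushed away from the two boundaries that give immediate feedback, cost and workload, while nothing pushes back from the invisible safety boundary:

```python
import random

# Toy model of a three-boundary "manoeuvring space" (illustrative only).
# The operating point lives between:
#   x = 0     -> economic boundary (bankruptcy beyond it)
#   y = 0     -> workload boundary (inadequate resources beyond it)
#   x + y = 1 -> safety boundary (catastrophe beyond it)
# Cost pressure pushes away from the economic boundary (+x), effort
# pressure pushes away from the workload boundary (+y), and no pressure
# pushes back from the safety boundary until it is crossed.

random.seed(1)
x, y = 0.3, 0.3          # starting operating point
COST_PRESSURE = 0.004    # steady push away from bankruptcy
EFFORT_PRESSURE = 0.004  # steady push away from overload
NOISE = 0.02             # everyday performance variability

for step in range(1, 201):
    x += COST_PRESSURE + random.uniform(-NOISE, NOISE)
    y += EFFORT_PRESSURE + random.uniform(-NOISE, NOISE)
    x, y = max(x, 0.0), max(y, 0.0)   # the visible boundaries push back
    if x + y >= 1.0:                  # the safety boundary does not
        print(f"step {step}: safety boundary crossed at ({x:.2f}, {y:.2f})")
        break
else:
    print(f"still operating at ({x:.2f}, {y:.2f}) after 200 steps")
```

The only point of the sketch is that the economic and workload boundaries correct the operating point immediately, while the safety boundary gives no feedback at all until it is crossed, so all the everyday gradients point the same way.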

On top of all the other factors we have a controlling factor, the human factor, which brings even more complexity, as human decision-making cannot be fully modelled. The local rationality principle, as psychology calls it, is what complicates these models. Humans tend to do what makes sense given the situation, the operational pressures and the norms and culture existing at that specific time. Decision-making can never be optimal, as we rarely have perfect access to all the relevant information, adequate time for consideration and clearly identified goals or preferences. And even if we had all of those, our information-processing capacity is nowhere near what would be required to use them.
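
Herbert Simon's notion of "satisficing" is one common way to picture local rationality. Here is a minimal sketch (the scenario, option names and utility numbers are invented for illustration): with a limited time budget, the decision-maker takes the first option that clears an aspiration level, rather than searching for the true optimum.

```python
# Illustrative sketch of local rationality as satisficing: with limited
# time and information, take the first option that seems good enough.
# All names and numbers below are invented for illustration.

def satisfice(options, score, aspiration, time_budget):
    """Return the first option whose score meets the aspiration level
    within the time budget; otherwise fall back to the best seen."""
    best = None
    for considered, option in enumerate(options, start=1):
        if considered > time_budget:      # no time to evaluate more options
            break
        if best is None or score(option) > score(best):
            best = option
        if score(option) >= aspiration:   # "makes sense given the situation"
            return option
    return best                           # never optimal, just good enough

# A dispatcher choosing a turnaround plan under time pressure:
plans = ["defer check", "swap aircraft", "extend crew duty", "cancel leg"]
utility = {"defer check": 0.55, "swap aircraft": 0.9,
           "extend crew duty": 0.6, "cancel leg": 0.2}.get

print(satisfice(plans, utility, aspiration=0.5, time_budget=2))
# -> "defer check": locally sensible, yet "swap aircraft" (0.9) was better
```

The choice looks irrational only from outside and after the fact; inside the situation, stopping at "good enough" is exactly what limited time and information demand.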

Drift occurs because of uncertainty and competition, in small, mostly invisible steps. Each time, the accepted norm is pushed a little further from optimal performance, and the new situation becomes the new norm. Rules then follow the evolving practices instead of the practices being based on rules. In complex systems, though, past success cannot guarantee future success: a norm that currently works is not to be taken as a source of confidence about future conditions.
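
The mechanism can be caricatured in a few lines (a toy model; the step size and risk curve are invented): every deviation that does not end in failure becomes the new baseline, while the underlying risk grows non-linearly.

```python
import random

# Toy model of practical drift: each small, successful deviation from
# the norm becomes the new norm, so risk accumulates even though every
# single step still looks safe. Numbers are purely illustrative.

def failure_prob(deviation):
    """Risk grows non-linearly with deviation: small inputs, large outputs."""
    return min(deviation ** 2, 1.0)

random.seed(7)
norm = 0.0       # accepted deviation from the original rule
STEP = 0.02      # "just a bit further" each time

for month in range(1, 61):
    trial = norm + STEP                    # push the norm a bit further
    if random.random() < failure_prob(trial):
        print(f"month {month}: failure at deviation {trial:.2f}")
        break
    norm = trial                           # success becomes the new normal
    if month % 12 == 0:
        print(f"month {month}: norm is now {norm:.2f}, "
              f"failure risk {failure_prob(norm):.1%}")
```

Each monthly step is tiny and each past success reinforces confidence, which is precisely why the accumulated distance from the original rule stays invisible until it is not.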

A big influence on drift comes from the starting point and the interactions. In each interaction there is a form of exchange, an adaptation technique, to achieve continuous operation. The result is an erosion of the protections set in place to prevent failure. But survival and progress are completely different things. System complexity and drift thinking aim to move us beyond hindsight: merely tracing back the events and finding the critical point in time that caused the failure, without ever identifying how we could have detected the drift before that critical point.

We need to recognise that we are faced with two kinds of variables: explanatory variables and change variables. Explanatory variables can explain why an event happened, but they are not necessarily the variables we should shift our attention to in order to prevent recurrence. Reinforcing a barrier in one corner can create vulnerabilities elsewhere, because of the explosion of relationships and changes. We have to give up the analytic, reductionist method of breaking the system down into ever smaller components, going further down and in. The right approach is to go up and out: to step outside the system and look at it as a whole. An active comparison can be made with the most complex system we know, the human body, where a malfunction in one internal organ can produce a seemingly unrelated effect somewhere else entirely, or a minor change in the DNA can bring out far-reaching differences through the resulting relationships.

What keeps alive the belief that safety is a matter of cause and effect is our assumption of reality, which lets events go unnoticed or misunderstood. The rigid perceptions and beliefs that cause complaints, reports and warning signals from outsiders to be disregarded are what need to be tackled first. This requires commitment and a shared care and concern, embodied in realistic and flexible norms that continue to reflect constant monitoring, analysis and feedback.

Safety cannot be guaranteed by increasing the reliability of individual components or by adding layers of defence against component failures. Today's specialisation in every field makes it even more difficult for people to understand the system as a whole.

Practical drift is the result of people acting in ways that better align with their perception of demand at the time. Rules become less important whilst task demands become more important. This is the nature of human adaptation. Paradoxically, all this complexity resulted from the need to make each component more reliable!