NSCA Foundation

Leadership and 'Human at Fault' thinking

RiskWise Solutions

By Goran Prvulovic, MOccHlth&Saf
Wednesday, 01 February, 2017


Few people will dispute that human behaviour, and human actions in particular, are commonly implicated in accident causation. The key word here is ‘implicated’.

It is astonishing how quickly and easily we can travel from ‘implicated’ to ‘responsible’ and ultimately to human error being a ‘root cause’ of accidents. It often happens in the blink of an eye, through specific cognitive processes mostly associated with the Western view of the world and involuntary effects such as the fundamental attribution error. Not surprisingly, this internal bias often manifests in accident investigations, where we find people to be responsible for accidents, ‘just as we originally thought’.

In describing this internal bias, Professor James Reason rightly points to the ‘illusion of free will’ (Reason 2008). This deep and often erroneous belief in humans as free agents is a core issue affecting our judgement. Believing that we are capable of guiding our own fate and taking consistently correct actions towards certain goals, despite the array of environmental, sociological and systemic influences around us, is the key barrier to our ability to see beyond human error.

As one senior leader put it to this author, “… we give workers systems, processes and good tools. We also tell them that it is ok to stop the job if something is unsafe. There is absolutely no excuse for doing the wrong thing and having an accident.”

If only safety and risk management were that simple. What we believe to be under the complete and direct control of people and the actual reality those people face are two very different things. In the real world, work execution is much more complicated. Incompatible and conflicting goals, shifting priorities, work pressures, and poor leadership with its unconscious messages are just some of the many additional factors critically influencing the operational execution of work.

Sadly, despite progress in safety and risk across industries, a lack of quality leadership and of understanding of human factors continues to dominate many boardrooms and executive levels. This further drives the tendency to blame those at the sharp end and to punish human errors, rather than to self-reflect when looking for accident causes. One of the basic hallmarks of good leadership is the ability to think along the lines of ‘What could I have done differently to prevent this from happening, and what do I need to do right now?’ Not surprisingly, this type of critical thinking remains as elusive as it is uncomfortable. It is far easier and more emotionally satisfying to stop at human error and discharge the accountability of senior decision-makers by sending good and competent people out of organisations, often for nothing more than being human. It may sound harsh, but this is a cruel fact of life in many industries today.

Despite overwhelming evidence to the contrary from modern psychology, neurology and the study of human error, the illusion of free will and the tendency to find blame are so deeply ingrained in human decision-making that they have almost been normalised in many aspects of our lives. This is particularly visible in the legal system, where the need for social justice often takes precedence over the latest scientific research. It is also strongly represented in the behaviour of various regulatory authorities and reinforced by media-driven blame in the reporting of major accidents and catastrophes. The recent movie Sully, depicting the events surrounding US Airways Flight 1549, illustrates this well. In that event, the conscious and unconscious instinct of investigators and bureaucratic figures was to default immediately to ‘illusion of free will’ thinking: analysing human actions in hindsight, as if the human were a machine with the luxury of time, ability and opportunity to rationalise every possible scenario before supposedly acting ‘erroneously’. The movie has a happy ending for the pilot and his first officer; however, many other professionals across various industries are not afforded the same wisdom and continue to suffer as a result of poor leadership and this archaic thinking.

It is staggering that despite so many scientific breakthroughs related to the power of unconscious thinking, especially in terms of the speed, reliability and accuracy of unconscious and heuristic decision-making, we are still not adequately understanding, sharing, promoting and implementing knowledge of human factors in practice on a mass scale. And let’s face it: if internally embedded safety professionals, especially those in very senior positions, don’t unlearn the ways of the past, learn how to think differently and promote this knowledge to senior leadership and management in organisations, how can we expect change?

Developing a safety culture

As safety professionals, we need to acknowledge that our efforts in occupational health and safety are still mechanistic and focused on forcing people into predetermined operational and procedural envelopes under the cover of internal or external compliance. Some great examples aside, in many cases what starts as a compliance requirement degenerates into blind compliance, which creates a culture where people are discouraged from thinking for themselves, reluctant to engage and contribute, and disempowered from making the right decisions and challenging work methods. Where this culture exists, a culture of safety and true continuous improvement cannot flourish, and operational as well as safety performance usually stagnates. Sure, leadership may be lacking in many regards, but there is no escaping the fact that safety professionals have a lot to answer for when it comes to this state of affairs, including the never-ending tendency to write ‘yet another procedure’, introduce another ‘blanket rule’ or ban yet another ‘dangerous’ tool.

Are we sure that we are promoting the right things, or are we suppressing the natural human ability of our workers to be safe and productive? Rather than police, suppress and blame, should we instead teach, mentor, empower, engage and coach?

People are not machines that need to be programmed or controlled; quite the opposite. People are solutions and agents of successful recovery, as proven time and time again in real-life accidents and disasters. From the perspective of risk management in modern complex industries, the human ability to recover ‘out of control’ situations is far more profound than the inherent human propensity to err; however, the intangible nature of safety is such that the effects of human errors tend to be publicised much more readily than instances of humans recovering from seemingly impossible problems. Human behaviour can certainly be a hazard; however, our continuous improvement in safety and risk must lie in further promoting the human ability to be a solution, so that we can harness that still largely untapped human potential. The benefits can be tremendous and the change is within our grasp; however, for this to occur an entirely different way of thinking is required in how we see people and their actions: away from the allocation of blame and towards connecting human actions at the sharp end to the behaviour and practices of senior leaders.

Instead of forcing people into mechanistic compliance and blaming or punishing them for human errors, we need to enable creativity, foster collaboration with workers at the sharp end and develop flexible, error-tolerant systems capable of absorbing specific and identified types of human error. Flexibility and error tolerance in organisational systems and processes are an integral and critical part of operational discipline and a culture of safety. They are essential for achieving and sustaining a culture of resilience, especially in complex industries such as chemical, nuclear and aviation.

To be clear, having systems and rules is not bad; in fact, it is essential. However, those rules and prescribed requirements need to be balanced, meaningful and reasonable, and they need to cater for human fallibility. They must be developed by the people involved in the execution of the work. Above all, they need to be flexible and able to be continuously modified as team skills are enhanced or reduced and as new error-promoting conditions are discovered. For this culture of flexibility to exist, the culture of safety leadership needs to be very mature, and senior leadership needs to be knowledgeable and abreast of the latest thinking in safety and risk.

Image credit: ©stock.adobe.com/au/Tom Wang
