Learning from mistakes

Error culture is our partly hidden script for dealing with undesirable developments. This article uses the example of technical diving to show how the path to psychological safety can be shaped, and it may serve as a suggestion for transferring these ideas into operational practice.

Error culture is one of the big ongoing topics in risk management. Culture means dealing with ideas that are implicit and operate in the background. You can see this, for example, in how safety protocols are interpreted differently against the background of different national cultures, genders and age groups. Error culture is our partly hidden script for dealing with aberrations. This becomes strikingly clear when you take a risky field that is usually managed only semi-professionally, if at all, and analyse it. In this article that field is "technical diving", i.e. diving beyond the limits of recreational diving: depths of more than 40 metres, dives in caves and wrecks lasting several hours, and heavy decompression obligations on the way back to the surface. Here, a tiny mistake inevitably has the most serious consequences for those involved. So how do technical divers implement error culture in their demanding practice?

The architecture of accidents

In technical diving, the circumstances of accidents are, in principle, relatively clear. Everyone is only too familiar with the accident pyramid, which is also frequently used in diving accident reporting.

The pyramid makes it clear that behind every fatal accident there are many times more serious incidents and many times more medical treatments, and beneath these some 3,000 assistance incidents that are normally not recorded very well. The accident reports published annually by professional associations are attempts to raise risk awareness. But are they really used as a learning opportunity?

If you look at how accidents are interpreted, you will mostly find trivial recommendations that do only limited justice to what actually happened. The frequently read recommendation "Never dive alone!", for example, never addresses why the divers were alone in the first place, and it is too easy simply to invoke the rule without getting to the bottom of the causes. This is the case with almost every accident analysis: before it gets too hypothetical, accident analysts usually end their report, because otherwise they could be accused of being unobjective. Incidentally, Kaizen and Lean Management solved this long ago with the "5-Why method", in which any undesirable development is investigated by asking "why?" five times in succession.
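As an illustration of how the method digs below the surface-level rule, here is a minimal sketch of a 5-Why chain for the "diving alone" case, written in Python. The questions and answers are hypothetical; in a real analysis they come from the people involved, not from a script.

# Sketch of the 5-Why method applied to the "Never dive alone!" example.
# All answers are hypothetical and only illustrate the questioning pattern.

why_chain = [
    ("Why did the accident become fatal?", "The diver was alone at depth."),
    ("Why was the diver alone?", "The buddy team separated during the dive."),
    ("Why did the team separate?", "There was no agreed procedure for losing contact."),
    ("Why was there no agreed procedure?", "The pre-dive briefing was skipped to save time."),
    ("Why was the briefing skipped?", "The team treats briefings as optional, not as part of the dive."),
]

for question, answer in why_chain:
    print(f"{question}\n  -> {answer}")

# The chain ends at a condition the team can actually change (its briefing
# culture), not at the superficial rule "never dive alone".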

Also very well known is the so-called Swiss Cheese Model of error prevention by James Reason, dating from around 1990, which vividly illustrates how errors come about. In abstract terms, mistakes always have a systemically complex background that can rarely be traced in a linear fashion. Even where a chain of errors can be reconstructed, how the "latent error" was triggered usually remains in the dark (for example, the 60-year-old diver who had an accident and had paid attention to everything except the fact that his physical fitness is no longer that of a 20-year-old).

Many domains are simply "not aviation".

Undesirable developments slip, so to speak, through all protective barriers: no checklist can stop them completely, no protocol can rule them out. Protective measures can, however, increase the probability that an undesirable development does not march all the way through. But protective measures are not free; at the very least, they cost time and attention. If mistakes happen and guilty parties are blamed, this invites people to hide mistakes. The causes of errors then go unnoticed or are treated only superficially, which is harmful in the long run, because the same mistakes happen again. In aviation or medicine, mistakes cause considerable damage, so it is all the more important there to reduce the number of errors as far as possible. This is attempted, among other things, with Crew Resource Management: training courses that strengthen team building, situational awareness and stress management, for example. Anyone who dives in the technical field will never reach a safety standard like that of professional aviation, and the same is unfortunately true of many other operational environments (think of theatre stages!). Wherever you are, it will always be a decision that also weighs the economics of the checks.
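The barrier logic can be made concrete with a small back-of-the-envelope sketch. The numbers below are purely hypothetical assumptions, not diving statistics; the point is only that independent barriers multiply their effect while their costs add up.

# Illustrative sketch only: hypothetical figures, not empirical accident data.
# Each protective barrier catches a given error with some probability,
# but also costs time and attention.

barriers = [
    {"name": "buddy check",      "catch_rate": 0.70, "minutes": 5},
    {"name": "gas analysis",     "catch_rate": 0.90, "minutes": 3},
    {"name": "dive plan review", "catch_rate": 0.60, "minutes": 10},
]

# Probability that a latent error slips through every "hole in the cheese",
# assuming the barriers fail independently of each other.
p_slip = 1.0
for b in barriers:
    p_slip *= 1.0 - b["catch_rate"]

total_minutes = sum(b["minutes"] for b in barriers)

print(f"Chance an error passes all barriers: {p_slip:.1%}")  # 1.2%
print(f"Time invested in checks: {total_minutes} minutes")   # 18 minutes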

This is where most checklists fail in the operational environment: they are too long and therefore thoroughly unsuitable for effective and efficient use. A good checklist guides people through a complex problem together. In technical diving, a good checklist would guide the team through checking the diving system together. In practice, however, this is rarely implemented. The checklists the author is aware of usually contain only checkpoints so that nothing is forgotten. That is necessary, but not sufficient.
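To make the difference tangible, here is a hypothetical sketch contrasting a pure memory aid with a checklist that guides two divers through the system together. The items are illustrative and not taken from any training agency's standard.

# A memory aid merely lists things so that nothing is forgotten.
memory_aid = ["mask", "fins", "dive computer", "backup light"]

# A team checklist names joint actions that both divers confirm together.
team_checklist = [
    "Open and verify every valve together",
    "Analyse, label and cross-check every gas",
    "Walk through the dive plan, turn pressure and lost-buddy procedure",
]

def run_team_check(steps, confirmations):
    """Stop the preparation as soon as one step is not confirmed by both divers."""
    for step, confirmed in zip(steps, confirmations):
        print(f"- {step}: {'ok' if confirmed else 'NOT confirmed'}")
        if not confirmed:
            print("Stop: resolve the issue before anyone gets in the water.")
            return False
    print("Pre-dive check complete.")
    return True

# Example: the gas cross-check fails, so the check stops at step two.
run_team_check(team_checklist, [True, False, True])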

Error culture - on the way to psychological safety

To shape the error culture, it is essential to engage with it within the team and to use it as a learning opportunity. To this end, it is important to consider the following:

  • Error culture is always implicit; observed behaviour has to be addressed and discussed in person. This is important to know because the many discussions of (diving) accidents in online media are precisely what cannot be tied back to a specific, implicit culture. It only becomes meaningful if, for example, a team takes an accident report, discusses and evaluates it, and draws concrete consequences for itself.
  • Communities are the sources of latent error and latent safety: an error culture is tied to specific groups and teams and depends on the social context of the community. A high testosterone level and a high power distance will certainly make it difficult to discuss mistakes. Macho behaviour often prevents a positive error culture: who doesn't know the masculine specimens who turn bright red and seek their salvation in flight even when problematic behaviour is addressed diplomatically?

Addressing mistakes within the team requires:

  1. Courage: Strongly hierarchical environments in particular produce leaders (diving instructors) who do not readily let themselves be criticised by a subordinate (diving student).
  2. Reflexivity: Mistakes are always systemic and thus a product of conditions. If, for example, someone has observed an error and only criticises it afterwards, the question arises as to why this was not noted earlier.
  3. A suitable setting: Criticism needs an appropriate form; it needs time and the right moment. Particularly fundamental criticism will tend not to be voiced in a short debriefing, and the beer in the evening is thoroughly unsuitable for fact-heavy discussions about what actually happened. A good moment probably lies somewhere in between, in a less hectic situation.
  4. An open future: If partners depend on each other only for a moment, it is normal that particularly profound criticism stays unspoken. This changes when you want to keep making progress together. An open future is therefore necessary to increase the team's overall resilience.
  • Listening to the inner voice: This applies not only to personal misbehaviour against one's better judgement, but especially to peer pressure within the team or group. Incidentally, "mindfulness" is only achieved when this "voice" is carried jointly by the team.
  • Constantly create opportunities to raise awareness: Distribute information that puts error culture on the agenda in the first place and makes visible within the team who thinks what about which deviations in behaviour. Systems immunise themselves very quickly against safety issues, so it is necessary to keep looking for different formats through which exchange is possible, e.g. sometimes printed material, sometimes a workshop, sometimes a joint exercise.

Once these steps have been achieved and established in the team culture, what Harvard professor Amy Edmondson calls "psychological safety" is in place: the precondition that makes it possible to learn together and to work on the error culture. In this spirit: good luck in shaping the culture!

This article appears in an MQ series contributed by experts from the Risk Management Network. www.netzwerk-risikomanagement.ch
