Back to the Future: Error Management

A year ago we published our article James Reason’s 12 Principles of Error Management.  It set out the 12 systemic, human-factors-centric principles of error management that James Reason, Professor Emeritus, University of Manchester, defined in his book Managing Maintenance Error: A Practical Guide (co-written with Alan Hobbs and published in 2003).

Recently we spotted a photograph of some text in a Tweet.  It is from Reason’s earlier 1997 classic Managing the Risks of Organizational Accidents.  While the mention of TQM looks rather dated now, sadly we do wonder how far we have collectively moved as an industry since then:

  • How many organisations still await occurrence or hazard reports from the front line rather than conducting active oversight or encouraging safety improvement action?
  • How many organisations spend more time analysing individual behaviour after an occurrence to determine culpability than analysing the factors that affect individual performance before an occurrence?

We certainly see regular accident reports that suggest we can do better:

  1. Fatal $16 Million Maintenance Errors
  2. Misassembled Anti-Torque Pedals Cause EC135 Accident
  3. The Missing Igniters: Fatigue & Management of Change Shortcomings
  4. A319 Double Cowling Loss and Fire – AAIB Report
  5. USAF RC-135V Rivet Joint Oxygen Fire
  6. Inadequately Secured Cargo Caused B747F Crash at Bagram, Afghanistan
  7. BA Changes Briefings, Simulator Training and Chart Provider After B747 Accident
  8. Gulfstream G-IV Take Off Accident & Human Factors
  9. Fatal G-IV Runway Excursion Accident in France – Lessons
  10. ‘Procedural Drift’: Lynx CFIT in Afghanistan
  11. Fatal Night-time UK AW139 Accident Highlights Business Aviation Safety Lessons
  12. Fatal Helicopter / Crane Collision – London Jan 2013
  13. Misloading Caused Fatal 2013 DHC-3 Accident
  14. Metro III Low-energy Rejected Landing and CFIT
  15. Operator & FAA Shortcomings in Alaskan B1900 Accident
  16. Culture + Non Compliance + Mechanical Failures = DC3 Accident
  17. Mid Air Collision Typhoon & Learjet 35
  18. Metro-North: Organisational Accidents
  19. DuPont Reputational Explosion
  20. Shell Moerdijk Explosion: “Failure to Learn”

Further, as a society, we still see human error being defined as a cause.

Of course we can take heart that many practitioners are making amazing strides in applying Reason’s 12 Principles, enhancing their organisation’s safety culture and looking at other ways to enhance human performance, as we discussed here.

A follow-up to the original book, entitled Organizational Accidents Revisited, is due to be published by Ashgate in January 2016 on the topic of what it abbreviates to ‘orgax’.  It is reported that:

Where the 1997 book focused largely upon the systemic factors underlying organizational accidents, this complementary follow-up goes beyond this to examine what can be done to improve the ‘error wisdom’ and risk awareness of those on the spot; they are often the last line of defence and so have the power to halt the accident trajectory before it can cause damage. The book concludes by advocating that system safety should require the integration of systemic factors (collective mindfulness) with individual mental skills (personal mindfulness).

Contents:

  • Introduction.
  • Part 1 Refreshers: The ‘anatomy’ of an organizational accident; Error-enforcing conditions.
  • Part 2 Additions Since 1997: Safety management systems; Resident pathogens; Ten case studies of organizational accidents; Foresight training; Alternative views; Retrospect and prospect; Taking stock; Heroic recoveries.
  • Index

The 10 case studies are: three from healthcare, two radiation releases, one rail accident, two hydrocarbon explosions and two air accidents.


Amy Edmondson discusses psychological safety and openness:

UPDATE 28 August 2016: We look at an EU research project that recently investigated the concepts of organisational safety intelligence (the safety information available) and executive safety wisdom (using that information to make safety decisions) by interviewing 16 senior industry executives:  Safety Intelligence & Safety Wisdom.  They defined these as:

  • Safety Intelligence: the various sources of quantitative information an organisation may use to identify and assess various threats.
  • Safety Wisdom: the judgement and decision-making of those in senior positions who must decide what to do to remain safe, and how they also use quantitative and qualitative information to support those decisions.

The topic of weak or ambiguous signals was discussed in this 2006 article: Facing Ambiguous Threats.  Also worth attention are a paper by the Health and Safety Laboratory, High Reliability Organisations [HROs] and Mindful Leadership, and a paper by Andrew Hopkins at the ANU.

UPDATE 16 February 2017: Aerossurance is delighted to be sponsoring an RAeS HFG:E conference at Cranfield University on 9 May 2017, on the topic of Staying Alert: Managing Fatigue in Maintenance.  This event will feature presentations and interactive workshop sessions.

UPDATE 1 March 2017: Safety Performance Listening and Learning – AEROSPACE March 2017

Organisations need to be confident that they are hearing all the safety concerns and observations of their workforce. They also need assurance that their safety decisions are being actioned. The RAeS Human Factors Group: Engineering (HFG:E) set out to find a way to check if organisations are truly listening and learning.

The result was a self-reflective approach to find ways to stimulate improvement.

UPDATE 22 March 2017: Which difference do you want to make through leadership? (a presentation based on the work of Jim Kouzes and Barry Posner).  Note slide 6 in particular:

Leaders inspire trust

UPDATE 25 March 2017: In a commentary on the NHS annual staff survey, trust is emphasised again:

Developing a culture where quality and improvement are central to an organisation’s strategy requires high levels of trust, and trust that issues can be raised and dealt with as an opportunity for improvement. There is no doubt that without this learning culture, with trust as a central behaviour, errors and incidents will only increase.

UPDATE 4 August 2017: The US Air Force plans to “significantly reduce unnecessary Air Force instructions over the next 24 months”.

Air Force Secretary Heather Wilson

Secretary of the Air Force Heather Wilson said that “the 1,300 official instructions are often outdated and inconsistent, breeding cynicism when Airmen feel they cannot possibly follow every written rule”.

The effort will start with the 40 percent of instructions that are out of date and those identified by Airmen as top priorities.

“The first step will target immediate rescission,” Wilson said. “We want to significantly reduce the number of publications, and make sure the remaining ones are current and relevant.”

The second phase will be a review of all other directive publications issued by Headquarters Air Force. These publications contain more than 130,000 compliance items at the wing level.  Publications should add value, set policy and describe best practices, she said.

Wilson emphasises trust: trust in the judgement, experience and training of airmen, rather than prescribing everything.

Think about that.  There are 130,000 ways a ‘culpability’ or ‘accountability’ decision aid could be used, counter-productively, to judge 320,000 service personnel and 140,000 civilians.

One wonders how many were created by a lack of trust or by practical drift (a concept discussed in Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq).  Despite self-serving nonsense pushed by some consultants who haven’t studied Snook, this practical drift is not about drifting from procedures as designed, but the continual addition of bureaucracy until the system becomes unworkable and a failure occurs.

Practical Drift (Credit: Col Scott Snook, US Army Retired)


Aerossurance is pleased to be supporting the annual Chartered Institute of Ergonomics & Human Factors’ (CIEHF) Human Factors in Aviation Safety Conference for the third year running.  This year the conference takes place 13 to 14 November 2017 at the Hilton London Gatwick Airport, UK, with the theme: How do we improve human performance in today’s aviation business?


Aerossurance is an Aberdeen based aviation consultancy.  For advice you can trust on practical and effective safety management and safety culture development, contact us at: enquiries@aerossurance.com