Psychology of Blame
In their 2007 book Yes! 50 Secrets from the Science of Persuasion, authors Goldstein, Martin and Cialdini discuss 2004 research by social scientists Charles Naquin and Terri Kurtzberg. Naquin and Kurtzberg tested how people reacted when a technical failure versus a human error was identified as the cause of an incident.
In one study they showed research subjects a fake newspaper article (based on a real incident) reporting a train collision that injured scores of people. Some participants were told a technical failure was the cause; others, a driver error. They repeated this approach during a real internet outage at their university, this time using questionnaires about the university IT department that stated the failure was believed to be due to either a computer fault or a human error.
In each case, the organisation involved (the train operator and the IT department) was considered more responsible when human error was thought to be the cause. The researchers suggest that the perception of a human error provokes more thoughts of how a failure could or should have been avoided than a technical fault does, and so the failure seems worse.
In one sense this is bad news for conscientious safety professionals. It suggests that human nature makes us more aggrieved and less forgiving when a human makes an error than when a technical failure occurs. However, technical failures are mostly linked to the specification, design or maintenance of the system, and to the associated human decisions and actions. So it also reinforces the perception that it is easier to blame frontline operators than the managers or designers. Their interpretation also suggests it is difficult to expect society to accept the concept of a just culture after a major accident. It may help explain, for example, the extreme charges and rapid trial in South Korea after the Sewol ferry disaster.
However, one positive could be that it shows that swift public blaming of your own employees (as occurred after Costa Concordia or in a recent Spanish rail disaster) can only make things worse for the organisation (and not just because of the harm to your own safety culture). This is a reason to avoid relying on so-called 'just culpability tools' and internal processes focused on judging frontline personnel rather than generating insight into necessary safety improvements.
You may also find these Aerossurance articles of interest:
- How To Develop Your Organisation’s Safety Culture
- James Reason’s 12 Principles of Error Management
- What Lies Beneath: The Scope of Safety Investigations
- Airworthiness Matters: Next Generation Maintenance Human Factors
- Aircraft Maintenance: Going for Gold?
- B1900D Emergency Landing: Maintenance Standards & Practices
- Meeting Your Waterloo: Competence Assessment and Remembering the Lessons of Past Accidents
- Safety Performance Listening and Learning – AEROSPACE March 2017
- Learning from Adverse Events: Includes nine principles for incorporating human factors into learning investigations.