Psychology of Blame

In their 2007 book Yes! 50 Secrets from the Science of Persuasion, authors Goldstein, Martin and Cialdini discuss 2004 research by social scientists Charles Naquin and Terri Kurtzberg. Naquin and Kurtzberg tested how people reacted when a technical failure or a human error was identified as the cause of an incident.

The Research

In one study they showed research subjects a fake newspaper article (based on a real incident) reporting on a train collision that injured scores of people.  Some participants were told a technical failure was the cause, others a driver error.  The researchers repeated this approach during a real internet outage at their university, this time using questionnaires about the university IT department that stated the failure was believed to be due to either a computer fault or a human error.

In each case, the organisation involved (the train operator or the IT department) was considered more responsible when human error was thought to be the cause.  The researchers suggest that the perception of a human error provokes more thoughts of how the failure could, and should, have been avoided than a technical fault does, and so the failure seems worse.


In one sense this is bad news for conscientious safety professionals.  It suggests that human nature makes us more aggrieved and less forgiving when a human makes an error than when a technical failure occurs.  However, technical failures are mostly linked to the specification, design or maintenance of the system, and so to the associated human decisions and actions.  The finding therefore also reinforces the perception that it is easier to blame frontline operators than managers or designers.  The researchers' interpretation also suggests it is difficult to expect society to accept the concept of a just culture after a major accident.  It may help explain, for example, the extreme charges and rapid trial in South Korea after the Sewol ferry disaster.

One positive, however, is that it shows that swiftly and publicly blaming your own employees (as occurred after the Costa Concordia grounding or a recent Spanish rail disaster) can only make things worse for the organisation (and not just because of the harm to your own safety culture).  It is also a reason to avoid reliance on so-called 'just culpability tools' and internal processes focused on judging frontline personnel rather than generating insight into necessary safety improvements.

Safety Resources

You may also find these Aerossurance articles of interest:


Aerossurance has extensive air safety, operations, SAR, airworthiness, human factors, aviation regulation and safety analysis experience.  For practical aviation advice you can trust, contact us at:

Follow us on LinkedIn for our latest updates.