The ‘Automation Problem’ – A Discussion
Automation, or perhaps more correctly the use of automated systems, has become a hot topic in the aviation industry, particularly since the release of the accident report into the loss of AF447 in July 2012. We review one analysis of the underlying cause and discuss a warning about an emerging threat.
Aerossurance has covered the outcome of investigations into fixed-wing accidents such as the Asiana OZ214 B777 Accident at SFO 6 July 2013 and other serious incidents involving Boeing 737s, Embraer ERJ 170s and Airbus A320s.
Aerossurance has also discussed a Royal Aeronautical Society Rotorcraft Group conference in July 2014 on the introduction of automation to offshore helicopters, provocatively titled Technology: Friend or Foe? The use of helicopter automation is only likely to increase (see: Bell 525 Fly-By-Wire Update).
We also covered the release of a series of Crew Resource Management (CRM) videos from the UK Civil Aviation Authority (CAA), including one that reconstructed an actual incident where autopilot mode confusion during glideslope capture resulted in a breakdown in collective situation awareness and a loss of control.
The ‘Automation Problem’: by Captain Ed Pooley
The 20th issue of Eurocontrol’s Hindsight magazine focused on automation matters. Among the series of articles and case studies, we particularly recommend an article on The ‘Automation Problem’ by Ed Pooley. Pooley was Head of Safety for a large UK regional airline and is now a consultant and Chief Validation Adviser for SKYbrary.
Pooley comments that high levels of automation have had two main effects:
Pilots’ Knowledge of both their automated systems and the way they interact with how aircraft fly, however they are controlled, is often insufficient to cope with abnormal events unless these are resolved by straightforward checklist compliance.
The extent and nature of the Decision Making which is required to operate a highly automated aeroplane today is quite different from that required to fly most similar-sized aeroplanes thirty years ago.
Cockpit monitoring (a topic discussed by Aerossurance in August) and compliance with standard operating procedures are powerful controls that can mitigate the risk of a range of accident types. Pooley however contends that these only treat the symptoms of the real automation problem, stating:
The focus needs to be placed firmly on effective knowledge-based decision making.
Pooley goes on to analyse the Air France AF447 A330 and Asiana OZ214 777 accidents. He also examines two more positive outcomes (the Qantas QF32 A380 and the less well known Cathay Pacific A330 serious incident where both engines started to malfunction after fuel contamination).
He concludes, perhaps in places a little controversially:
…whilst the way automation is delivered in aircraft design can always be improved, the root of the automation problem we are seeing today does not lie primarily – as many human factors experts will tell you – in system design. Rather, it lies in ensuring that people with the right aptitude and ability are trained as pilots in the first place. And that they are thereafter provided with type and recurrent training which is compatible with a job which now typically has very long periods of automated routine punctuated only very rarely by the challenge of something completely unexpected.
Even with the very best selection processes, a successful outcome to any path through training is not a guaranteed one. There is a very heavy responsibility on all aircraft operators to ensure that they do not release pilots to line flying duties until there is solid evidence that all aspects of their professional competence have been clearly demonstrated to be compatible with their role.
…on the evidence available, the industry as a whole and the regulatory system in particular can reasonably be characterised as having been sleepwalking towards the situation we are now in.
There has been a failure to realise that the undoubted safety benefits of automation needed a lot more attention to pilot qualification and pilot training than we have seen in all but a relatively few enlightened operators.
While studies indicate that automation has had a positive effect on safety (as detailed in this Aerossurance article: Airbus Report: Commercial Aviation Accidents 1958-2013 – A Statistical Analysis), Pooley warns that:
The consequences of the transition to automation have so far been masked by the broader experience which older pilots, especially those in command, have had.
It remains to be seen however if newer generation pilots, with appropriate training, will actually cope better with highly automated systems.
We found this to be a highly thought-provoking article and welcome your views on LinkedIn.
Aerossurance is pleased to sponsor the Royal Aeronautical Society Rotorcraft Group 2016 conference on the important and highly topical subject of ‘Automation & Offshore Operations’, to be held 6-7 July 2016 at the RAeS HQ at 4 Hamilton Place, London.
UPDATE 18 September 2016: AAIB: Human Factors and the Identification of Flight Control Malfunctions
UPDATE 29 December 2016: CRJ-200 LOC-I Sweden 6 Jan 2016: SHK Investigation Results
UPDATE 9 January 2017: HeliOffshore have released an Automation Guidance document and six videos to demonstrate the offshore helicopter industry’s recommended practice for the use of automation.
UPDATE 18 February 2018: Autopilot, Mind Wandering (MW), and the Out Of The Loop (OOTL) Performance Problem. According to researchers from ONERA and CNRS:
The OOTL phenomenon has been involved in many accidents in safety-critical industries, as demonstrated by papers and reports that we have reviewed. In the near future, the massive use of automation in everyday systems will reinforce this problem. MW may be closely related to OOTL—both involve removal from the task at hand, perception drop, and understanding problems. More importantly, their relation to vigilance decrement and working memory could be the heart of their interactions. Still, the exact causal link remains to be demonstrated. Far from being anecdotal, such a link would allow OOTL research to use theoretical and experimental understanding accumulated on MW. The large range of MW markers could be used to detect OOTL situations and help us to understand the underlying dynamics. On the other hand, designing systems capable of detecting and countering MW might highlight the reason why we all mind wander. Eventually, the expected outcome is a model of OOTL–MW interactions which could be integrated into autonomous systems.
The designer’s view…may be that the operator is unreliable and inefficient… so should be eliminated from the system. There are two ironies of this attitude.
One is that designer errors can be a major source of operating problems…
The second irony is that the designer who tries to eliminate the operator still leaves the operator to do the tasks which the designer cannot think how to automate…it means that the operator can be left with an arbitrary collection of tasks, and little thought may have been given to providing support for them.
This is discussed further in this 2012 paper: The ironies of automation … still going strong at 30?
UPDATE 16 December 2019: