Affiliation:
1. San Jose State University, San Jose, CA, USA
Abstract
While there is value in thinking about forms of automation failure, the primary role of human factors is to specify how the interface conveys automation state and behavior in two situations: when the automation fails, and when the automation “performs as designed” but a safety event still results. How do we make automation an effective crewmember? I describe what the flightcrew should, ideally, understand about autoflight state and behavior, using aviation examples to illustrate cases where that information was not successfully conveyed, leaving the flightcrew with an incomplete or incorrect understanding. Specifically, the interface needs to address the automation's state, its current targets or objectives, its limitations in achieving those targets, whether it is approaching an operational boundary, the validity of its data, broader checks on what is operationally reasonable, and how the crew can intervene.