Human Factors and Automation (Pilot/Computer Interface)

The proliferation of control modes

Many researchers believe that a very real problem in today's transports is the pilot/computer interface. One issue that is causing considerable uneasiness is the proliferation of control modes in the newest systems, particularly when the airplane is under the control of the autopilot and the flight crew is monitoring the performance of the autopilot and the airplane. Unfortunately, the training pilots receive in this area has often been less than optimum. A training factor in the real world is that if all of the available modes are covered in training, then the pilots are responsible for demonstrating knowledge and competence in all of those modes during the checkride. (Orlady & Orlady, 1999)

Autopilot mode confusion

The increase in automation, and the pilot monitoring of automated systems that it implies, has human factors ramifications and will continue to have them. Some researchers believe that further automation would have an adverse effect on the effectiveness of pilot monitoring, partly because humans are not good monitors of rare events and because monitoring can be a boring job, especially on a long-haul flight. In some cases pilots have wanted to disengage just part of the automation and keep using the remaining features, but are unable to do so because 'all or nothing' are the only options. This is probably a design problem: the programmers did not know all of the intricacies of real-world aircraft operation, and because there are numerous vertical modes, it is difficult at times for pilots to understand how the modes are functioning. It is therefore important to have pilots involved in automation design. (Orlady & Orlady, 1999)

Complacency or Overreliance on Automation

A very real problem with the nearly complete automation now present is pilot complacency and overreliance upon automation. This pilot response occurs in normal operations and is also reflected in the pilot's reliance on the system to automatically make the correct response during abnormal operations. Flight crews tend to rely upon the automation to the point that the normal checks inherent in good manual operations are sometimes disregarded (Orlady & Orlady, 1999). To overcome this problem, the design of automation requires the following:

  • To command effectively, the human operator must be involved. It matters little whether the pilot completes his/her task by controlling the aircraft directly or simply manages the other human or machine resources being used. Human factors experts have known for years that even very highly skilled personnel do not do a good job of monitoring for events that have a very low probability of occurrence. Those who maximize the use of technologically possible automation often miss this very critical point.
  • In order to be involved, the human operator must be informed. The information available has to include all of the data necessary to keep the pilot actively involved in the operation, including the information required to keep the pilot fully informed regarding the state, progress, and intention of the system. Otherwise the human operator cannot hope to be meaningfully involved. A very important human factors issue is determining the form, time, and manner in which information is presented.
  • The pilot must be able to monitor the automated system. Automated systems are fallible, and pilots are the last line of defense capable of controlling, or in some cases preventing, a system failure. Pilots must be able to monitor the system effectively and know how the system is planning to accomplish its task.
  • Automated systems must be predictable. An automated system can only be monitored effectively if it is predictable. Pilots must be trained for the normal operation of each automated unit as well as its behaviour during any failure modes, so they can make manual corrections or stop an automation failure from going further.
  • The automated systems must be able to monitor the human operator, and the human must be able to monitor the automatics. This emphasizes two real problems. First, humans are fallible and are not perfect monitors because of human limitations. Second, even the highly capable computers available today can fail partially or completely and cannot anticipate all of the circumstances that might be encountered in line operations. Therefore, the performance of the computers and of the human operators must be monitored by each other. For example, the computers should be able to send warning signals when the human operator has made an error, and at the same time, when the automation is making incorrect decisions, the human needs to be aware of it (a minimal sketch of this cross-monitoring idea appears at the end of this section).
  • Each element of the system must have knowledge of the others' intent. A basic principle of the cross-monitoring needed to achieve maximum safety is that it can only be effective if the monitor knows what the system is trying to accomplish. This principle also requires good communication between the pilot flying and the pilot not flying, for it is virtually impossible to be sure of intent without effective communication.

(Orlady & Orlady, 1999)

All of the above considerations are incorporated into the principles of human-centered automation design.
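
As a rough illustration of the cross-monitoring principle above, the following Python sketch shows automation flagging a suspect operator entry while the crew compares the automation's active mode with the mode they expect. It is a minimal sketch under invented assumptions; the names, limits, and mode strings are not drawn from any real avionics interface.

# Hypothetical sketch only: mutual cross-monitoring between crew and automation.
# All names and limits below (AltitudeEntry, the 45,000 ft ceiling, the mode
# strings) are invented for illustration and do not describe real avionics.

from dataclasses import dataclass

@dataclass
class AltitudeEntry:
    cleared_ft: int   # altitude the crew believes ATC has cleared
    selected_ft: int  # altitude actually dialled into the autopilot panel

def automation_monitors_pilot(entry: AltitudeEntry) -> list[str]:
    """Automation-side check: flag a likely operator entry error."""
    warnings = []
    if entry.selected_ft != entry.cleared_ft:
        warnings.append(f"Selected altitude {entry.selected_ft} ft differs from "
                        f"cleared altitude {entry.cleared_ft} ft")
    if entry.selected_ft % 100 != 0 or not 0 <= entry.selected_ft <= 45000:
        warnings.append("Selected altitude is implausible for this aircraft")
    return warnings

def pilot_monitors_automation(active_mode: str, expected_mode: str) -> list[str]:
    """Pilot-side check: the automation must expose what it is doing so the
    crew can compare it with what they expect it to do."""
    if active_mode == expected_mode:
        return []
    return [f"Automation is in {active_mode}, crew expected {expected_mode}"]

# A mis-set altitude and an unexpected vertical mode both raise flags.
print(automation_monitors_pilot(AltitudeEntry(cleared_ft=10000, selected_ft=1000)))
print(pilot_monitors_automation(active_mode="VS", expected_mode="VNAV PATH"))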

Automation & The ‘Error’ Dilemma

A dilemma faced by the designer of an air transport aircraft is that both the human and the automated machine make errors. Neither the human component nor the machine is fully reliable with regard to reducing accidents and incidents in air transport, because both err. When dealing with the issue of automation, it is wrong to assume that error is a separate issue; automation itself makes some errors more probable, because automation is not perfect. In this century, air transport operations have increased to the point that the aviation system cannot tolerate rare, inadvertent, or even random operational errors without controlling their operational consequences (Orlady & Orlady, 1999).

On occasion, the automated systems and the computers that control them make mistakes, or the aircraft encounters a situation that the designer of the automated system failed to predict. The only resource that can deal with such an occurrence is the human component, namely the pilot and the air traffic controller. This makes the human component the ultimate back-up. When an unforeseen mistake occurs, the logical response would seem to be to design a system that is 'error resistant', so that it is not possible for the human component to make an error with the system or, where automation is present, so that the automation cannot make a mistake. However, this is not possible, as no system can be error free (Orlady & Orlady, 1999).

Dealing with the problem of no system being error free

  • To deal with this problem, the automated system should be designed so that it is 'error tolerant'. An error-tolerant system allows inadvertent errors to be detected and alleviated so that they do not become critical (a brief sketch of this idea follows this list).
  • Simplifying the system: a simpler system allows unintentional errors to be detected and the appropriate action to be taken to correct them.
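
As a rough illustration of what 'error tolerant' can mean in practice, the following Python sketch checks a crew entry for plausibility and queries it rather than silently accepting it. The function, its limits, and the fuel figures are assumptions invented for this example.

# Hypothetical sketch only: an "error tolerant" entry check that detects an
# inadvertent slip and queries it instead of silently accepting it. The
# function name, limits, and fuel figures are invented for this example.

def accept_fuel_entry(entered_kg: float, planned_burn_kg: float) -> tuple[bool, str]:
    """Return (accepted, message) for a crew fuel-on-board entry."""
    if entered_kg <= 0:
        return False, "Fuel on board must be positive - re-enter"
    if entered_kg < planned_burn_kg:
        # The slip is tolerated at the keystroke level, but it is trapped
        # before it can propagate into the rest of the flight plan.
        return False, (f"Entered fuel {entered_kg} kg is less than planned burn "
                       f"{planned_burn_kg} kg - confirm or correct the entry")
    return True, "Entry accepted"

print(accept_fuel_entry(entered_kg=1800, planned_burn_kg=12500))   # probable slip (18000 mistyped)
print(accept_fuel_entry(entered_kg=18000, planned_burn_kg=12500))  # plausible entry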

Dealing with error due to operator entry into the system

The safety issue that arises when the pilot or controller makes rare entry errors can be reduced by having two pilots in the cockpit who are well trained to monitor the automatics as well as each other's operational performance during flight. This monitoring of both the system's performance and the pilots' performance is further improved by the automated warning systems in the cockpit (Orlady & Orlady, 1999).

Automated Warning Systems

Today's aircraft have three types of automated warning systems that aid the pilot as well as ATC:

1. Those that monitor the aircraft itself. For example: the hydraulic system, pressurization system, electrical system, and engine fuel/oil conditions (Orlady & Orlady, 1999).

2. Those that monitor environmental threats to the safety of the flight. For example: Ground Proximity Warning Systems (GPWSs), Enhanced Ground Proximity Warning Systems (EGPWSs), Wind Shear Avoidance Systems (WSASs), Traffic Alert and Collision Avoidance Systems (TCASs), and Minimum Safe Altitude Warnings (MSAWs) (Orlady & Orlady, 1999).

3. Those that ensure the aircraft is properly configured for its current or upcoming phase of flight (sketched below). For example:

Landing gear warnings, which sound when the gear is not down and the throttles are closed for landing.

Flap position warnings, which sound when the flaps are not in position for takeoff.
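
The configuration warnings above amount to simple condition checks on aircraft state. The following Python sketch illustrates that logic under simplified, invented assumptions; the fields and trigger conditions are not taken from any actual warning system.

# Hypothetical sketch only: the configuration warnings described above reduce
# to condition checks on aircraft state. The AircraftState fields and trigger
# conditions are simplified assumptions, not any real aircraft's logic.

from dataclasses import dataclass

@dataclass
class AircraftState:
    gear_down: bool
    flaps_takeoff_set: bool
    throttles_closed: bool
    on_ground: bool
    takeoff_power_set: bool

def configuration_warnings(state: AircraftState) -> list[str]:
    warnings = []
    # Landing gear warning: gear not down while the throttles are closed for landing.
    if state.throttles_closed and not state.on_ground and not state.gear_down:
        warnings.append("GEAR NOT DOWN")
    # Flap warning: flaps not set for takeoff when takeoff power is applied on the ground.
    if state.on_ground and state.takeoff_power_set and not state.flaps_takeoff_set:
        warnings.append("FLAPS NOT SET FOR TAKEOFF")
    return warnings

# An approach flown with the gear still up triggers the gear warning.
print(configuration_warnings(AircraftState(
    gear_down=False, flaps_takeoff_set=False,
    throttles_closed=True, on_ground=False, takeoff_power_set=False)))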

Automation, Training, and Manual Skills

One of the great myths associated with increased automation is that automation has reduced, or will reduce, training requirements. This is simply not true: automation has created training requirements that add to the previous ones. The skills and knowledge needed to take full advantage of increased automation must be added to the training curriculum. There is no doubt that automation makes flight safer and more efficient; however, that does not mean today's pilots do not need all the old skills and knowledge. In fact, they need more. Manual skills must be a part of any recurrent or transition training and checking program, in addition to the emphasis given to the proper use of the automatics. (Orlady & Orlady, 1999)

There are many reasons for skill deterioration, including such factors as scheduling or even motivation, but in most cases of manual skill deterioration the problem is simply a lack of practice. For instance, a major international airline has reported that its long-range fleet schedules permit only one and one-half takeoffs and landings per month per pilot. And sometimes crews simply let the automatics make the approach and landing because it is easier than doing it manually.

For all of the above reasons, the human flight crew is, and will remain, ultimately responsible for flight safety and efficiency, and this role will not change regardless of how advanced the automation becomes; automation systems are only tools designed and used by human operators to complete their tasks. At the same time, human operators are also responsible for the operation of all the automation systems they use. Automation will therefore not fly the airplane instead of pilots; that is neither a feasible nor an acceptable possibility in the foreseeable future.

References
Orlady, H., & Orlady, L. (1999). Human Factors in Multi-Crew Flight Operations. Brookfield: Ashgate Publishing Ltd.

Want to know more?

Human Factors and Automation


Contributors to this page

Authors / Editors

Aimee_Z
Melanie Attan
