AVweb’s Glenn Pew interviewed Embry-Riddle professor and former Northwest captain Jack Panosian in a podcast entitled “Avionics — Good Pilots Not Required?”. It’s an inflammatory title, no doubt to encourage people to dive for that “play” button. Obviously it worked, because I listened to the whole thing.
Panosian has an impressive resume: 20 years at Northwest, 5 years at ERAU, and a Juris Doctor degree as well. Nevertheless, while I agreed with some of what he said, certain portions of his thesis seemed way off base.
I’ll summarize his points:
- automation used to monitor human pilots, but today it’s the other way around: we monitor the computers, and we’re not very good at it
- computers are good monitors; they do it the same way every time, with the same level of diligence
- stick & rudder skills are less important than avionics management skills, and we need to teach with that in mind
The first two points may be correct (I’ll get to the third one later), but computers don’t “monitor”; they simply execute programming. There’s a big difference. It’s true that when people monitor the same thing over and over again, they cannot maintain the same level of vigilance indefinitely. But when humans monitor something, they’re capable of doing so with thoughtful, reasoned analysis. Humans can think outside the box. They can adapt and prioritize based on what’s actually happening rather than being limited by their programming.
Computers are not capable of that. Remember, system failures are not always covered by the aircraft operating procedures or training, and that’s why safe flight still requires human input and oversight. We are also capable of putting more focus on our monitoring during critical phases of flight. For example, I watch airspeed and flight path with much greater attention during approach than I typically will during cruise.
It’s also worth considering that, despite all the automation, humans still manually perform the takeoff, landing, and taxi phases, and still hand-fly the airplane when the computers get confused or take the day off. These are the areas where most accidents happen. Air France 447 stalled in the flight levels and remained stalled all the way down to the ocean. Colgan 3407 was another stall accident. Asiana 214 was a visual approach gone wrong. Better manual flying skill might very well have made the difference in at least some of these accidents.
Glenn Pew asked, “How much of flying the airplane is flying the avionics?”, and Panosian replied that “the greatest innovation was the moving map”, giving an example of synthetic vision showing terrain at night. In my experience, a moving map is no guarantee of situational awareness. I’ve trained many pilots to fly VFR and IFR in glass panel Cirruses, DiamondStars, experimentals, and so on. I can’t tell you how many of them had no idea where they were, even with a 10″ full color moving map directly in front of them. When asked the simple question, “Where are we right now?”, you’d be surprised how many have a tough time coming up with an answer.
Does that seem odd to you? It shouldn’t. Situational awareness is not about the map in front of your eyes; it’s about the moving map inside your head. If you want evidence of that, look at the 2007 CFIT crash of CAP Flight 2793, a C-182T Skylane that was flown into high terrain near Las Vegas. The flight was crewed by two highly experienced pilots who were familiar with the area, had a G1000 panel in front of them, and still managed to fly into Mt. Potosi.
Panosian made the point that the Airbus was designed to be flown on autopilot “all the time — it was not designed to be flown by hand. It was designed so that it’s a hassle to be flown by hand”. Some business jets have similar characteristics, and who would want to hand fly the airplane straight and level for hours on end anyway? The light GA arena has an equivalent as well: the Cirrus SR20 and SR22. I actually enjoy hand flying them, but the airplane has a somewhat artificial feel due to the springs in the flight control system. It was purposefully designed to fly long distances on autopilot, and it’s very good at that mission. It’s well equipped with safety gear: TAWS, traffic, CAPS, a solid autopilot, good avionics… and yet the Cirrus’s accident rate is no better than average.
I don’t believe the answer is to make the pilot a better manager of automation. That alone will not stop CFIT, stall/spin, weather, or takeoff and landing accidents.
“The good news is that we have a generation of pilots who have grown up with this technology, these tablets, etc., and they grab hold of these things better than the older pilot who was trained on the round dials. That’s a good thing, because now you’re just molding them into the aviation world and this is how you’ll operate the aircraft.”
I’m a big proponent of glass panels, tablets, and technology. They’re great. But they do not make one a good pilot. If you want a better pilot, start primary students off in a tailwheel airplane and ensure they know how to fly before doing anything else. Everything should flow out of that. I wouldn’t expect this to be a revolutionary idea, but perhaps it is.
“You are not going to be hired because of your stick and rudder skills. You will be hired because of your management skills.”
A good aviator needs both sets of skills. Management ability is important, but no more so than stick-and-rudder capability. If you can’t physically fly the airplane through every phase of flight, you don’t belong in the cockpit, because an equipment failure during one of those phases can leave the aircraft without anyone capable of safely operating it. Pilots who can’t proficiently hand-fly are passengers. Console operators. Button pushers. System monitors (dog not included). But they’re not pilots.
“In other words, can you manage all these systems, can you manage the information you’re getting and make sure that the airplane is doing what it’s supposed to do? The fact of the matter is that we’ve seen this in other industries. It’s hardly unique to the airline industry. A robot can do a better job of welding than a human. An autopilot has many more sensors than a human hand does. They can be done better and safer than a human being, but they must be monitored properly. That’s where the training comes in. We have to change from the stick & rudder skills to the manager skills. That’s what we’re trying to do.”
The problem with his comparison is that flying an airplane is not like welding. Welding does not require you to manage the energy state of a large chunk of metal hurtling through the air while maintaining situational awareness, staying ahead of the aircraft mentally, and adjusting for countless variables ranging from weather to traffic to equipment failures to controllers, often all at the same time and at the end of a long work day. Doing all those things does constitute “management”, but I don’t think it’s the kind Mr. Panosian is referring to.
And as far as the autopilot is concerned, it’s extraordinarily simplistic to compare a full autopilot system to a single human hand. What about the rest of the body? What about the vestibular labyrinthine system and the resulting equilibrioception? There’s proprioception, thermoception, etc. (Look ’em up — I had to!) And that’s to say nothing of our senses of sight, hearing, touch, and smell. We use them when we fly, often without even being consciously aware of it. How many times have you noticed a subtle vibration from a prop or engine, the sound of a leaking seal around a door, the sense of something just not being quite right?
Autopilots do some things better than a human. Automation is helpful and absolutely has its place. But it is no substitute for a flesh-and-blood pilot who knows how to fly the machine.
What say you, readers?
This article first appeared on the AOPA Opinion Leaders blog at http://blog.aopa.org/opinionleaders/2013/09/11/stick-and-rudder/.
Ron, you write the way I think. In fact if I was to blog on this subject it would be difficult to tell whether you wrote it or if I did.
I couldn’t agree more. Automation could be great for the airlines, but they have a long way to go before it’s safer than a human pilot. In a normal flight regime, it’s easier and more efficient and more consistent to let some sensors attached to some chips attached to some servos keep the plane right-side up. The problem is that the automation technology is created by fallible humans. Like almost all human mechanical creations, they require time to evolve and “debug.” Unfortunately, in aviation this process costs lives. And always has. You are young, and I suspect like many young people, you have the basic feeling that somebody, somewhere “has got it covered.” The fact is that you’re tooling along in thin air with nothing but your “faith” between you and death. More management skills training? Absolutely. More stick and rudder skills training? You better believe it! And I might add, more people such as yourself thinking and writing about this subject. Only through public discourse (and a bit of inspiration) can important issues like this be safely resolved. Thanks, as always, for your insight and efforts. –Bob
Thanks, Bob. I’ve started to see quite a few people writing about this stuff. Incidentally, I just happened to catch up on back issues of the NASA Callback newsletter, and issue #404 for September 2013 was focused on the topic of automation problems in the cockpit.
Ron, Thanks for posting the link. Some interesting but not surprising reading. Maybe we need another layer of machines that monitor the auto-flying machines, which humans can then monitor. Or we could go back to human pilots, but with machine monitors that can administer a low voltage electric shock when pilots get sleepy, or distracted, or complacent, or chatty, or cocky, or stupid, or… human…
I can just hear a synthesized, gentle male voice in the cockpit saying: Dave… Dave… you have exceeded your allotted oral-metric units. Expect unit restoration in zero two zero minutes… continued speech is not authorized… repeat, speech is not authorized at this time… Dave…
GA is now in the same position the airlines were in back in the ’80s, when automation started taking over the cockpit. I was weaned on round-dial airplanes in both the airlines and the military, but had to transition to flat panels, flight guidance, and FMS, as did most airline folks. The learning curve was steep, and we soon learned that full-up automation is fine when all is normal. As abnormal situations occurred, turning off portions of the automation was often the prudent response. An extreme example of getting rid of the automation entirely would be the response to a GPWS warning. Now, as an FAA flight examiner who very often gives check rides in G1000-equipped aircraft, I certainly watch for a lack of basic stick and rudder skills and an over-dependency on the automation. Yes, even at the Private Pilot level.