I’ve come to believe that single-pilot IFR in a fully loaded glass cockpit, flown without the autopilot, can be the toughest IFR flying you can do.
The past month has found me in the clouds, with and without students, in a couple of different airframes that I hadn’t recently flown. That’s important because I wasn’t “in the groove” with known power settings and trim. There’s more load on the scan when the plane is less familiar, and that’s where glass shows its biggest weakness: visual-channel overload.
Hand-flying an aircraft requires a scan that soaks up somewhere north of 50 percent of your awareness. Straight and level in a craft you’re familiar with might only take 30 percent of your attention; a climbing turn in turbulence in an unfamiliar bird might take all of it. But overall, half to two-thirds of your brain needs to be doing flight-instrument scanning to fly with precision.
When the only other stuff to look at is a couple of VOR needles, a folded chart and, maybe, a DME distance, the remaining 50 percent is more than ample. The only time it becomes a problem is if you get really preoccupied with something on that chart, or the turbulence is so persistent it takes continuous corrections to stay upright.
But part of that is because much of the information in a legacy cockpit is auditory; it comes in on a different mental channel than the eyes. Weather via ASOS, ATC or Flight Watch arrives by listening. Idents come from sound. Even the old-school version of the moving map came from sound: We’d listen to radio traffic and picture where people were and what to expect. The approach clearance, “Five miles from CHENY, fly heading 260 …” came in through the ears and was constructed into a mental map. Now it’s a prompt to look at the GPS or MFD.
I’m not asking to give up my glass. On recent, consecutive days I’ve watched a couple of students catch and save what could have been disastrous errors (had they happened outside of training) by looking at their moving maps and GPS bearings. But I also think we’ve got to come up with some cockpit systems to shift some data from the eyes to the ears.
The Cirrus I sometimes fly calls out “500” when the TAWS system sees you 500 feet above touchdown, and the G1000 systems will call minimums for you on an approach. But how about something that reads that datalink weather to you? How about a smart GPS and approach system that sees you turning onto the intermediate leg and calls out the leg altitude and heading?
Make all the “Bitchin’ Betty” and “I’m sorry, Dave, I’m afraid I can’t do that” comments you want, but then think about the “voice of your CFI” getting you through a tough spot. There’s room for some on-demand voice inside those fancy ANR headsets. It’s a way to get data into our brains without overloading our eyes, or, in that CFI-voice way, to remind us where to look when we do get a moment. If I get to try some of the early attempts that are now in the works, I’ll let you know.
Until then, I’ll keep these overloaded eyes moving when I’m flying glass—and lean on the autopilot. —Jeff Van West