Vision technologies do more than help pilots see through darkness and land in fog. They show views of the surrounding terrain and alert pilots to obstacles that otherwise could not be seen. Video from live cameras and up-to-date databases, brought into the cockpit and displayed for pilots in a variety of ways, is increasing situational awareness and safety on the ground and in the air.
Real-time images are projected onto a head-up display (HUD) screen in front of the pilot, or onto a visor in front of the pilot’s eyes, in the case of head-mounted displays. While enhanced vision systems (EVS) provide a real-time video image of the surrounding terrain, synthetic vision systems (SVS) generate a synchronized rendering of the terrain from a three-dimensional (3-D) map database. When these two systems are brought together in a combined vision system (CVS), they provide unprecedented situational awareness in all phases of flight.
Vision technologies are mainly found on business jets, where U.S. Federal Aviation Administration (FAA) regulations have historically provided operational credits for enhanced vision, allowing pilots to land when weather is below authorized visibility and low visibility is the limiting factor. But pending FAA regulations will allow landing credits for more types of technology and at more airports, so their use is expected to grow.
More than 1,700 Kollsman EVS units are currently fielded in the air transport and business jet categories, on all Gulfstream and FedEx aircraft, according to Leanne Collazzo, vice president for commercial aviation at Elbit Systems of America. EVS, SVS and HUD systems from Rockwell Collins are certified on Embraer Legacy 450 and 500 mid-sized business jets, and SVS and EVS are integrated with HUD systems on Bombardier Global Express business jets, according to Peeter Sööt, principal marketing manager at Rockwell Collins.
Enhanced Vision Technologies
In enhanced vision, live video or images of an aircraft’s external surroundings come from digitally processed signals from cameras or sensors typically mounted outside, on the nose of the aircraft. Many types of sensors are available, but the most common use infrared (IR) wavelengths slightly longer than those of visible light. IR sensors capture thermal or heat signatures of the surroundings, allowing pilots to see things undetectable by the unaided eye.
When the video or images are presented on a multifunction display on the cockpit instrument panel, the system is called an EVS and doesn’t qualify for landing credit. When video is displayed on a HUD flight guidance system or other transparent display in the pilot’s outside field of view, it’s known as an enhanced flight vision system (EFVS), and it can qualify, resulting in lower landing minimums. The new FAA final rule permits operators to use an EFVS in lieu of natural vision not only to descend below the decision altitude/decision height or minimum descent altitude but also to continue descending from 100 ft above the touchdown zone elevation to the runway for landing.
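The descent logic the rule describes can be sketched in a few lines. This is a simplified illustration, not a certified avionics algorithm; the function name, parameters and the 200-ft default decision height are assumptions for the example.

```python
def may_continue_descent(height_above_tdz_ft: float,
                         efvs_image_usable: bool,
                         natural_vision_refs: bool,
                         efvs_to_touchdown_authorized: bool,
                         decision_height_ft: float = 200.0) -> bool:
    """Return True if descent may continue at the given height above
    the touchdown zone, under the simplified EFVS logic sketched here."""
    if height_above_tdz_ft >= decision_height_ft:
        # Above minimums: continue on instruments.
        return True
    if height_above_tdz_ft >= 100.0:
        # Between the decision height and 100 ft, EFVS imagery may
        # substitute for natural visual references.
        return efvs_image_usable or natural_vision_refs
    # Below 100 ft, natural vision was historically required; the new
    # rule lets authorized EFVS operators continue to the runway.
    return natural_vision_refs or (efvs_image_usable and
                                   efvs_to_touchdown_authorized)

# In fog at 150 ft with a usable EFVS image, descent may continue;
# at 50 ft without EFVS-to-touchdown authorization, it may not.
```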
EVS displays real-time surroundings as if in daylight, and depending on the environmental factors at the time, these systems can be used to see through fog, smoke, smog or haze. Collazzo says EVS is designed to assist pilots during landing and in low-visibility operations. Systems are tuned to pick up approach lights and other lighting.
Some EVS cameras use a single IR sensor to detect small temperature differences and other radiation outside the visible spectrum. Others fuse multiple images from separate multi-spectral detectors into one package, with each detector performing a different role. “Rockwell’s EVS-3000 camera uses three detectors: short wave IR sensors tuned to airfield lighting beacons in the 1–2 micron wavelength band, long wave IR [sensors] to see airfield environments at night, and low visible and near IR [sensors to] detect non-IR signatures from the airfield,” says Carlo Tiana, vision systems architect at Rockwell Collins.
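The multi-detector approach Tiana describes can be approximated, in grossly simplified form, as a weighted blend of co-registered single-band frames. The function and weights below are illustrative assumptions, not Rockwell’s actual processing pipeline.

```python
import numpy as np

def fuse_bands(swir, lwir, vis_nir, weights=(0.4, 0.4, 0.2)):
    """Blend three co-registered single-band frames (2-D float arrays,
    each normalized to [0, 1]) into one enhanced-vision image.
    Conceptually: SWIR picks out airfield lighting, LWIR the thermal
    scene at night, and the visible/near-IR channel non-IR signatures."""
    w_swir, w_lwir, w_vis = weights
    fused = w_swir * swir + w_lwir * lwir + w_vis * vis_nir
    # Clamp so the result stays a valid normalized image.
    return np.clip(fused, 0.0, 1.0)
```

A runway light that is bright in the SWIR channel still stands out in the fused frame even when the visible channel sees almost nothing.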
Synthetic Vision Technologies
SVS and synthetic vision guidance systems (SVGS) display artificially generated scenes of an aircraft’s exterior surroundings, rendered from 3-D mapping and other databases at the aircraft’s global positioning system (GPS) position. Because the views are database-driven, they are consistent regardless of the weather. Database updates are housed in cards that are placed in an aircraft’s avionics system, like the updates for terrain awareness and warning systems and similar systems. Because SVS depends on databases stored in the aircraft rather than information sensed in real time, regulations limit its use below 150 ft.
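Conceptually, an SVS render loop samples the stored terrain database at the aircraft’s GPS position on every frame. A minimal sketch of that lookup, assuming a regular latitude/longitude elevation grid and bilinear interpolation (real terrain databases are organized differently):

```python
import numpy as np

def terrain_elevation(grid, lat0, lon0, cell_deg, lat, lon):
    """Bilinearly interpolate elevation from a regular grid.
    grid[r, c] is elevation at (lat0 + r*cell_deg, lon0 + c*cell_deg);
    (lat, lon) must fall inside the grid."""
    r = (lat - lat0) / cell_deg
    c = (lon - lon0) / cell_deg
    r0, c0 = int(r), int(c)
    fr, fc = r - r0, c - c0
    # Interpolate along longitude on the two bracketing rows...
    top = grid[r0, c0] * (1 - fc) + grid[r0, c0 + 1] * fc
    bot = grid[r0 + 1, c0] * (1 - fc) + grid[r0 + 1, c0 + 1] * fc
    # ...then along latitude between the rows.
    return top * (1 - fr) + bot * fr
```

A weather-independent display follows directly: the rendered scene depends only on this stored data and the GPS fix, never on what a sensor can currently see.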
According to Thea Feyereisen, engineering fellow for Flight Safety Systems in Honeywell’s Advanced Technology Group, Honeywell was the first to certify synthetic vision in Gulfstream aircraft in 2008, building on its creation of proprietary terrain databases for its enhanced ground-proximity warning systems (EGPWS). Today’s evolving databases include higher resolution views of terrain, airports, heliports, obstacles in the flight path and oil rigs.
Database sources vary depending on the supplier. Rockwell, for example, collects databases from a variety of sources and updates them on a regular basis: terrain databases from U.S. National Aeronautics and Space Administration Earth surveys, airport runway databases from publicly available sources, and obstacle databases from aeronautical charts, Tiana says.
Rockwell’s SVS is driven by graphics generators in the avionics computers, which feed the same information shown on head-down displays into HUDs. Configuration of HUD guidance systems depends on the platform: Boeing 787s, for example, have overhead glass display units, while systems on 737s have their own computing hardware.
Technologies combining both EVS and SVS into a single display are referred to as CVS. Feyereisen says Honeywell’s CVS technologies can use EVS sensors from any manufacturer.
The original certifications for SVS addressed controlled flight into terrain (CFIT) accident prevention, Feyereisen says. “Over the last few years, the FAA Commercial Aviation Safety Team (CAST) studied accident trends and identified that synthetic vision provides enhanced aircraft state awareness. EGPWS became mandated on aircraft as a result of an FAA CAST committee looking into CFIT.”
The next largest human factors cause of commercial accidents was loss of control, involving either an aircraft’s attitude or its energy state. “Attitude accidents (excessive bank, excessive pitch) and energy accidents (low speed) happen when a pilot doesn’t have external visual references. Analysis identified lack of visual reference as the theme in 17 out of 18 accidents that occurred over a decade. Synthetic vision was identified to be the number one technology mitigation strategy for all of those cases,” she says.
“Since then, Airbus, Boeing and Bombardier have become a lot more interested in synthetic vision because it provides a potential lifesaving technology to the cockpit. It is much easier for pilots to detect disturbances in attitude, and enhanced symbology helps pilots with energy,” she explains. “As more aircraft use synthetic vision, it will take the ‘loss of control’ bar down.”
“EVS gives great pictures for night visual meteorological conditions in good weather and lowers the visual decision height from 200 ft to 100 ft before seeing the external runway envelope with the naked eye. Properly equipped SVS guidance systems can go to 150 ft, but not yet 100 ft. Merging the technologies with CVS gives the best of both worlds,” says Feyereisen.
Lower landing credits are only given for EFVS, not for synthetic images alone, Collazzo says. “The lower landing credits extend capability to begin an approach. Safety comes in with situational awareness. Nothing is safer than seeing,” she says, and that is Kollsman’s tagline.
Tiana says each piece — the EVS, SVS and HUD — provides situational awareness to a pilot looking out the window. “All together, the ultimate goal is to create as close to a day, clear, ideal VFR [visual flight rules] condition as possible,” he says. “Situational awareness from EVS is also a great tool for making ground operations safer, helping pilots see things at night.”
Providing more accurate information to the pilot during critical stages or complex approaches enhances safety in several ways. The pilot can better manage the surrounding environment, the traffic and the terrain, any of which may present a threat. The added information decreases the level of threat to the pilot, enhancing his or her ability to handle any event that may occur. Especially in bad weather or fog, under high stress, or during difficult or special approaches, the extra information on eyes-out technologies can also increase a pilot’s ability to perform.
“There are powerful motivations to use head-up displays during poor weather conditions,” says Tiana. “They allow pilots to fly using their eyes, with instinctive use of their senses used every day rather than looking down at instruments to interpret complex systems on the aircraft.”
Rockwell Collins’ Head-up Guidance Systems (HGS) display flight path vectors, energy management capabilities with acceleration and speed error, and symbology specific to touchdown that enables pilots to see how much runway remains. Some systems also show runway perspectives, increasing pilot awareness in different phases of flight. Flight path information helps ensure stabilized approaches. Other symbols show unusual attitude, wind shear, landing events and tail strike prevention cues, all of which improve safety during operation.
SVS provides awareness of the terrain and displays airport domes, symbols that outline the locations of the primary destination and alternate airports, making it easy for a pilot to see, from a safety standpoint, where the alternates are located. SVS also provides extended centerline information.
But Tiana says the operational benefits of lower approach minimums in low visibility also improve safety. “When flying heads up with HGS alone, pilots already have their eyes focused on the outside envelope. When combined with EVS, lowered decision heights increase the number of approaches that can be completed. Completing missions the first time and avoiding go-arounds is good for safety and the overall airspace,” he says.
“These technologies are useful in precision approaches, adding to the situational awareness of the sources to follow, but also in non-precision approach situations. The HGS provides the flight path information. Coupled with vision technology and daytime visual flight rules, a pilot would have the same situational awareness and ability to fly a stabilized approach if there is no outside guidance or signal,” Tiana says.
“Knowing where the plane is on the airfield or referencing its location on the runway with EVS on HUD could help prevent runway incursions and excursions, including excursions involving runway remaining,” says Tiana. “It could prevent aircraft from hitting a vehicle or animal in the runway that [wasn’t] seen.”
“SVS and EVS together in terrain [awareness and] warning systems could provide enough awareness to prevent any type of flight into terrain. And HGS provides symbology to prevent tail strikes and alerts about different conditions causing unstable approaches,” Tiana says.
EVS could especially help prevent CFIT accidents in which weather is a contributing factor. Collazzo says customers with aircraft involved in several accidents adopted it because of the improved situational awareness.
“One part of accident avoidance is making it easier for pilots to fly and see, and the other part is ensuring the aircraft is where it is supposed to be, from gate to gate or spot to spot, whether landing, taking off, or taxiing. Fire trucks need to find a plane and there are requirements for airline ground operations as to how quickly they must arrive. The ability for vehicles to move on the ground is important, and higher resolution, better SVS databases and sensor augmentation will help with positioning,” she says.
Feyereisen says improved symbology makes it easier for pilots to fly. “Merging SVS and EVS symbols with those used in traditional HUDs is an important part for safety. Many energy management accidents involving pilots losing awareness of their energy situation on final approach — too slow or too low — could be prevented by bringing the symbols together, a tremendous safety improvement to the operation,” she says.
Most Recent Developments
Cameras used in EVS have matured over the years. Most hardware and software advancements improve the camera’s sensitivity to light and its ability to resolve minute temperature differences.
The third-generation cameras used in current Elbit EVS have advanced image processing algorithms, more sensitive detectors, cryogenic cooling and highly transmissive sapphire windows heated to prevent icing, according to Collazzo.
Advancements in EVS camera technology that eliminate the need for cryogenic coolers on board improve reliability and operating efficiency. Rockwell recently obtained operational certification for its EVS, the first uncooled system in the industry. “The future is in uncooled systems,” Tiana says.
HUDs that fuse sensor data with guidance are becoming smaller, and according to Tiana, have the most potential for near-term advancement. The Embraer EFVS from Rockwell Collins is the first certified compact HUD to use waveguide technology without a separate projector unit so it can go into smaller cockpits.
The new TopMax head-worn display from Thales couples with EVS/SVS systems to give pilots unlimited field of view and live image refreshing as the head is turned. The pilot wears the small display in front of one eye, and it clamps onto any type of audio communication headset. The display uses optical, inertial gyro and accelerometer head tracking systems to precisely reference a pilot’s location inside the cockpit.
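Head tracking of this kind typically blends a fast but drifting inertial measurement with a slower absolute optical fix. A minimal sketch using a complementary filter, with assumed parameter names and gain (not Thales’ actual tracking algorithm):

```python
def fuse_heading(gyro_rate_dps, optical_heading_deg, prev_heading_deg,
                 dt, alpha=0.98):
    """One step of a complementary filter for head orientation.
    The gyro is integrated for fast response but drifts over time;
    the optical tracker is slower but absolute, so a small fraction
    of it continuously corrects the integrated estimate."""
    integrated = prev_heading_deg + gyro_rate_dps * dt
    return alpha * integrated + (1 - alpha) * optical_heading_deg
```

Run at the display’s frame rate, the estimate follows quick head turns via the gyro while the optical term slowly pulls out accumulated drift, which is what lets the symbology stay registered to the outside world as the pilot’s head moves.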
“Our example of sensor fusion displays primary flight information, surrounding 3-D traffic with flight number and distance to the aircraft, trajectories and terrain, depending on what databases, sensors and cameras are present in the aircraft,” says Richard Perrot, marketing director at Thales Avionics. “The biggest safety improvement introduced through this product is its ability for pilots to see critical flight information and surrounding terrain when looking in any direction, in bad weather conditions. Pilots can now look in any direction to monitor complex crosswind approaches off-axis, improving safety.”
“A new category of TFVS (Total Flight Vision Systems) may be required to encompass head-worn systems because they will eventually replace traditional HUDs. They are lightweight and have extremely small footprints,” says Perrot. “It is a breakthrough technology, reinventing safety for all pilots. It allows smaller planes access to benefits only available on larger commercial jets with sufficient head clearance for a HUD.” Although there are no aircraft operating with this technology today, Perrot says testing is under way, and he expects a launch to commercial markets in 2018.
Sensor fusion, merging information from SVS and EVS onto a single display system, is the biggest trend in the industry. Perrot says, “The combination captures more value, and brings it to the pilot for each stage of the flight, tracking, mapping, on the ground and in the air. Having all functionalities available, merged in a single system, gives better global safety by providing additional awareness.”
Collazzo agrees sensor fusion will help pilots, and future systems will help them see in any weather condition. “EVS works in fog, but it doesn’t help much in the thickest, marine-layer fog. Many manufacturers are focusing on fusing data from several different types of sensors working in different wave bands. Millimeter wave and other technologies to render images and systems with higher resolution and better abilities to penetrate different weather conditions will help in the future,” she says.
“EVS technologies that worked for many years in the military are finally becoming affordable, reliable, and viable for imaging in more weather conditions,” Tiana says. Feyereisen says there is considerable research activity in LIDAR (light detection and ranging technology, which uses laser light to measure distances), weather radar and other alternative technologies in the EVS industry. EVS manufacturers are also investigating using other types of sensors to help validate the navigational position and produce higher resolution signals.
Tiana says that EVS and SVS technologies “will move from the limited market of business jets more into airline and helicopter operations where safety needs are great. Helicopters in tricky search and rescue and medical evacuations could greatly benefit from adapted versions.”
As EVS and SVS technologies are enhanced and improved, lower minimum rulemaking is coming, Feyereisen says. “Currently there are no operational weather minimums for SVS for takeoff or landing, but it will soon come. … The next generation is about reducing investment in the ground infrastructure.”
Debbie Sniderman is CEO of VI Ventures, an engineering consulting company. She can be reached at email@example.com.
Featured image: © Rockwell Collins