Aircraft facing low-visibility conditions have traditionally been dependent on ground-based navigational aids to guide them to a safe landing. Even then, there were limits on the visibility conditions under which pilots were allowed to land.
Georgia Tech Research Institute (GTRI) engineers are investigating the use of millimeter-wave imaging radars that would allow aircraft crews to generate a pilot-perspective image of a runway area even in zero-visibility conditions and without ground support. Such a radar could be combined with other sensors to provide a sensor suite that could help aircraft land in virtually any condition.
"The Air Force wants to field an onboard system that allows aircraft to land in any type of weather condition, whether it be rain, fog, snow, a dust storm, day or night," says Byron Keel, a GTRI research scientist.
Called the Autonomous Approach and Landing Capability Program, the project is directed by the Air Force Research Laboratory at Wright-Patterson Air Force Base for the Air Mobility Command and is funded by the U.S. Transportation Command. GTRI is collaborating with BAE Systems, MMCOM Inc., Goleta Engineering and the Air Force Research Laboratory.
The U.S. Air Force is interested in autonomous-landing technology for several reasons. In Europe, where U.S. forces often prepare for deployment, dense fog conditions can prevent landings for days. Moreover, when U.S. planes land in primitive areas, they can face unpredictable landing conditions.
When a radar senses a runway environment, what a layman might call distance from the airfield is measured as range. Width is associated with azimuth, or cross-range, and height is associated with elevation.
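Under these conventions, a single radar return maps to a pilot-perspective position roughly as follows. This is a hypothetical illustration of the geometry, not code from the program:

```python
import math

def radar_to_cartesian(rng_m, azimuth_deg, elevation_deg):
    """Convert one radar measurement (range, azimuth, elevation) into
    local Cartesian coordinates: x down-range (ahead of the aircraft),
    y cross-range (width), z up (height)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = rng_m * math.cos(el) * math.cos(az)   # down-range
    y = rng_m * math.cos(el) * math.sin(az)   # cross-range
    z = rng_m * math.sin(el)                  # height above the radar's horizontal
    return x, y, z

# A return 1,000 m out, dead ahead, 3 degrees above boresight:
x, y, z = radar_to_cartesian(1000.0, 0.0, 3.0)
```

Without the elevation angle, only x and y can be recovered, which is the heart of the 2D system's limitation described below.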
About two years ago, GTRI began looking for radar systems with the potential to support low-visibility landings. The researchers found an experimental two-dimensional system that BAE Systems had developed in the 1990s. It measured azimuth and range using millimeter-wave technology at 94 GHz, a frequency at which radar can see effectively through fog and dust.
The 2D system, however, does not measure elevation, a potential shortcoming. Pilots need accurate elevation measurements that represent elevated structures such as towers, buildings or trees on or near the approach path. Instead, the 2D system assumes that all objects lie on a flat earth, and derives a pseudo elevation based on range and aircraft height above the airfield.
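The flat-earth assumption can be sketched in a few lines. The aircraft height, tower height and ranges below are hypothetical illustration values, not program parameters: every return at a given slant range is assigned the depression angle of the ground point at that range, which is how an elevated structure ends up "projected onto the ground."

```python
import math

def pseudo_depression_angle(slant_range_m, aircraft_height_m):
    """Flat-earth model: the angle below horizontal to a ground point
    at the given slant range, derived only from range and aircraft
    height above the airfield (asin(h / R))."""
    return math.degrees(math.asin(aircraft_height_m / slant_range_m))

h = 300.0  # assumed aircraft height above the airfield, metres

# The top of a 50 m tower at 2,000 m slant range really sits at a
# depression angle of asin((300 - 50) / 2000), but the flat-earth
# model assigns it asin(300 / 2000) -- the tower's height is lost.
true_angle = math.degrees(math.asin((h - 50.0) / 2000.0))
assumed_angle = pseudo_depression_angle(2000.0, h)
```

The roughly 1.4-degree discrepancy between the two angles is exactly the height information a pilot would need to avoid the tower.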
"If a pilot is coming in, it's hard for him to tell if there's a building in front of him," Keel says. "He has no real height information, because in a two-dimensional system that building is projected onto the ground."
In trying to measure both azimuth and elevation, researchers face the problem that an aircraft has a limited area in which to place an antenna. A radar's angular (i.e., azimuth or elevation) resolution depends on antenna size. Existing C-130 and C-17 transport aircraft have sufficient area to support the antenna's horizontal dimension, but are significantly limited in the vertical dimension. Even if sufficient area were available, scanning-rate requirements rule out a true pencil-beam approach.
To support elevation measurements, BAE Systems has developed a new approach that uses an interferometer to measure elevation. The company modified its experimental system, which had one transmit channel and one receive channel, and converted the single receiver channel into two receive channels.
A radar measures range by sending out a signal and measuring the time it takes for that signal to return from objects that it hits. The interferometer receives each return on both channels; because the channels are slightly separated, the signal's path to one channel is slightly longer than to the other. By comparing this difference between the two returns, the interferometer can estimate the elevation angle to objects in the runway area and along the glide slope.
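A minimal sketch of how a single-baseline interferometer recovers elevation, assuming a 5 cm vertical spacing between the two receive phase centers (an illustrative value, not the actual hardware geometry): the path-length difference d·sin(θ) between the channels appears as a phase difference, which can be inverted for the elevation angle θ.

```python
import math

C = 299_792_458.0
wavelength = C / 94e9   # ~3.2 mm at 94 GHz
baseline = 0.05         # assumed vertical spacing of the two receive channels, m

def phase_difference(elev_deg):
    """Phase difference (radians) between the two channels for a
    target at the given elevation angle: 2*pi * d * sin(theta) / lambda."""
    return 2 * math.pi * baseline * math.sin(math.radians(elev_deg)) / wavelength

def elevation_from_phase(dphi):
    """Invert the measurement for the elevation angle. (A real system
    must also resolve 2*pi phase ambiguities, ignored here for clarity.)"""
    return math.degrees(math.asin(dphi * wavelength / (2 * math.pi * baseline)))

theta = 2.0  # true elevation angle of a target, degrees
est = elevation_from_phase(phase_difference(theta))
```

The two receive channels thus substitute a phase comparison for the tall vertical aperture the aircraft cannot carry.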
GTRI has supported the Autonomous Approach and Landing Capability Program with extensive pretest analysis and test planning of BAE Systems' new 3D hardware. Keel took part in non-flight testing of the new hardware at Wright-Patterson in the winter and spring of 2005.
Initial test results were encouraging, Keel says. Still, he adds, researchers are busy enhancing the system with modifications to both the hardware and the image-processing algorithms. Flight tests of the radar's effectiveness in low-visibility landings are planned for the latter part of 2006.
In addition to a radar system, Keel says, a full-fledged Autonomous Approach and Landing system might include a forward-looking infrared system (FLIR); light detection and ranging (LIDAR), a form of a laser radar; and perhaps even a radiometer, which could measure the temperature of ground objects.
"It's really a suite of sensors that is being looked at," Keel says. "There's a larger program that the three-dimensional millimeter-wave radar system is feeding into."
The program is also considering whether synthetic aperture radar (SAR) could be useful to pilots landing in poor- or zero-visibility conditions. SAR is a method of generating high-resolution ground images and has already been used in such applications as earth-mapping and environmental monitoring.
Because radar resolution is limited by the small size of aircraft-based antennas, SAR gets around the size problem by using the aircraft's own motion to generate a synthetic aperture that functions like a much larger physical antenna.
An aircraft using SAR moves sideways or at an angle to the area it is imaging, unlike a real-beam radar, which is used during a straight-on approach. By moving at an angle with respect to the scene, a SAR gathers many image slices and assembles them into a high-resolution image, almost as if it were using a physically long antenna. Because this antenna-like effect yields high resolution, it could be used by approaching aircraft to make a detailed airfield image before landing.
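A back-of-envelope sketch of the resolution gain, using the standard approximations (real-beam cross-range cell ≈ λR/D; SAR cross-range resolution ≈ λR/2L). The range and aperture lengths below are illustrative assumptions, not program values:

```python
import math

C = 299_792_458.0
wavelength = C / 94e9   # ~3.2 mm at 94 GHz

def real_beam_footprint(range_m, antenna_m):
    """Cross-range cell of a real aperture: beamwidth * range = (lambda/D) * R."""
    return wavelength / antenna_m * range_m

def sar_resolution(range_m, synthetic_len_m):
    """Cross-range resolution of a synthetic aperture of length L,
    with the factor 2 from the two-way signal path: lambda * R / (2 * L)."""
    return wavelength * range_m / (2 * synthetic_len_m)

R = 5000.0                                 # assumed 5 km to the airfield
real_beam = real_beam_footprint(R, 1.0)    # 1 m physical antenna: ~16 m cells
sar_res = sar_resolution(R, 100.0)         # 100 m of flight path: ~8 cm cells
```

Under these assumed numbers the synthetic aperture improves cross-range resolution by roughly two orders of magnitude, which is what makes runway debris detectable at approach distances.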
"A SAR produces an image with fine resolution in both range and cross-range for the purpose of identifying a particular target or, in the case of a landing field, identifying debris or other objects on the runway that may pose a threat to a safe landing," Keel explains.