Dual-camera vision system designed to allow automatic landings at small airfields

RGB and thermal infrared cameras working in concert may be able to replace expensive radio navigation equipment that small airfields can't afford.

Diamond DA42 C2Land test aircraft

A team of researchers from the Technical University of Munich (TUM) and the Technical University of Braunschweig (TUBS) has devised a way for aircraft to potentially make automatic landings at small airfields without the ground-based systems normally present only at large, commercial airports.

An instrument landing system (ILS) uses radio signals from ground-based beacons to provide lateral and vertical guidance to pilots. These measurements are essential to make automatic landings possible. Automatic landings are therefore impossible at smaller airfields that do not have an expensive ILS installed.

The new "C2Land" system, developed by TUM and TUBS in partnership with the German federal government and described in the research paper titled "Linear Blend: Data Fusion in the Image Domain for Image-based Aircraft Positioning during Landing Approach," pairs a camera in the normal visible range with an infrared camera and image-processing software that determines the aircraft's position relative to the runway, effectively replacing the information an ILS provides.

A forward-looking InfraTec VarioCam HD620 thermal infrared (TIR) camera with 640 x 480 resolution and a 14 mm lens, which streams 24-bit floating-point temperature measurements, and a Photonfocus MV1-D1312C RGB camera with 1280 x 1024 resolution and a 9 mm wide-angle lens are mounted underneath the front section of the fuselage of the test aircraft: a Dornier Do 128-6 and a Diamond DA42.
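The resolutions and focal lengths above determine each camera's angular field of view, which governs how early and how precisely a runway can be detected. A minimal sketch of that pinhole-model calculation follows; the pixel pitches are hypothetical assumptions for illustration, as the article does not state them.

```python
import math

def fov_deg(resolution_px: int, pixel_pitch_um: float, focal_length_mm: float) -> float:
    """Full angular field of view along one sensor axis (pinhole model)."""
    sensor_size_mm = resolution_px * pixel_pitch_um / 1000.0
    return math.degrees(2 * math.atan(sensor_size_mm / (2 * focal_length_mm)))

# Horizontal FOV of the RGB camera (1280 px wide, 9 mm lens),
# assuming a hypothetical 8 um pixel pitch (not stated in the article).
rgb_hfov = fov_deg(1280, 8.0, 9.0)

# Horizontal FOV of the TIR camera (640 px wide, 14 mm lens),
# assuming a hypothetical 17 um microbolometer pitch (also an assumption).
tir_hfov = fov_deg(640, 17.0, 14.0)

print(f"RGB HFOV ~ {rgb_hfov:.1f} deg, TIR HFOV ~ {tir_hfov:.1f} deg")
```

Under these assumed pitches, the wide-angle RGB lens sees a noticeably broader swath than the longer TIR lens, which is consistent with using the TIR camera for a tighter view of the runway area.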

The basic principle of the C2Land system is to feed the RGB and TIR images to a PC running computer vision algorithms for runway detection and positioning. The images are individually synchronized with Global Navigation Satellite System (GNSS) based timing information, which allows precise monitoring of the GPS/Space-Based Augmentation System (SBAS) solutions generated by the aircraft in flight.
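The synchronization step amounts to matching each RGB frame with the TIR frame closest to it in GNSS time. A simplified sketch under assumed frame rates and tolerance (the article gives neither) might look like this:

```python
from bisect import bisect_left

def pair_frames(rgb_stamps, tir_stamps, max_skew_s=0.02):
    """Match each RGB frame to the nearest-in-time TIR frame using
    GNSS-derived timestamps; drop pairs whose skew exceeds max_skew_s.
    The tolerance and timestamp layout are illustrative assumptions."""
    pairs = []
    for i, t in enumerate(rgb_stamps):
        j = bisect_left(tir_stamps, t)
        candidates = [k for k in (j - 1, j) if 0 <= k < len(tir_stamps)]
        best = min(candidates, key=lambda k: abs(tir_stamps[k] - t))
        if abs(tir_stamps[best] - t) <= max_skew_s:
            pairs.append((i, best))
    return pairs

# Hypothetical streams: RGB at ~20 Hz, TIR at ~30 Hz
rgb = [0.00, 0.05, 0.10, 0.15]
tir = [0.00, 0.033, 0.066, 0.10, 0.133]
print(pair_frames(rgb, tir))
```

Pairing by a common GNSS time base is what lets vision-derived positions be compared fairly against the GPS/SBAS solution computed at the same instant.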

When the runway is detected, the distance to the runway is calculated and this 3D data is fed to a navigation server that processes GPS/SBAS signals and calculates a landing solution. The C2Land system then monitors that landing solution. If the discrepancy between the two positioning solutions is too high, the system warns the pilot, aborts the landing, and automatically increases the aircraft's altitude to set up for another landing approach.

Because the C2Land system uses optical runway detection, it could also potentially serve to monitor and verify ILS installations, as long as the C2Land system can maintain visual contact with the runway.

Testing at night or in severely foggy weather has not yet been possible due to flight restrictions for the test aircraft. However, the researchers believe that the infrared camera, with a restricted region of interest (ROI) around the runway, could be used to highlight small temperature differences between the runway and the surrounding soil via a temperature mapping algorithm the researchers created. This could allow the C2Land system to function during low-visibility conditions.
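The idea above can be illustrated with a toy thresholding rule over a thermal ROI: flag pixels whose temperature departs from the ROI mean as candidate runway surface. This is only a sketch of the principle; the researchers' actual temperature-mapping algorithm is not described here in enough detail to reproduce, and the sample temperatures, margin, and the assumption that asphalt runs cooler than soil are all hypothetical.

```python
def segment_runway(roi_temps_c, margin_c=0.5):
    """Flag ROI pixels more than margin_c below the ROI mean temperature
    as candidate runway pixels. Toy rule illustrating how a small
    runway/soil temperature difference could be exploited."""
    flat = [t for row in roi_temps_c for t in row]
    mean_t = sum(flat) / len(flat)
    return [[t < mean_t - margin_c for t in row] for row in roi_temps_c]

# Hypothetical 3x4 ROI (deg C): asphalt slightly cooler than soil
roi = [
    [21.0, 21.1, 19.2, 19.3],
    [21.2, 19.1, 19.0, 19.2],
    [21.1, 21.0, 19.1, 21.2],
]
mask = segment_runway(roi)
print(mask)
```

The same logic also suggests why night operation is harder: once the runway/soil temperature difference evens out, the contrast this rule depends on disappears.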

At night, however, when temperature differences between runway and soil would have evened out, using the C2Land system would be more challenging, say the researchers.

Related stories:

LiDAR-based vehicle detection and classification system operates in free-flowing traffic

Cornell University study suggests stereo camera systems could replace LiDAR

Russian project to develop fleet of self-driving delivery trucks is underway

Share your vision-related news by contacting Dennis Scimeca, Associate Editor, Vision Systems Design

