May 2019 snapshots: Machine learning system assists animal conservation efforts, image sensors for detecting exoplanets, fully-automated test processes for aircraft cockpits, and hyperspectral instruments to study climate change.

May 1, 2019

Lufthansa Technik develops fully-automated cockpit control test system

Lufthansa Technik (Hamburg, Germany; www.lufthansa-technik.com), a provider of aircraft maintenance, repair, overhaul, and modification services, has developed a fully-automated test procedure for cockpit electronics testing. The Robot Controlled Cockpit Electronics Testing (RoCCET) procedure uses a custom-designed robot to check the functionality of cockpit LED lights and switches, comparing results to standardized measurement data to ensure compliance with uniform standards and determine when replacements are needed.

Integrated sensors measure force when switches are activated, and cameras from IDS Imaging Development Systems (Obersulm, Germany; www.ids-imaging.com) and SVS-Vistek (Seefeld, Germany; www.svs-vistek.com) are used to capture display instruments, perform checks for damage, and measure display brightness from multiple angles. The RoCCET system utilizes Vision Development Module software from National Instruments (Austin, TX, USA; www.ni.com) to process camera data.

According to Florian Sell, Senior Engineer Automated Equipment Systems at Lufthansa Technik’s Aircraft Components Services division, the RoCCET test procedure will reduce testing times by one to two hours per component.

“At the same time, the new procedure provides concrete measurement data in accordance with uniform standards,” says Sell. “For example, we now have physical threshold values for the brightness of LEDs. And with the help of data mining, we can determine exactly when an LED has to be replaced.”
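The pass/fail logic Sell describes can be illustrated with a short sketch. Note that the threshold values, LED type names, and function below are hypothetical stand-ins; Lufthansa Technik has not published its actual criteria.

```python
# Illustrative sketch of threshold-based LED checking; the values and
# names here are assumptions, not Lufthansa Technik's actual standards.

# Minimum acceptable brightness (e.g. cd/m^2) per LED type -- assumed values.
BRIGHTNESS_THRESHOLDS = {"warning_led": 120.0, "status_led": 80.0}

def needs_replacement(led_type: str, measured_brightness: float) -> bool:
    """Flag an LED for replacement when its measured brightness falls
    below the standardized physical threshold for its type."""
    return measured_brightness < BRIGHTNESS_THRESHOLDS[led_type]

print(needs_replacement("warning_led", 95.0))  # dimmer than threshold -> True
print(needs_replacement("status_led", 90.0))   # within spec -> False
```

Logging such measurements over time is what enables the data-mining step Sell mentions: a downward brightness trend can predict replacement before the threshold is crossed.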

Lufthansa Technik has conducted research into automated aircraft maintenance at the Institute for Aircraft Production Technology at Hamburg University of Technology (Hamburg, Germany; www.tuhh.edu) and found past success with several robot-based inspection tools. AutoInspect, developed in 2015, is a robot that digitally inspects engine components for cracks using white light interferometry, processing 130 Gb of raw data with every scan of a combustion chamber. In 2016, the company created an automated process chain for engine component repair called AutoRep. Both systems, working in tandem, went into operative use in mid-2016, according to the company, and in mid-2018 AutoInspect and AutoRep were scheduled to be integrated into a single process chain.

The startup 3D.aero, a company founded to research and develop automation solutions for the aviation industry, is a joint venture between Lufthansa Technik and Pepperl+Fuchs (Mannheim, Germany; www.pepperl-fuchs.com), a developer of industrial sensor and process automation technology.

RoCCET is currently in the integration phase and will initially be used for cockpit controls on the Airbus (Leiden, Netherlands; www.airbus.com) A320 and A350, and Boeing (Chicago, IL, USA; www.boeing.com) 787 aircraft. Lufthansa Technik says that in the future, the RoCCET system may potentially be extended to other cockpit and cabin controls on all aircraft types.

Drone-based system uses machine learning and infrared cameras to successfully identify koala populations

A group of scientists from the Queensland University of Technology (Queensland, Australia; www.qut.edu) has developed a system to detect koala populations using algorithmic analysis of infrared footage captured by drone flights.

Koalas are listed as vulnerable, not endangered, in Australia. However, the eucalyptus forests that are the koalas’ primary habitat are threatened by the spread of agriculture and urban construction. Monitoring koala populations is therefore important for conservationists, but the task is difficult because koalas inhabit a wide area and live in environments covered by forest canopies. In their paper, “Automated detection of koalas using low-level aerial surveillance and machine learning,” published on March 1, 2019 (http://bit.ly/VSD-KOA), the scientists on the project report that only 60%-75% of koalas present within a survey area are detected through ground observation and photoimaging.

Enter the drones, infrared cameras, and machine learning software. For their study, the scientists used FLIR (Wilsonville, OR, USA; www.flir.com) Tau 2 640 thermal cameras with a resolution of 640 x 512, 13 mm focal length lens, and 9 Hz frame rate, mounted onto a DJI (Shenzhen, China; www.dji.com) Matrice 600 Pro drone equipped with the A3 Pro flight controller. ThermoViewer was used to process the drone footage, and the Faster-RCNN and YOLO object detection deep convolutional neural networks (DNNs) were used to detect the koalas.

“There are a number of factors around the drones/sensor, but as a university with a number of different applications the FLIR [camera] seemed most versatile, and the 600 Pro had good flight times and capacity to carry reasonable loads,” says Dr. Grant Hamilton, one of the authors of the study.

Each DNN was fed previously-captured thermal data from drone flights where the koala locations had been confirmed via radio tracking and manual inspection. This data allowed the DNNs to generate models by which to identify features in analyzed footage as potential koalas. The models were applied to test footage and the results were manually corrected via identifying the locations of koalas present in the footage but not detected by the models and by marking false detections as negative results.
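The manual correction step described above amounts to matching model detections against known koala locations: unmatched ground-truth animals are misses, and unmatched detections are false positives. A minimal, illustrative sketch of that scoring (not the study's actual code; the distance threshold and coordinates are assumptions) might look like:

```python
# Illustrative sketch of scoring detections against ground-truth koala
# locations. Coordinates and the matching distance are hypothetical.

def score_detections(detections, ground_truth, max_dist=10.0):
    """Greedily match detected (x, y) points to known koala locations.
    Returns (true_positives, false_positives, missed)."""
    unmatched = list(ground_truth)
    tp = 0
    for dx, dy in detections:
        for gt in unmatched:
            if ((dx - gt[0]) ** 2 + (dy - gt[1]) ** 2) ** 0.5 <= max_dist:
                unmatched.remove(gt)  # each ground-truth koala matches once
                tp += 1
                break
    fp = len(detections) - tp
    return tp, fp, len(unmatched)

truth = [(10, 10), (50, 60), (200, 120)]
found = [(12, 9), (300, 300), (199, 121)]
print(score_detections(found, truth))  # (2, 1, 1): 2 hits, 1 false alarm, 1 miss
```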

Eucalyptus forests in north and south Petrie Mill, Queensland were chosen as the experiment sites, with a known population of 48 koalas that were surveyed by a ground team on the same day as the drone flights would take place, to determine the validity of the identifications made by the DNNs. Eleven drone flights were conducted between February and August 2018.

The DNNs separately drew heat maps of potential koala locations and the data was compared to determine consistent results between the two networks. The ORB (Oriented FAST and Rotated BRIEF) algorithm developed by OpenCV Labs was used to assist in this process by accounting for change in images resulting from movement of the drone. If potential koala heat signatures were consistent over enough consecutive frames, the signatures were accepted as detections. The data was then manually reviewed.
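The frame-consistency rule described above can be sketched in a few lines. This is an illustrative simplification under assumed inputs (candidate IDs per aligned frame, a streak threshold), not the study's pipeline:

```python
# Minimal sketch of accepting a candidate heat signature only when it
# persists across enough consecutive frames. Inputs are illustrative.

def consistent_detections(frames, min_consecutive=3):
    """frames: list of sets of candidate IDs seen in each (aligned) frame.
    Returns IDs present in at least `min_consecutive` consecutive frames."""
    accepted = set()
    streak = {}
    for frame in frames:
        for cid in frame:
            streak[cid] = streak.get(cid, 0) + 1
            if streak[cid] >= min_consecutive:
                accepted.add(cid)
        # reset streaks for candidates absent from this frame
        for cid in list(streak):
            if cid not in frame:
                streak[cid] = 0
    return accepted

frames = [{"a", "b"}, {"a"}, {"a", "c"}, {"c"}, set()]
print(consistent_detections(frames))  # {'a'}: seen in three consecutive frames
```

In the real pipeline, ORB-based alignment is what allows a signature in one frame to be identified with the same spot in the next despite drone motion.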

The automated, DNN-driven detection method had an 87% overall probability of successfully identifying a koala from the infrared footage, whereas manual inspection of the thermal data resulted in a 63% probability. The automated system required an average of 136 minutes to process the thermal imaging and identify potential koalas. An average of 170 minutes was required to manually sift through the data and identify probable koala locations.

The scientists posit that, with appropriate training, the DNNs could be used to identify multiple types of animals from a single batch of thermal footage, and that applying the method to other types of animals would help validate the results of the koala test.

New telescope to search for Earth-like planets orbiting stars near the Sun

Researchers from the Astrobiology Center (Mitaka, Tokyo; http://abc-nins.jp/index_en.html), the University of Tokyo (Bunkyo, Tokyo; www.u-tokyo.ac.jp/en/index.html), and the Instituto de Astrofísica de Canarias (Canary Islands, Spain; www.iac.es) have developed MuSCAT2, a second-generation Multicolor Simultaneous Camera mounted on a 1.52 meter telescope, to study the atmospheres of exoplanets, or worlds beyond our solar system, with the hope of discovering Earth-like planets orbiting stars near the Sun.

The MuSCAT2, deployed at the Teide Observatory in Tenerife, Spain, utilizes four PIXIS back-illuminated 1024 x 1024 pixel CCD cameras, developed by Teledyne Princeton Instruments (Trenton, NJ, USA; www.princetoninstruments.com), to simultaneously image in the 400 to 550 nm, 550 to 700 nm, 700 to 820 nm, and 820 to 920 nm bands. One of the four cameras used is a PIXIS 1024B. The other three cameras are 1024B eXcelon models. Each camera is independently controlled by a PC, which allows observers to set different exposure times for each camera or to set synchronized exposures.

The MuSCAT2 uses three dichroic mirrors (filters that allow a specific range of colors to pass while reflecting others) from Asahi Spectra Co. (Tokyo, Japan; www.asahi-spectra.com) to separate light into four wavelength channels. The camera also utilizes bandpass filters developed by Astrodon Photometrics (Rancho Cordova, CA, USA; https://astrodon.com/) and a custom-ordered bandpass filter from Asahi Spectra Co. The three-color simultaneous imaging camera MuSCAT, deployed at the National Astronomical Observatory of Japan, was capable of probing atmospheres of super-Earth/mini-Neptune-sized planets orbiting a nearby M dwarf star. The researchers expect that MuSCAT2 should have similar or better capabilities.

MuSCAT2 should be particularly useful in collaboration with NASA’s Transiting Exoplanet Survey Satellite (TESS), launched in April 2018. Scientists discover exoplanets by observing planets as they pass in front of their host stars and block the light. From these transits, scientists can investigate the true mass, radius, density, atmosphere, and orbital obliquity (the angle between the planet’s rotational axis and orbital axis) of these planets.

Eclipsing binary stars, a pair of stars that orbit each other, also can block each other’s light during their orbits and thus create false positive results in the search for exoplanets. NASA predicts that the false positive rate for the TESS mission caused by eclipsing binaries will be between 30% and 70%, depending on the direction in which the stars are observed.

A four-color simultaneous telescope like MuSCAT2 can help detect these false positive results by observing the change in color as an object passes in front of a star. For a binary star system, the color of the light coming from the system changes while it dims. When an exoplanet passes in front of a star, however, the color remains the same as it dims.
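The discrimination idea above can be sketched numerically: a planetary transit produces roughly the same fractional dimming (transit depth) in every band, while an eclipsing binary's depth varies with color. The band names, depths, and tolerance below are illustrative assumptions, not MuSCAT2 pipeline values:

```python
# Hedged sketch of multicolor false-positive vetting; all numbers are
# illustrative, not actual MuSCAT2 measurements.

def looks_like_planet(depths_by_band, tolerance=0.1):
    """depths_by_band: fractional transit depth measured in each band.
    Returns True when depths agree to within `tolerance` (relative),
    i.e. the dimming is achromatic, as expected for a planet."""
    depths = list(depths_by_band.values())
    mean = sum(depths) / len(depths)
    return all(abs(d - mean) / mean <= tolerance for d in depths)

planet = {"g": 0.0101, "r": 0.0100, "i": 0.0099, "z": 0.0100}
binary = {"g": 0.0150, "r": 0.0120, "i": 0.0095, "z": 0.0080}
print(looks_like_planet(planet))  # True: depth is the same in every color
print(looks_like_planet(binary))  # False: depth changes with color
```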

The MuSCAT2 went into service on the night of August 23, 2017. Science operations began in January 2018 and included 250 telescope nights that year. MuSCAT2 is expected to serve more than 162 nights per year until 2022. In addition to supporting TESS operations, MuSCAT2 is also well suited to support the European Space Agency’s PLAnetary Transits and Oscillations of stars (PLATO) mission, planned for launch around 2026.

Hyperspectral instrument to observe Earth for PRISMA mission

For the Agenzia Spaziale Italiana’s (Rome, Italy; www.asi.it/en) PRISMA mission to be a success, the namesake satellite required hyperspectral imagers, and the designer of the hyperspectral imaging instrument in turn required custom-designed infrared detectors to complete the task. Infrared imaging equipment developer Sofradir (Palaiseau, France; www.sofradir.com) was happy to oblige.

PRISMA, or PRecursore IperSpettrale della Missione Applicativa (Hyperspectral Precursor of the Application Mission), is an Earth-observation satellite intended to cover Europe and the Mediterranean region, and a platform for demonstrating new technologies. The five-year observation mission is designed to derive information about land cover, agricultural landscape, pollution, the quality of inland waters, status of coastal zones, soil mixture, and the carbon cycle. The satellite may also have national security applications.

The PRISMA satellite, engineered by OHB Italia (Milano, Italy; www.cgspace.it/), a subsidiary of space and technology group OHB SE (Bremen, Germany; www.ohb.de/de/), is equipped with electro-optical instrumentation that combines a hyperspectral sensor with a medium-resolution panchromatic camera.

Leonardo (Rome, Italy; www.leonardocompany.com), prime contractor for PRISMA’s hyperspectral imaging instrument, solar panels, and power supply unit, turned to Sofradir to design a pair of infrared detectors that would serve as the heart of the hyperspectral instrument, each built around the Saturn model infrared detector.

The first is a Saturn 1000 x 256 resolution shortwave infrared (SWIR) detector, with 30 µm pitch and a usable spectral band of 0.9 to 2.5 µm. The second is a Saturn 1000 x 256 VISIR (visible - near infrared) detector with 30 µm pitch and a usable spectral band of 0.4 to 1.1 µm. The detectors are cooled via thermal link to a cold space-facing radiator, a system that Sofradir developed specifically for the PRISMA satellite. The completed hyperspectral instrument can detect 239 hyperspectral bands, each less than 12 nanometers wide, in the SWIR and visible range (400-2500 nm).

The PRISMA satellite was launched on the Vega rocket from the Guiana Space Center in Kourou, French Guiana, on March 22, 2019. A three-month testing phase will precede the beginning of operational activity in June 2019. The satellite will orbit the Earth at an altitude of 615 km and take up to 223 images per day, with each image encompassing 30 x 30 km.
