March 2019 snapshots: Mine-detecting drones, holographic security labels, AI air traffic control, and market forecasts for the vision industry

March 22, 2019
In the March 2019 snapshots, a team of academic researchers pioneers a mine detection system that could save thousands of lives every year, an embedded vision system powers production of complex holographic labels for document security, AI software helps Heathrow Airport air traffic control cope with low-cloud conditions, and the AIA presents an Automation Market Update that signals a short-term slowdown for the vision industry.

Embedded vision system powers advanced holographic label production

Combustión Ingenieros (Bogota, Colombia; http://www.cihologramas.com/index.php/en/), a producer of holographic products such as labels, as well as manufacturing equipment, has utilized an embedded vision system from Vision Components (Ettlingen, Germany; https://www.vision-components.com) to develop Firefly, a microlithography system that allows users to upload image files and use those images as templates for producing hologram designs that can serve as masters for further production.

Holographic labels are used as an authentication and security measure for official documents like driver’s licenses, credit cards, or currency because it is very difficult to forge a hologram without possession of the master hologram. The ability to produce master copies of high-quality holographic labels therefore enables organizations to design their own levels of security for document protection. The more complex the hologram, the more difficult it is to forge. The Firefly microlithography system from Combustión Ingenieros allows users to design these high-quality master holograms.

The Firefly machine works by projecting a laser onto a photoresist plate, which is then chemically treated to develop the master hologram. This laser etching process requires precise positioning. For the hologram to have good contrast, the optics cannot lose focus, which requires rapid refocusing. Combustión Ingenieros achieves this in the Firefly by using a Vision Components embedded vision camera, the VCSBC nano Z 0011, to guide the laser with 50 µm accuracy while drawing patterns of 200 to 300 µm, and to refocus the Firefly’s optics every 20 ms.

The Vision Components system captures and processes the image and transmits the position data to a piezoelectric nanopositioning system via an RS232 interface. According to Combustión Ingenieros, this process takes only a few milliseconds.
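As a rough illustration of this step, the following Python sketch streams a position command to a positioning stage over a serial link using the pyserial library. The port name, baud rate, and "MOV x y" command syntax are assumptions for demonstration only and do not represent Combustión Ingenieros' actual protocol or code.

# Illustrative only: send a position command to a nanopositioning stage over RS232.
# Port name, baud rate, and the "MOV x y" command syntax are assumed for this sketch.
import serial  # pyserial

def send_position(port: serial.Serial, x_um: float, y_um: float) -> None:
    # Format the target coordinates (in micrometers) and write them to the serial link.
    command = f"MOV {x_um:.1f} {y_um:.1f}\r\n"
    port.write(command.encode("ascii"))

if __name__ == "__main__":
    # A short timeout keeps each update within a few milliseconds, as described above.
    with serial.Serial("/dev/ttyS0", baudrate=115200, timeout=0.005) as link:
        # In the real system these coordinates would come from the camera's on-board
        # image processing; fixed values stand in here.
        send_position(link, 1250.0, 830.5)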

The VCSBC nano Z 0011 features a single-board camera with a 1.3 MPixel CMOS image sensor from Teledyne e2v (Chelmsford, UK; www.teledyne-e2v.com) that achieves a frame rate of 63 fps. Additionally, the system has a 40 x 65 mm footprint and utilizes a Xilinx (San Jose, CA, USA; https://www.xilinx.com) Zynq dual-core Cortex-A9 ARM processor with a 2 x 866 MHz clock rate and integrated FPGA. The embedded system is equipped with 1 Gbit Ethernet, RS232 serial, and I²C interfaces, 12 inputs and outputs including an opto-isolated fast trigger input and flash-trigger output, and a battery-backed real-time clock.

The Firefly machine is controlled by a computer running the Linux operating system (OS), the VCSBC nano Z 0011 runs the VC Linux OS, and Combustión Ingenieros wrote custom Python code for the vision system. Pre-programmed functions in the VC Lib library of machine vision functions reportedly helped to streamline development for the OEM.

Trials conducted on AI platform designed to assist with air traffic control

NATS (Whiteley, UK; https://www.nats.aero), a company that provides air traffic control services in the UK, is conducting a trial at Heathrow Airport of new artificial intelligence (AI) software designed to assist with reduced visibility conditions that result in delays for passengers.

The control tower at Heathrow Airport is 87 meters tall, providing air traffic controllers an excellent view of the airport and surrounding area. However, the tower is tall enough to be engulfed by low cloud. In these conditions, air traffic controllers lose the ability to visually confirm when an aircraft has left the runway and must rely entirely on radar, which in turn necessitates giving each landing extra time to make sure the runway is clear. This reliance on radar alone results in a 20% loss of landing capacity and consequent delays for passengers.

The trial for the artificial intelligence-based tracking system designed to address this issue is taking place in the Digital Tower Laboratory, a NATS facility at Heathrow Airport that represents a £2.5 million investment by the company. The laboratory is focused on research into how technology can support air traffic operations.

“We’re delighted to be working with NATS to bring this pioneering technology to the UK’s only hub airport,” said Kathryn Leahy, Director of Operations at Heathrow. “Our capacity challenges are unique to our operation and we’re always exploring new and innovative techniques to help us overcome these constraints and improve the passenger experience in a safe and resilient manner.”

The system being tested consists of 20 ultra-HD cameras and an artificial intelligence-based platform called Aimee, developed by Searidge Technologies (Ottawa, ON, Canada; https://searidgetech.com). Data from the cameras, placed at ground level so they cannot be obscured by low cloud, is fed into the Aimee platform. The AI interprets the images to identify aircraft, tracks them across the camera network, and alerts an air traffic controller when an aircraft has left the runway. The controller can then confirm the information from the AI and clear the next arriving flight.
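As a purely illustrative sketch of the underlying idea, the Python snippet below flags when a tracked position sequence enters and then leaves a rectangular runway region. The coordinates, rectangular runway model, and alerting logic are assumptions for demonstration and do not describe Searidge's Aimee platform.

# Illustrative only: raise a "runway vacated" flag from a sequence of tracked positions.
# The rectangular runway model and the example coordinates are assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class Runway:
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        # True when a ground-plane position falls inside the runway rectangle.
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def runway_vacated(runway: Runway, track: list) -> bool:
    # Return True once a track that was on the runway is later observed off it.
    was_on_runway = False
    for x, y in track:
        if runway.contains(x, y):
            was_on_runway = True
        elif was_on_runway:
            return True
    return False

if __name__ == "__main__":
    runway = Runway(0.0, 3900.0, -25.0, 25.0)  # rough runway-sized rectangle, in meters
    track = [(3800.0, 0.0), (2000.0, 5.0), (500.0, 10.0), (200.0, 60.0)]  # last fix is off the runway
    print("Alert controller:", runway_vacated(runway, track))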

The current trial is non-operational and will determine whether the system will be deployed in 2019. The Aimee platform will track more than 50,000 aircraft during the trial, the AI’s accuracy will be measured, and the results will be presented to the Civil Aviation Authority. NATS believes the system will allow Heathrow to reclaim all capacity lost to low cloud conditions, and that this technology may someday control Heathrow’s third runway.

NATS and Searidge Technologies are also conducting research at Changi Airport in Singapore. The research involves the development of a “smart tower” prototype, with operational trials planned for 2019.

Automation Market Update from A3 Business Forum predicts short-term slowdown

The 2019 Automation Market Update delivered at the A3 Business Forum by Alex Shikany, Vice President of the AIA (Ann Arbor, MI, USA; https://www.visiononline.org), predicted slow growth in the robotics, vision, and motion control industries in the short term, with a positive long-term outlook owing to a rising tide of innovation and the multitude of industries increasingly making use of these technologies.

The performance of the semiconductor industry is closely tied to the performance of the robotics, vision, and motion control industries. According to the AIA’s data, toward the end of 2018 the Semiconductor Sector Index (SOX) dropped sharply and its year-over-year percentage change began decelerating toward zero.
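For readers unfamiliar with the metric, year-over-year percentage change simply compares an index value to its value twelve months earlier. The short Python example below uses made-up index values for illustration, not AIA or market data.

# Illustrative only: the year-over-year percentage change behind the SOX trend discussion.
# The index values below are invented for demonstration.
def yoy_change(current: float, year_ago: float) -> float:
    # Percentage change of an index versus its value twelve months earlier.
    return (current - year_ago) / year_ago * 100.0

# A still-positive but shrinking result is what "decelerating toward zero" describes.
print(f"{yoy_change(1200.0, 1150.0):.1f}% year-over-year")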

“That’s telling you that the e-brake has been pulled in the last month or two on this industry,” said Shikany.

Automotive OEMs are the only slice of the robotics industry that has seen a decline. Other industries combined have seen robotics orders grow 24% year-to-date. “We have never been this close in terms of percentage ratio [between automotive and non-automotive robot orders] in all of our years collecting data,” said Shikany.

While growth may not be as quick as in the past, the robotics industry is still forecast to continue growing steadily.

The pace of growth in the machine vision market is also slowing, with seven percent growth in 2018 and a predicted growth rate between three and five percent in 2019. The market remains vibrant, however. Shikany presented a long list of recent merger-and-acquisition activity in the machine vision market, as companies attempt to broaden the varieties of vision solutions they can offer their customers. Long-term sales predictions for computer vision revenue describe an exponential growth curve.

According to Shikany, among the industries in which vision innovations take place, the embedded and computer vision sector is the one to watch most closely. “This is a significant area of development within the vision and imaging community at large,” said Shikany. “Very low cost, very low power consumption hardware can achieve very good results with the proper software stack.”

The motion control industry has historically been more tempered and less volatile than robotics and vision, according to Shikany, owing in large part to motion control companies having diversified the industries they serve. Shikany saw 2018 as a growth year for the motion control industry. In keeping with robotics and vision, however, he expects a slowdown for motion control in 2019.

According to Shikany, the future for all these industries is collaborative automation, with humans working closely with, repairing, and programming automation systems in a growing number of industries. “This relationship between human beings and robotics, or human beings and machines, is only going to continue to become more important.”

Drones with infrared imaging effective at detecting mines

Five scientists from Binghamton University (Binghamton, NY, USA; https://www.binghamton.edu) have demonstrated the potential for thermal vision systems deployed on low-altitude unmanned aerial vehicles (UAVs) to detect landmines more quickly and efficiently than human-dependent methods.

Mines like the PFM-1 anti-personnel “butterfly” mine, deployed by the Soviet Union during its war in Afghanistan, often to block access to high-altitude mountain passes, are of particular concern for humanitarian groups removing unexploded landmines from former combat zones. The PFM-1 is made largely of polyethylene and is thus difficult to detect with traditional electromagnetic induction (EMI) methods such as metal detectors without returning false positives.

In 2017, the team from Binghamton University conducted a proof-of-concept study in Chenango Valley State Park, NY, USA. Eighteen inert PFM-1 mines were placed in a former parking lot, in four different orientations, to create a 10 x 20 m facsimile of the type of minefield created when the weapons were aerially dispersed in the environments in Afghanistan where they were often deployed.

A 3DR (Berkeley, CA, USA; https://3dr.com) Solo quadcopter search drone carried a FLIR (Wilsonville, OR, USA; www.flir.com) Vue Pro R 640 x 512 longwave infrared (LWIR) camera with a 13 mm focal length. Data from preprogrammed UAV flights was collected using the pix4Dcapture app on an Android tablet linked to a 3DR radio controller via Wi-Fi.

The images were pre-processed using ThermoViewer software to create 14-bit raster data sets. The data was then exported to pix4Dmapper software from Pix4D (Lausanne, Switzerland; www.pix4d.com) for photogrammetric processing. The thermal imaging data was supplemented with aerial photography taken with a DJI (Shenzhen, China; https://www.dji.com) Phantom 4 20 MPixel camera.

The LWIR photography easily detected the mines at and shortly after sunrise. In early morning flights, the mines appeared cooler than their surroundings because they do not retain heat as well as the natural objects around them. Once struck by sunlight, the mines heated much faster than objects in the environment. Temperature differentials became pronounced again at and after sundown. Mines were detected regardless of their orientation, and the researchers were able to identify all 18 inert mines with an average accuracy of 77.88% over multiple test flights.
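As a hedged sketch of how such temperature differentials might be flagged automatically, the Python example below compares each pixel of a LWIR raster to a local background estimate. The window size, threshold, and synthetic data are assumptions for illustration and do not reproduce the Binghamton team's actual processing chain.

# Illustrative only: flag thermal anomalies by comparing pixels to a local background.
# Window size, threshold, and the synthetic frame are assumptions for this sketch.
import numpy as np
from scipy.ndimage import uniform_filter

def thermal_anomalies(frame, window=31, threshold=150.0):
    # Estimate the local background with a mean filter, then flag large departures.
    background = uniform_filter(frame.astype(np.float64), size=window)
    return np.abs(frame - background) > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.normal(8000.0, 20.0, size=(512, 640))  # synthetic 14-bit-range scene
    frame[200:204, 300:304] += 400.0                   # synthetic warm target
    mask = thermal_anomalies(frame)
    print("anomalous pixels:", int(mask.sum()))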

This drone-based technique may not be able to fully replace human efforts. Used as a cooperative tool, however, the method can reveal mine presence, minefield orientation, and minefield overlap, data that could make demining less expensive by reducing the search area and improve the safety of those attempting to remove the mines.
