One of the most discussed topics every year at the SPIE Defense, Security, and Sensing (DSS) Conference is image fusion: the ability to merge images captured at different wavelengths into a single image. This year was no exception, with many companies demonstrating technologies and products that move the Defense Advanced Research Projects Agency (DARPA; Arlington, VA, USA; www.darpa.mil) goal of an integrated visible/near-infrared/SWIR (VNS) and LWIR sensor closer to reality.
One of the major reasons to develop such a device, according to DARPA, is to eliminate the need for army personnel to carry both visible and IR-based optical weapons systems into battle. Because a separate camera for each spectral band adds size and weight to these systems, DARPA wants a single-sensor system with integrated VNS/LWIR optics.
To achieve this, DARPA wants a dual-mode detector ensemble (DUDE) using a stacked four-color VNS/LWIR focal plane that pairs a 2048 × 1536 VNS array with 8.5-μm pixels and a 1024 × 768-pixel LWIR detector with 17-μm pixels (see http://bit.ly/9jX3q7).
According to David Dawes, manager of DoD Business Development at Sensors Unlimited - Goodrich ISR Systems (Princeton, NJ, USA;www.sensorsinc.com), the contract to develop this device has been awarded to a consortium of DRS Infrared Technologies (DRS; Parsippany, NJ, USA; www.drsinfrared.com), Goodrich, and Duke University (Durham, NC, USA; www.duke.edu).
While the development of this kind of device is expected to take at least five years, companies at this year's SPIE show were already demonstrating the benefits of image-fusion techniques using multiple-detector solutions.
For its part, Goodrich showed a system dubbed Hinted SWIR, which uses the company’s 640 × 512-pixel SU640KTSX InGaAs 30-frame/s solid-state camera in conjunction with a 320 × 240-pixel E3500S VOx microbolometer-based 60-frame/s camera from DRS. “Since the format and the frame rate of each camera are different,” says Dawes, “images from both must be synchronized and merged to form a single coherent image.”
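Dawes does not detail how the two streams are synchronized, but one common approach to matching a 30-frame/s camera against a 60-frame/s camera is to pair each slower-camera frame with the nearest-in-time frame from the faster camera. A minimal sketch (the function name and timestamps are illustrative, not Goodrich's implementation):

```python
import numpy as np

def nearest_frame_pairs(slow_ts, fast_ts):
    """Pair each slow-camera timestamp (e.g. 30-fps SWIR) with the
    index of the closest fast-camera timestamp (e.g. 60-fps LWIR)."""
    fast_ts = np.asarray(fast_ts, dtype=float)
    return [int(np.abs(fast_ts - t).argmin()) for t in slow_ts]

# Example: three 30-fps frames against six 60-fps frames
swir_times = [0.0, 1 / 30, 2 / 30]
lwir_times = [i / 60 for i in range(6)]
pairs = nearest_frame_pairs(swir_times, lwir_times)  # every other LWIR frame
```

In practice the pairing would be driven by hardware timestamps or a common trigger, but the selection logic is the same.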
To accomplish this, the Hinted SWIR system uses an ARC-1000 board from Equinox Sensors (New York, NY, USA; www.equinoxsensors.com) to provide nonlinear co-registration of images from both cameras. While the SWIR sensor provides background detail, the thermal sensor supplies hints that highlight features of interest in the scene. In a demonstration at DSS, Goodrich showed an urban night scene captured with both SWIR and LWIR imagers. Using the Equinox ARC-1000 board, blending the SWIR and LWIR images produces a composite image that shows "thermal hints" within the SWIR image (see Fig. 1).
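The ARC-1000's co-registration and blending algorithms are proprietary, but the general "thermal hint" idea can be sketched: upscale the lower-resolution LWIR frame to the SWIR format, then blend the LWIR signal into the SWIR image only where the thermal response is strong. A minimal numpy illustration (the function names, threshold, and blend weight are assumptions, not Equinox's method):

```python
import numpy as np

def resize_nearest(img, shape):
    """Nearest-neighbor upscale, e.g. a 240x320 LWIR frame to 512x640."""
    rows = (np.arange(shape[0]) * img.shape[0] / shape[0]).astype(int)
    cols = (np.arange(shape[1]) * img.shape[1] / shape[1]).astype(int)
    return img[rows][:, cols]

def thermal_hint_fuse(swir, lwir, threshold=0.6, alpha=0.5):
    """Blend 'thermal hints' into a SWIR background image.

    swir: 2-D float array, values in [0, 1]
    lwir: smaller 2-D float array, values in [0, 1]
    """
    lwir_up = resize_nearest(lwir, swir.shape)
    hot = lwir_up > threshold                  # pixels warm enough to hint
    fused = swir.copy()
    fused[hot] = (1 - alpha) * swir[hot] + alpha * lwir_up[hot]
    return fused
```

A real system would interpolate more smoothly and co-register the images geometrically before blending, but the hint-where-hot principle is the same.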
“One of the major difficulties in using such dual camera systems for image fusion,” says Michael Piacentino, technical director of vision systems at Sarnoff (Princeton, NJ, USA; www.sarnoff.com), “is that parallax errors become more noticeable at short range.” This parallax error can be overcome either electronically or optically.
Taking the electronic route, both the latest DVP-4000 video processor board from Equinox and Sarnoff's Acadia II vision processor use digital image warping to correct for parallax error and optical distortion between the two cameras.
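Neither processor's warping algorithm is published, but digital image warping generically means remapping one camera's pixels through a geometric transform, often a 3 × 3 homography, so they align with the other camera's view. A minimal inverse-mapping sketch with nearest-neighbor sampling (a generic illustration, not either vendor's implementation):

```python
import numpy as np

def warp_homography(img, H):
    """Warp a 2-D image by homography H (3x3) using inverse mapping:
    each output pixel looks up its source location through H's inverse."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = np.linalg.inv(H) @ coords
    sx = (src[0] / src[2]).round().astype(int)   # source column per pixel
    sy = (src[1] / src[2]).round().astype(int)   # source row per pixel
    out = np.zeros_like(img)
    valid = (sx >= 0) & (sx < w) & (sy >= 0) & (sy < h)
    out_flat = out.reshape(-1)
    out_flat[valid] = img[sy[valid], sx[valid]]  # off-image samples stay 0
    return out
```

Production hardware would estimate H from calibration targets and use sub-pixel interpolation; the remapping structure, however, is standard.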
At the SPIE show, Sarnoff also showed an optical fusion approach to this problem that can use dual or multiple visible, SWIR, MWIR, or LWIR detectors. Called the Optically Electronically Fused Camera (OEFC), the unit uses a combination of both optical and electronic techniques to merge two or more images.
To demonstrate the technology, Sarnoff showed how the amount of fusion could be varied between full visible, a 50% visible/50% IR blend, and full LWIR. In the demonstration, Arnold Kravitz, director of imaging products and services at Sarnoff, attempted to conceal an AR-14 tactical carbine rifle using a white lab coat; the OEFC's fused output at each of the three settings is shown in Fig. 2.