Jump-diffusion software helps target-tracking systems recognize infrared images

Dec. 1, 1998
By Andrew Wilson, Editor at Large

In many military imaging applications, forward-looking infrared (FLIR) imaging systems are used to capture signatures of complex targets. In these systems, image-understanding algorithms are needed to interpret these targets and make decisions regarding the objects within such scenes. Traditional approaches to this problem have used image-preprocessing algorithms to enhance contrast and reduce noise while using edge-detection, segmentation, and feature-extraction algorithms to determine the objects within the images.

Because the thermal signatures of objects within infrared images may vary with orientation and position, and because the number of objects contained in each scene may also vary, image-detection algorithms have tended to be tailored to specific infrared applications. As a result, many researchers are studying pattern-matching techniques as a more general way of understanding the objects within such scenes.

When using pattern-matching techniques, traditional image-segmentation and feature-extraction techniques are not used. Instead, object identification is performed by estimating the configuration of objects within an infrared image using IR object data templates. By transforming these templates over measured infrared data and computing the best match, objects within each image can be identified. At Johns Hopkins University (Baltimore, MD), Prof. Michael Miller and his colleagues have used such infrared templates along with a novel jump-diffusion algorithm to characterize the objects in IR scenes.
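
As a rough illustration of this template-matching idea, and not the specific Johns Hopkins implementation, the following Python sketch slides each object template over a measured infrared image and keeps the best-scoring placement. The template names, the array representation, and the sum-of-squared-differences score are all assumptions made for the example.

import numpy as np

def match_score(image, template, top, left):
    # Sum-of-squared-differences between a template and an image patch.
    h, w = template.shape
    patch = image[top:top + h, left:left + w]
    return float(np.sum((patch - template) ** 2))

def best_match(image, templates):
    # Exhaustively translate each template over the image and return the
    # template name, position, and score of the lowest residual.
    best = (None, None, np.inf)
    for name, tmpl in templates.items():
        h, w = tmpl.shape
        for top in range(image.shape[0] - h + 1):
            for left in range(image.shape[1] - w + 1):
                s = match_score(image, tmpl, top, left)
                if s < best[2]:
                    best = (name, (top, left), s)
    return best

# Hypothetical usage: templates = {"M60": m60_ir, "M2": m2_ir, "T62": t62_ir}
# name, (row, col), score = best_match(flir_frame, templates)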

Infrared templates

Before the jump-diffusion algorithm can be applied to infrared images, however, IR templates of potential objects in a scene must be built. To do so, Miller turned to the Physically Reasonable Infrared Signature Model (PRISM) software package developed at the Michigan Technological University (MTU) Keweenaw Research Center (KRC; Houghton, MI). Written at MTU/KRC for the US Army Tank-Automotive and Armaments Command (TACOM; Warren, MI), PRISM is an infrared signature-prediction program with a thermal-analysis capability that is being used for US and NATO military defense applications.

Available from ThermoAnalytics (Calumet, MI), PRISM is the standard Army infrared prediction code used to identify ground vehicles and other targets (see Fig. 1). "As a Windows- or UNIX-based heat-management design and analysis tool, PRISM will be further developed as the Multi-Service Electro-optic Signature (MuSES) code and licensed to industrial contractors in the defense industry," says Keith Johnson, president of ThermoAnalytics.

To build an infrared template of a particular object, developers can create a meshed surface representation of an object in MuSES or import computer-aided-design (CAD) files from the MacNeal Schwendler Corp. (MSC; Los Angeles, CA) PATRAN Neutral Files (NLT), Wavefront (OBJ), and AutoCAD (DXF) packages. Alternatively, generic models of components, systems, and vehicles can be edited, and their material and surface properties can be defined. Models can then be viewed as wire-frame or shaded representations.

After the thermal-analysis stage is performed, the physical temperature of the model's surface is displayed on a meshed geometry representation. Using this package, Allen Curran, formerly at MTU/KRC and now vice president of ThermoAnalytics, has developed wire-frame models and infrared signature-radiance patterns of M60, M2, and T62 tanks that were later used by Miller in determining objects from real-world infrared data.

Real-world data

Once the wire-frame models have been built, they must be applied to real-world infrared image data. To accomplish this, Miller and his colleagues used a series of infrared images from different image planes that were rendered into three-dimensional (3-D) space using perspective projection and obscuration techniques. Using an Onyx workstation and Reality Engine rendering hardware from Silicon Graphics (Mountain View, CA), Miller used a z-buffer algorithm to render object pixels.

"As object pixels are rendered," says Miller, "their distance is stored in a pixel-registered z-buffer." New pixels are only written to the rendered image if their distance is less than what is stored in the buffer. After the scene is fully rendered, the final contents of the z-buffer provide ranges as a by-product of the rendering algorithm. "Essentially," Miller says, "we get range images for free." After the infrared images are rendered, the jump-diffusion algorithm is applied to locate the objects within the rendered scene.

In the jump-diffusion process, the algorithm jumps from one tracking-recognition scenario to another, searching for the correct number of objects and their orientations, positions, and scales. This capability is valuable in rapidly shifting surveillance environments, for example, active-jamming scenarios in which targets deliberately create decoys and the various sources have short lifetimes. Because the algorithms are dynamically flexible, they can accommodate both simple and complex scenes.

"Given a fixed number of objects in a scene," says Miller, "the algorithm reshapes the templates to allow for the local variability of each shape in the data." This reshaping or diffusion is performed continuously, allowing simple structures to form more complex estimates or complex structures to form more simple ones. The second part of the algorithm samples scenes of varying object numbers using discontinuous jump moves that add, remove, fuse, or split objects.

Iterative algorithms

To demonstrate the effectiveness of the jump-diffusion method, Miller and his colleagues embedded the target models provided by Curran into a real infrared image of military tanks. Although the number of objects in the scene is not known beforehand, each target is assumed to be an M60, M2, or T62 tank. "Ideally," says Miller, "we would like sensors to produce a configuration of targets in a top-down view. But with imaging sensors, these targets are usually viewed through perspective projection" (see Fig. 2).

Because real-world infrared images are blurred by the point-spread function (PSF) of the camera optics, the ideal infrared image used by Miller was adjusted to compensate for this PSF. Assuming that the targets radiate known intensities, Miller ran the jump-diffusion algorithm on the Onyx workstation. The algorithm tries a "birth" move on the first iteration. "The algorithm finds the M60 tank first because it can 'explain' the largest number of data pixels with it," says Miller. In the third iteration, the algorithm mistakes the T62 tank for an M60 tank because it has not found the adjacent M2 tank and is trying to understand the orientation of some of the M2's pixels using the gun barrel of the M60 tank. "This demonstrates the importance of moves that allow changes of target type," says Miller.
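
A minimal sketch of how a rendered hypothesis might be compared against measured FLIR data after accounting for the optics follows, assuming a Gaussian approximation of the PSF; the sigma value and the residual measure are illustrative and are not details taken from Miller's system.

import numpy as np
from scipy.ndimage import gaussian_filter

def blur_with_psf(rendered_scene, sigma=1.5):
    # Convolve the ideal rendered image with an assumed Gaussian PSF.
    return gaussian_filter(rendered_scene, sigma=sigma)

def residual(measured, rendered_scene, sigma=1.5):
    # Compare measured FLIR data against the blurred hypothesis.
    diff = measured - blur_with_psf(rendered_scene, sigma)
    return float(np.sum(diff ** 2))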

In the 24th iteration of the algorithm, the M2 tank is located, but the algorithm finds it facing the wrong direction. "While diffusions may refine orientation estimates, they are impractical for making large orientation changes, suggesting the necessity of a jump move for making a drastic change in orientation," says Miller. After diffusions find the correct placement of the M60 tank, the remaining M60 tank is found in the 32nd iteration.

In further iterations, the algorithm continues to propose "birth" moves that are rejected because the data do not support a fifth tank target. In the 34th iteration, the barrel of the incorrectly guessed M60 is no longer needed to explain the M2 pixels, and a "metamorph" move is used that still incorrectly supposes the T62 tank to be an M60 tank. The model is then transposed but still incorrectly supposes the T62 tank to be an M2 tank in the 68th iteration.

Between iterations 68 and 87, the diffusions pull the incorrectly hypothesized M2 closer to the correct position, and a "metamorph" move in iteration 88 correctly changes its type to a T62 tank. The correct orientation of the true M2 tank is found in the 103rd iteration. By the final 117th iteration of the algorithm, the configurations of all the types, positions, and orientations of the tanks are correctly deduced (see Fig. 3).

Prediction models

Algorithms such as the jump-diffusion process estimate objects from measured data and avoid the separate stages of edge-detection, segmentation, and feature extraction. However, such algorithms generally assume that objects in infrared images radiate known intensities. To extend this concept, better infrared prediction models will be required.

"Although the PRISM package has, up until now, been the standard Army infrared prediction code used for ground vehicles and other targets, it does not meet the needs of developers who need a fast prototyping design tool," says ThermoAnalytics` Johnson. And, because major modifications of the PRISM code are not cost-effective due to years of incremental development, MuSES will be offered to meet these requirements. ThermoAnalytics is planning a final release of PRISM (Version 3.3) by the end of 1998 and then will release MuSES in mid-1999 as a replacement.

With a native geometry representation consistent with current CAD packages such as AutoCAD, MuSES allows rapid prototyping and integrated infrared model preparation, simulation, and postprocessing functions. To manipulate various forms of CAD geometry for export to analytical models that require polygons, TACOM is developing a software package, dubbed Eclectic, to convert solid CAD models into a faceted mesh to feed the MuSES editor.

"Land-based vehicles are just a small subset of the types of objects a FLIR sensor will encounter," says Miller. In the future, object-recognition systems may include descriptions of buildings, roads, runways, and natural resources such as lakes. "These could be used to help identify man-made structures, such as tanks, by adding a number of inferences into object-detection algorithms," says Miller.

FIGURE 1. To build an infrared template of a particular object, developers can create a meshed-surface representation of the object in PRISM or import CAD files from MacNeal Schwendler Corp. PATRAN Neutral Files (NLT), Wavefront (OBJ), and AutoCAD (DXF) packages. Alternatively, generic models of components, systems, or vehicles can be edited and their material and surface properties defined.

FIGURE 2. At the Michigan Technological University Keweenaw Research Center, Allen Curran has developed wire-frame models and infrared signature radiance patterns of US Army M60, M2, and T62 tanks that were used by Michael Miller of Johns Hopkins University to determine objects from real-world infrared data.

FIGURE 3. Using infrared signatures from the Michigan Technological University Keweenaw Research Center, Michael Miller has applied a jump-diffusion algorithm to real-world IR data (from top left) to perform object recognition. In the series of images, the algorithm first tries a "birth" move to find the M60 tank. In the third iteration, the algorithm mistakes the T62 for an M60 tank. The M2 tank is located, but the algorithm finds it facing the wrong direction. Further iterations and diffusions then find the correct placement of the M60 and the remaining M60. A "metamorph" move is then used that still incorrectly supposes the T62 to be an M60. The model is then transposed but still incorrectly supposes the T62 to be an M2. Between iterations, diffusions pull the incorrectly hypothesized M2 closer to the correct position, and a "metamorph" move correctly changes its type to a T62. The correct orientation of the true M2 is found in the 103rd iteration. Lastly, the configurations of all the types, positions, and orientations of the tanks are correctly deduced.
