Workstations fuse multispectral images for military scenes
A major challenge in designing military-critical decision-support systems is the real-time fusion of large data sets from a variety of video signals and sources. These sets might include geospatial data, two-dimensional (2-D) and three-dimensional (3-D) imagery, and 3-D terrain data. When such data sets are used in mission-support systems, the processing power required to fuse them is compounded by the large size of the images involved, which generally represent visual or mapping data for entire continents or countries.
FIGURE 1. In many military defense-system applications, an image server delivers images to attached workstations. Typically, such systems connect the image server to RAID storage subsystems that provide bandwidths from hundreds of megabytes per second to gigabytes per second. For example, downloading a 70-Gbyte mosaic image to a workstation for image analysis at an ATM peak rate of 155 Mbit/s takes one hour at full bandwidth on a dedicated ATM network connection.
Now, however, thanks to the introduction of high-speed standards-based workstations and off-the-shelf networking hardware, graphics workstations tied to high-speed imaging networks are being directed at mapping, charting, and battlefield-visualization applications. In addition to providing a cost-effective means of developing mission plans and situation assessments, these workstation systems are being used for mission simulations by US military forces.
Originally developed as a proprietary imaging system, the US Navy's Topscene system was redesigned using commercial off-the-shelf (COTS) systems from SGI Inc. (Mountain View, CA) to make it more deployable and cost-effective. According to Alan Herod, program manager for the Navy's Topscene Mission Rehearsal Systems, the redesigned system exploits the graphics capability and CPU performance of SGI Onyx2 multiprocessor systems to construct country-size databases using data from the National Imagery and Mapping Agency (Riverdale, MD). "This allows the system to create 3-D terrain data from a database not more than 24 hours old for any locale on the planet in about two hours and to resolve elements of the scene that may be less than a square foot in size," says Herod.
Bandwidth limitations
"In many defense imaging applications," says Alan Dare, geospatial imaging marketing manager at SGI, "an image server typically provides services such as compression, decompression, reduced-resolution data-set generation, image storage, cataloging, mass storage administration, and serving images to clients." Furthermore, the image server might provide full-frame imagery downloading or serve subsections of images or tiles on demand. Typically, such systems connect the image server to RAID subsystems that deliver bandwidths from hundreds of megabytes per second to gigabytes per second (see Fig. 1). "While the amount of storage varies, the on-line storage of imagery from several days to several weeks in duration is common," says Dare.
FIGURE 2. To overcome problems associated with network-based imaging systems, the Topscene 4000 mission-rehearsal system uses a GroupStation concept that integrates SGI InfiniteReality graphics subsystems directly into the image server. During operation, image downloads over the network are not required for viewing imagery, and 67-Gbyte images can be accessed within two seconds.
However, although the bandwidth of modern COTS servers and disk subsystems is high, the networks connecting these systems to workstations provide only a fraction of that bandwidth. As a result, responding to a critical situation by analyzing a broad-area search becomes difficult: the network becomes a bottleneck. The situation is exacerbated when several image analysts access imagery over the same network.
"For example, downloading a 70-Gbyte mosaic image to a workstation for analysis at an asynchronous-transfer-mode (ATM) peak rate of 155 Mbit/s takes one hour at full bandwidth on a dedicated network connection," says Dare. Rarely would an image analyst obtain a dedicated network for a single transfer in this architecture. "And networks based on the fiber distributed-data interface (FDDI) or 100Base-T network protocol exhibit even worse performance."
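The arithmetic behind that figure is simple. The following Python sketch, which assumes the nominal peak rates named above and ignores protocol overhead, reproduces the one-hour estimate and shows how the slower network types compare:

```python
# Back-of-the-envelope transfer time for the 70-Gbyte mosaic cited above.
# The link rates are the nominal peak rates named in the article; real
# throughput would be lower once protocol overhead is included.

image_bits = 70e9 * 8                 # 70 Gbytes expressed in bits

links_mbit_s = {
    "ATM (155 Mbit/s)": 155,
    "FDDI (100 Mbit/s)": 100,
    "100Base-T (100 Mbit/s)": 100,
}

for name, rate in links_mbit_s.items():
    hours = image_bits / (rate * 1e6) / 3600
    print(f"{name}: {hours:.2f} h on a fully dedicated connection")
# ATM comes out at roughly 1.0 h, matching the figure quoted above.
```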
Although individual analysis centers might not routinely access such large mosaics, the problem is the same for centers that serve large numbers of tiles. As the number of image analysts on the network grows, so does the number of tile requests and, consequently, the required bandwidth. The result is network saturation and image analysts spending much of their time waiting for downloads.
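To see how quickly that saturation point arrives, consider a rough sketch in which each analyst requests tiles at a steady rate; the tile size and request rate below are illustrative assumptions rather than figures from the system:

```python
# Hypothetical tile workload: each analyst pulls 12 tiles per minute,
# each tile 8 Mbytes. Both numbers are illustrative assumptions.

def aggregate_demand_mbit_s(analysts, tiles_per_minute=12, tile_mbytes=8):
    """Total bandwidth demanded by all analysts, in Mbit/s."""
    return analysts * (tiles_per_minute / 60.0) * tile_mbytes * 8

link_capacity = 155.0                        # Mbit/s, a dedicated ATM link
for n in (1, 5, 10, 20):
    demand = aggregate_demand_mbit_s(n)
    status = "saturated" if demand > link_capacity else "ok"
    print(f"{n:2d} analysts -> {demand:6.1f} Mbit/s ({status})")
```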
High-performance design
To overcome these problems, the latest Topscene 4000 system uses a GroupStation approach that integrates SGI InfiniteReality graphics subsystems directly into the image server (see Fig. 2). Although the GroupStation architecture also can serve as part of a digital image network, the imaging power of the system comes from connecting the graphics subsystems directly to the image server.
"In operation," says Dare, "no image downloads are required for viewing imagery on a GroupStation terminal, and 67-Gbyte images can be accessed within two seconds of initiating an electronic light-table program. If a digital image server architecture were used, the same image would require a minimum of at least a one-hour transfer time on a dedicated asynchronous-transfer-mode network before an image analyst could access it.
Capable of displaying up to 896 million pixels/s, the graphics subsystem provides smooth roaming and zooming over large 12-bit images. For images that must be rectified before display, the system provides 7 x 7 convolutions and bicubic resampling in hardware (see "Using bicubic resampling to eliminate geometric distortion," below).
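The sketch below illustrates, in software, the kind of 7 x 7 convolution the graphics hardware applies before display; the box-blur kernel and random 12-bit test image are illustrative choices, not the filters the system actually uses:

```python
import numpy as np

def convolve7x7(image, kernel):
    """Apply a 7 x 7 kernel to a 2-D image; edge pixels are cropped."""
    assert kernel.shape == (7, 7)
    h, w = image.shape
    out = np.zeros((h - 6, w - 6))
    for dy in range(7):                      # accumulate the 49 shifted terms
        for dx in range(7):
            out += kernel[dy, dx] * image[dy:dy + h - 6, dx:dx + w - 6]
    return out

image = np.random.randint(0, 4096, (512, 512)).astype(float)   # 12-bit pixels
smoothed = convolve7x7(image, np.full((7, 7), 1.0 / 49.0))      # simple box blur
```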
This same graphics system is used in the Topscene system for the visualization of imagery draped over digital elevation models. Therefore, the same GroupStation terminal can be used for soft-copy exploitation, mapping, charting, geodesy, and mission rehearsal.
"Topscene's integration with the Air Force Mission Planning Support System (AFMSS) has reinvented how missions are planned," says Herod. "After designing their mission with the AFMSS, crew members can export route, threat-analysis, and other mission data and then experience the mission in photorealistic three dimensions," he says. Alterations to the mission also can be made in a Topscene system, and the data can then be exported to the AFMSS for reverification.
Images for all
Although the GroupStation terminal provides high-performance imaging, many geographic-information-system functions are less demanding. Fast access to data is still important in such systems, but local-area-network-based data access and workstation-class systems can suffice.
Accordingly, SGI worked with other vendors to provide solutions that allow lower-bandwidth systems to connect to and share data on the GroupStation terminal. As a result, a 64-bit version of the open-source Samba software, which serves files from UNIX servers to Windows clients, runs on the GroupStation terminal and can export large imagery files to the desktops of Windows NT clients.
Graphics workstations enable image analysts to render three-dimensional military simulations in real time.
Company Information
National Imagery and Mapping Agency
Bethesda, MD 20816
Web: www.nima.mil
SGI Inc.
Mountain View, CA 94043
Web: www.sgi.com
Using bicubic resampling to eliminate geometric distortion
When an image is rotated, enlarged or zoomed, draped over a digital elevation model, or corrected for geometric distortion, an interpolation process is needed. In resampling, for example, a rotated or skewed image must be converted to an unrotated or unskewed image (see Fig. A). Resampling can be interpreted as a convolution of a distorted image with a moving window function. Unlike a conventional convolution with a discrete array, however, the resampling output is computed between the original pixels, so the weighting function is continuous. Three methods are commonly used to determine which value belongs in the colored (green) cell: nearest neighbor, bilinear interpolation, and bicubic interpolation.
FIGURE A. When an image is manipulated, an interpolation process is needed. In resampling, for example, a rotated or skewed image must be converted to an unrotated or unskewed image. To correct a distorted image, bicubic interpolation takes the distance-weighted average of the 16 pixels nearest to the green cell. This method produces a superior image but at the cost of increased processing.
The nearest-neighbor method is computationally the simplest. The value assigned to the green cell is the value of the closest pixel in the input image, disregarding any slight offset. When the nearest-neighbor method is applied, the green cell receives the value "X," and both locations contain the value "X" in the corrected image. Because of this round-off, nearest-neighbor resampling introduces up to a ±0.5-pixel geometric distortion and therefore produces the poorest-quality output.
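In code, the rule is as simple as it sounds. The sketch below assumes a hypothetical inverse_map function that gives, for each output pixel, the corresponding (row, column) coordinate in the distorted input image:

```python
import numpy as np

def resample_nearest(image, inverse_map):
    """Nearest-neighbor resampling: each output pixel takes the value of
    the closest input pixel, found by rounding the mapped coordinate."""
    h, w = image.shape
    out = np.zeros_like(image)
    for r in range(h):
        for c in range(w):
            sr, sc = inverse_map(r, c)                 # source coordinate
            sr, sc = int(round(sr)), int(round(sc))    # the round-off step
            if 0 <= sr < h and 0 <= sc < w:
                out[r, c] = image[sr, sc]
    return out
```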
Consider an image that has been enlarged and rotated. In defense and medical imaging, an estimation of the volume contained within the borders of the white lines may be necessary (see Fig. B). It is difficult to determine the exact borders that contain the volume in a processed image using nearest-neighbor interpolation.
FIGURE B. Three methods are commonly used to correct images for geometric distortion. Nearest-neighbor interpolation is the simplest method but becomes complicated when determining the borders of the transformed images (left). Bilinear interpolation takes a distance-weighted average of the digital numbers of the four nearest pixels and results in an image smoother in appearance than an image processed with nearest-neighbor interpolation (center). Bicubic interpolation, a computationally intensive method, takes the distance-weighted average of the 16 nearest pixels and produces a superior image (right).
Bilinear interpolation takes a distance-weighted average of the digital numbers of the four nearest pixels. Thus, when bilinear interpolation is used, the green cell contains a value composed of the cell containing "X" and the three cells containing "W." The contribution of each of the four surrounding pixels is weighted by its distance from the green cell. The image processed with bilinear interpolation is much smoother in appearance than the image processed with nearest-neighbor interpolation.
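A minimal sketch of that weighting, for a single output sample at fractional source coordinate (sr, sc), might look like this (boundary handling omitted for brevity):

```python
import numpy as np

def sample_bilinear(image, sr, sc):
    """Distance-weighted average of the four pixels surrounding (sr, sc)."""
    r0, c0 = int(np.floor(sr)), int(np.floor(sc))
    fr, fc = sr - r0, sc - c0                    # fractional offsets in [0, 1)
    top = (1 - fc) * image[r0, c0]     + fc * image[r0, c0 + 1]
    bot = (1 - fc) * image[r0 + 1, c0] + fc * image[r0 + 1, c0 + 1]
    return (1 - fr) * top + fr * bot
```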
Although the volume boundaries are clearer in the bilinearly interpolated image, it lacks sharpness. Bicubic interpolation, which takes the distance-weighted average of the 16 nearest pixels, increases the sharpness and definition of the volume and the white lines, producing a superior image at the cost of increased processing. This is beneficial in defense and medical imaging applications, although, as expected, bicubic interpolation is more computationally complex than bilinear interpolation.
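The 16-pixel weighting is usually implemented with a cubic-convolution kernel. The sketch below uses the common Keys kernel with a = -0.5, an assumption, since the sidebar does not specify the weights used in hardware:

```python
import numpy as np

def cubic_kernel(x, a=-0.5):
    """Keys cubic-convolution weight for a pixel at distance x."""
    x = abs(x)
    if x < 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * x**3 - 5 * a * x**2 + 8 * a * x - 4 * a
    return 0.0

def sample_bicubic(image, sr, sc):
    """Distance-weighted average of the 16 pixels surrounding (sr, sc)."""
    r0, c0 = int(np.floor(sr)), int(np.floor(sc))
    value = 0.0
    for m in range(-1, 3):                       # the 4 x 4 neighborhood
        for n in range(-1, 3):
            weight = cubic_kernel(sr - (r0 + m)) * cubic_kernel(sc - (c0 + n))
            value += weight * image[r0 + m, c0 + n]
    return value
```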
Alan Dare
Geospatial Imaging Marketing Manager
SGI Inc.
Mountain View, CA 94043