Image processing software: Graphics processing units speed image enhancement

July 22, 2015
Learn more about image processing software developed by EM Photonics that mitigates the effects of atmospheric turbulence and restores image quality to the diffraction limit of the system.

In military, surveillance, and aerospace applications, distortions in the atmosphere can degrade the quality of images captured over long distances. Because of the changing nature of atmospheric turbulence, individual images in a frame sequence will appear blurred in different ways.

To address this problem, EM Photonics (Newark, DE, USA; www.emphotonics.com) has leveraged work originally performed at the Lawrence Livermore National Laboratory (LLNL; Livermore, CA, USA; www.llnl.gov) to develop a number of tools that mitigate the effects of atmospheric turbulence and restore image quality to the diffraction limit of the system. The company's GPU-accelerated, real-time image enhancement tool, ATCOM (www.atcomimaging.com), operates by combining information from multiple, temporally adjacent frames to estimate a turbulence-free image.

Bispectrum Averaging Speckle Imaging can compensate for distortion effects at both visible and IR wavelengths and under heavy atmospheric conditions.

The core algorithm is based on a version of the Bispectrum Averaging Speckle Imaging method developed by LLNL to compensate for scintillation and warping effects at both visible and IR wavelengths and under turbulent atmospheric conditions. To date, EM Photonics has developed image processing software and hardware systems based around this technology for the real-time removal of this distortion.

"After each image is digitized," explains Eric Kelmelis, CEO of EM Photonics, "it is decomposed into a number of different tiles or blocks, with each tile containing redundant (overlapping) image data from adjacent tiles. An apodization window is applied to the image to eliminate any ringing effects that will be generated by the following 2D FFT stage. This FFT data is then used to estimate the magnitude and phase of the Fourier transform of each individual tile.

"By averaging the squared magnitude of the 2D FFT of multiple frames, the image power spectrum is obtained. Simultaneously, the multiple-frame average of the triple correlation of the 2D FFT is used to obtain the phase in the spatial frequency domain. Both magnitude and phase data are then recombined and an inverse 2D FFT (IFFT) is used to reconstruct the restored image tiles, which are then recomposed into a single image. In this way, image distortion is greatly reduced."
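
To make the tiling and windowing stage concrete, the following NumPy sketch splits a grayscale frame into overlapping tiles, applies a Hann apodization window, and takes the 2D FFT of each tile. The tile size, overlap, and window choice are illustrative assumptions, not ATCOM's actual parameters.

    import numpy as np

    def tile_and_transform(frame, tile=64, overlap=32):
        """Split a grayscale frame into overlapping tiles, apodize each one,
        and return its 2D FFT (hypothetical parameters, for illustration only)."""
        window = np.outer(np.hanning(tile), np.hanning(tile))  # suppresses edge ringing
        step = tile - overlap
        tiles_fft = {}
        rows, cols = frame.shape
        for r in range(0, rows - tile + 1, step):
            for c in range(0, cols - tile + 1, step):
                patch = frame[r:r + tile, c:c + tile] * window
                tiles_fft[(r, c)] = np.fft.fft2(patch)  # magnitude and phase per tile
        return tiles_fft

The averaging step itself is easiest to see in one dimension. The simplified sketch below follows the principle described above rather than ATCOM's implementation: the frame-averaged power spectrum supplies the magnitude, while the phase of the frame-averaged bispectrum (the triple correlation in the frequency domain) is unwrapped recursively to supply a shift-tolerant phase estimate.

    def restore_1d(frames):
        """frames: equal-length 1D arrays of the same scene, each blurred differently."""
        n = len(frames[0])
        idx = np.add.outer(np.arange(n), np.arange(n)) % n   # index map for u + v (mod n)
        power = np.zeros(n)
        bispec = np.zeros((n, n), dtype=complex)
        for f in frames:
            F = np.fft.fft(f)
            power += np.abs(F) ** 2
            # Bispectrum B(u, v) = F(u) F(v) conj(F(u + v)); its phase is unaffected
            # by the random per-frame shifts that turbulence introduces.
            bispec += np.outer(F, F) * np.conj(F[idx])
        magnitude = np.sqrt(power / len(frames))
        # Recursive phase recovery: angle(B(1, u)) = phi(1) + phi(u) - phi(u + 1),
        # so with phi(0) = phi(1) = 0 (an arbitrary shift) each sample follows in turn.
        phase = np.zeros(n)
        for u in range(1, n - 1):
            phase[u + 1] = phase[1] + phase[u] - np.angle(bispec[1, u])
        return np.real(np.fft.ifft(magnitude * np.exp(1j * phase)))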

Top: ATCOM software showing original images, acquired at a distance of approximately 2 km, processed on a computer equipped with an NVIDIA GeForce GTX 980 to reduce atmospheric distortion effects. Bottom: ATCOM TM-1, a streaming image enhancement appliance.

"Although many off-the-shelf software tools are available to perform image filtering, FFT, and IFFT processing," says Kelmelis, "porting the bispectrum phase estimation to such GPUs is non-trivial."

However, running all of these functions on such processors can provide the computational throughput required for real-time processing.
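
As a rough illustration of why the transform-heavy stages map well to graphics hardware, the sketch below uses CuPy (a NumPy-compatible GPU array library, assumed to be installed) to batch the per-frame 2D FFTs and the power-spectrum average on the device. It is not EM Photonics' CUDA implementation, and it deliberately omits the data-dependent bispectrum phase recursion that Kelmelis describes as the non-trivial part to port.

    import numpy as np
    import cupy as cp  # NumPy-compatible GPU array library (assumed installed)

    def averaged_power_spectrum_gpu(frames):
        """Batch the per-frame 2D FFTs and the power-spectrum average on the GPU."""
        stack = cp.asarray(np.stack(frames).astype(np.float32))  # host -> device copy
        spectra = cp.fft.fft2(stack, axes=(-2, -1))              # batched 2D FFTs
        power = cp.mean(cp.abs(spectra) ** 2, axis=0)            # frame-averaged |F|^2
        return cp.asnumpy(power)                                 # copy the result back to the host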

In addition to a desktop software application, ATCOM is also available as a rack-mount PC containing an HD-SDI input and output card capable of streaming high-definition video.

Equipped with a single NVIDIA GeForce GTX 980 card, the unit is capable of processing 720p video at over 30 fps. For maximum performance, it can be equipped with up to two NVIDIA GeForce Titan X video cards and can process 1080p video at over 30 fps. The EM Photonics ATCOM software runs on any Windows or Linux PC.
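
For context, those frame rates translate into roughly the following pixel throughputs, every pixel of which must be tiled, windowed, transformed, and accumulated within its frame time (a back-of-the-envelope calculation, not a vendor benchmark):

    pixels_720p = 1280 * 720 * 30 / 1e6     # ~27.6 Mpixels/s at 720p, 30 fps
    pixels_1080p = 1920 * 1080 * 30 / 1e6   # ~62.2 Mpixels/s at 1080p, 30 fps
    print(f"720p30: {pixels_720p:.1f} Mpixels/s, 1080p30: {pixels_1080p:.1f} Mpixels/s")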
