Optical flow imaging helps NASA monitor aircraft flight
At NASA (Moffett Field, CA), research is underway to develop a supersonic passenger airplane called the High-Speed Civil Transport. For this design, NASA scientists are seeking to replace the aircraft's forward cockpit windows with electronic displays that would show images obtained from video cameras mounted outside the aircraft. According to researcher Jeffrey McCandless of the NASA Human Information Processing Research Branch, these video images will allow computer-vision algorithms to determine whether another aircraft is in the field of view.
"One way to detect aircraft is based on pattern recognition," says McCandless, "but because of the relatively small target size—fewer than 30 pixels at distances greater than one mile—and background clutter, this technique is not feasible." Instead, optical flow—the motion of brightness patterns in an image—is being used. In this approach, an object moving in the scene produces a different image-velocity profile than those of the surroundings. An optical flow algorithm then computes the vectors associated with the moving object.
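Optical flow can be estimated several ways; the article does not say which method NASA chose. As an illustrative sketch, the classic Lucas-Kanade least-squares technique recovers a velocity vector at each pixel from local brightness gradients between two consecutive frames:

```python
import numpy as np

def lucas_kanade_flow(prev, curr, win=7):
    """Estimate a per-pixel optical-flow field from two grayscale frames
    using the Lucas-Kanade least-squares method (a standard gradient-based
    technique; shown here for illustration, not as NASA's algorithm)."""
    prev = prev.astype(np.float64)
    curr = curr.astype(np.float64)
    Iy, Ix = np.gradient(prev)          # spatial brightness gradients
    It = curr - prev                    # temporal brightness gradient
    half = win // 2
    h, w = prev.shape
    flow = np.zeros((h, w, 2))
    for y in range(half, h - half):
        for x in range(half, w - half):
            ix = Ix[y-half:y+half+1, x-half:x+half+1].ravel()
            iy = Iy[y-half:y+half+1, x-half:x+half+1].ravel()
            it = It[y-half:y+half+1, x-half:x+half+1].ravel()
            A = np.stack([ix, iy], axis=1)
            ATA = A.T @ A
            if ATA[0, 0] + ATA[1, 1] < 1e-6:
                continue                # no texture in this window
            if np.linalg.cond(ATA) < 1e6:
                # Solve Ix*u + Iy*v = -It in the least-squares sense.
                u, v = np.linalg.solve(ATA, -A.T @ it)
                flow[y, x] = (u, v)
    return flow
```

A distant aircraft moving against the sky then shows up as a small patch of flow vectors that differ from the background field.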
Images captured by a CCD camera mounted on a Boeing 737 aircraft indicate the presence of a target Beech King Air aircraft (top). The distance to the target plane (indicated by the black arrow) is about one nautical mile, making it difficult to detect the plane in the image. An optical flow algorithm, developed at NASA, correctly located the Beech King Air aircraft, as indicated by the white pixels at the end of the arrow (bottom).
To detect and track aircraft within video images, McCandless has developed an algorithm that first compensates for image motion caused by camera rotation. Then, the image is smoothed by a Gaussian filter, and optical-flow vectors are computed. "After discarding the optical-flow vectors with small magnitudes, other optical-flow vectors—called clustered vectors—are extracted from the optical-flow patterns. Then, a predictive technique is used to estimate future locations, magnitudes, and directions of the optical-flow vectors," says McCandless.
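The later stages of the pipeline McCandless describes, discarding small flow vectors, grouping the survivors into clusters, and predicting where a cluster will appear next, might be sketched as follows (the threshold value and helper names are illustrative assumptions, not NASA's actual parameters):

```python
import numpy as np

def cluster_flow(flow, mag_thresh=0.5):
    """Discard optical-flow vectors with small magnitudes and group the
    remaining vectors into 4-connected clusters, returning each cluster's
    centroid and mean flow vector. `flow` is an (H, W, 2) vector field."""
    h, w, _ = flow.shape
    mag = np.hypot(flow[..., 0], flow[..., 1])
    mask = mag > mag_thresh             # keep only significant motion
    seen = np.zeros((h, w), dtype=bool)
    clusters = []
    for y0, x0 in zip(*np.nonzero(mask)):
        if seen[y0, x0]:
            continue
        stack, members = [(y0, x0)], []
        seen[y0, x0] = True
        while stack:                    # flood-fill one connected cluster
            y, x = stack.pop()
            members.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        ys, xs = zip(*members)
        centroid = (float(np.mean(xs)), float(np.mean(ys)))
        mean_uv = tuple(flow[list(ys), list(xs)].mean(axis=0))
        clusters.append((centroid, mean_uv))
    return clusters

def predict_next(centroid, mean_uv):
    """Constant-velocity estimate of the cluster's position in the next
    frame (a simple stand-in for the predictive step in the article)."""
    return (centroid[0] + mean_uv[0], centroid[1] + mean_uv[1])
```

A persistent cluster whose predicted and observed positions agree from frame to frame is a strong candidate for a real aircraft rather than noise.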
To test the algorithm, flights were conducted at NASA's Langley Research Center using a Megaplus ES 1.0 monochrome CCD camera from Roper Scientific MASD (San Diego, CA) mounted below the nose of a Boeing 737 aircraft. This camera captured images of a Beech King Air 200 aircraft flying in different trajectories around the Boeing aircraft. The 8-bit monochrome images were acquired at 30 frames per second, recorded in S-VHS videotape format, and then digitized using an Onyx computer from SGI Inc. (Mountain View, CA).
To identify each individual frame, a time code was inserted on the video track. In addition to the video imagery, navigational data were used to estimate the aircraft's pitch, roll, and yaw at each frame time. "Because the camera was rigidly fixed to the Boeing aircraft, the rotation angles of the camera are independent of the point on the aircraft at which they were measured," says McCandless. "After running the optical flow algorithm on the flight-test imagery, we were able to pinpoint the Beech King Air aircraft at a distance of about one nautical mile from the camera on the Boeing aircraft," he adds.
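Because the camera is rigidly fixed to the airframe, the pitch, roll, and yaw rates from the navigational data can be used to cancel the flow that camera rotation alone would produce. A minimal sketch, using the standard rotational terms of the perspective motion-field equations (the article does not give NASA's exact formulation, and the sign convention here is one common choice):

```python
import numpy as np

def rotational_flow(x, y, f, wx, wy, wz):
    """Image motion induced purely by camera rotation at pixel (x, y),
    with coordinates relative to the image center, focal length f in
    pixels, and angular rates (wx, wy, wz) in radians per frame."""
    u = (x * y / f) * wx - (f + x**2 / f) * wy + y * wz
    v = (f + y**2 / f) * wx - (x * y / f) * wy - x * wz
    return u, v

def derotate(flow, f, wx, wy, wz):
    """Subtract the rotation-induced component from a measured optical-flow
    field, so the residual flow reflects scene motion (and camera
    translation) rather than aircraft attitude changes."""
    h, w, _ = flow.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    xs -= w / 2.0                       # center the image coordinates
    ys -= h / 2.0
    u_rot, v_rot = rotational_flow(xs, ys, f, wx, wy, wz)
    out = flow.copy()
    out[..., 0] -= u_rot
    out[..., 1] -= v_rot
    return out
```

After derotation, a small cluster of residual flow, like the one marking the King Air in the test imagery, stands out against a nearly motionless background.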