Camera-to-computer interfaces such as USB, Gigabit Ethernet, Camera Link, and FireWire are used in many machine-vision applications to transfer digital images to a host computer for processing and display. In these systems, however, the frame rate of the camera is generally not the same as the refresh rate of the monitor.
In some situations, this lack of synchronization between camera and monitor leads to a phenomenon known as page or screen tearing, which occurs when the next frame is written to graphics memory while the graphics controller is still using it to display the previous frame. When this happens, the image on screen contains parts of different frames and thus appears torn horizontally (see figure).
To overcome this problem, the rendering of incoming video must be digitally synchronized with the display refresh rate by triple-buffering the incoming frames and using the vertical sync (vsync) signal of the graphics controller to update graphics memory during the monitor’s vertical blanking interval. This ensures that a new frame is sent to graphics memory only after the previous frame has been displayed, preventing frames from overlapping in display memory and eliminating screen tearing.
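In outline, the buffering side of such a scheme can be sketched as follows. This is a minimal, illustrative example and not A&B Software’s implementation; the class name, buffer rotation, and locking strategy are all assumptions. A capture thread always owns one buffer to write into, a second buffer holds the most recently completed frame, and the display thread picks up that frame once per vertical blanking interval.

// Minimal triple-buffering sketch (illustrative only, not A&B Software's code).
// The capture thread always has a free buffer to write into, while the display
// thread presents the most recently completed frame during vertical blanking.
#include <algorithm>
#include <array>
#include <cstddef>
#include <cstdint>
#include <mutex>
#include <vector>

class TripleBuffer {
public:
    explicit TripleBuffer(std::size_t frameBytes) {
        for (auto& b : buffers_) b.resize(frameBytes);
    }

    // Called by the capture thread with each new camera frame.
    void writeFrame(const std::uint8_t* data, std::size_t bytes) {
        std::copy(data, data + bytes, buffers_[writeIndex_].begin());
        std::lock_guard<std::mutex> lock(mutex_);
        std::swap(writeIndex_, pendingIndex_);        // publish the finished frame
        hasNewFrame_ = true;
    }

    // Called by the display thread once per vertical blanking interval.
    // Returns the buffer to copy into graphics memory.
    const std::vector<std::uint8_t>& latestFrame() {
        std::lock_guard<std::mutex> lock(mutex_);
        if (hasNewFrame_) {
            std::swap(pendingIndex_, displayIndex_);  // take the newest complete frame
            hasNewFrame_ = false;
        }
        return buffers_[displayIndex_];
    }

private:
    std::array<std::vector<std::uint8_t>, 3> buffers_;
    std::size_t writeIndex_ = 0, pendingIndex_ = 1, displayIndex_ = 2;
    bool hasNewFrame_ = false;
    std::mutex mutex_;
};

Because the capture thread and the display thread never touch the same buffer at the same time, the copy into graphics memory always sees a complete frame, which is what prevents tearing once that copy is performed inside the blanking interval.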
“To accomplish this on Windows-based systems,” says Boris Nalibotski, president of A&B Software (New London, CT, USA; www.ab-soft.com), “a typical software application would use the DirectX SDK to constantly poll the state of the vsync signal and control when each image is displayed. Using this method, however, can result in significant CPU overhead, which becomes a problem if the software is also performing image processing on the captured image data.”
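Nalibotski does not name the specific DirectX call, but one common way to do such continuous polling is with the Direct3D 9 GetRasterStatus() call, whose InVBlank flag reports whether the raster is inside the blanking interval. The sketch below is illustrative only; the device is assumed to have been created elsewhere.

// Illustrative busy-polling approach using Direct3D 9 (link with d3d9.lib).
// The loop spins on GetRasterStatus() until the raster enters vertical blank.
#include <d3d9.h>

// 'device' is assumed to be an already-created IDirect3DDevice9*.
void waitForVBlankBusyPoll(IDirect3DDevice9* device)
{
    D3DRASTER_STATUS status = {};
    // Make sure any current blanking interval has ended...
    do {
        device->GetRasterStatus(0, &status);
    } while (status.InVBlank);
    // ...then spin until the next one begins.
    do {
        device->GetRasterStatus(0, &status);
    } while (!status.InVBlank);
    // The new frame can now be copied to graphics memory and presented.
}

Because the loop spins for most of each refresh period (about 16.7 ms at 60 Hz), it can keep an entire CPU core busy, which is the overhead Nalibotski refers to.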
To reduce this overhead, A&B Software has devised a more efficient method. “Since the frequency of the monitor and the frame rate of the camera can be determined by an application,” he says, “the vsync polling loop need not run continuously and only needs to be called shortly before each vertical blanking signal.” Because Windows scheduling is nondeterministic, the application cannot simply sleep until the exact moment of the blanking interval; instead it wakes shortly before the expected interval and polls only during that brief window, which results in significantly lower CPU overhead.
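A sketch of this reduced-overhead approach is shown below. The 2-ms wake-up margin and the use of Sleep() are assumptions chosen for illustration; A&B Software’s actual timing logic is not published.

// Sketch of the reduced-overhead variant: sleep through most of the refresh
// period, then poll only during a short window before the expected vblank.
// The 2-ms safety margin is an assumed value meant to absorb scheduling jitter.
#include <windows.h>
#include <d3d9.h>

void waitForVBlankLowOverhead(IDirect3DDevice9* device, double refreshHz)
{
    const DWORD periodMs = static_cast<DWORD>(1000.0 / refreshHz); // ~16 ms at 60 Hz
    const DWORD marginMs = 2;                                      // assumed wake-up margin

    // Sleep through most of the refresh period; the OS scheduler is not
    // precise, which is exactly why a short polling window is still needed.
    Sleep(periodMs > marginMs ? periodMs - marginMs : 0);

    // Poll only inside the short remaining window.
    D3DRASTER_STATUS status = {};
    do {
        device->GetRasterStatus(0, &status);
    } while (!status.InVBlank);
}

In practice the default Sleep() granularity on Windows is coarse (on the order of 15.6 ms), so a real implementation would typically raise the timer resolution (for example with timeBeginPeriod(1)) or use a waitable timer to make such a small wake-up margin meaningful.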
Although this method eliminates the tearing artifact, it introduces another display anomaly called syncopation, or “stuttering.” Syncopation is caused by the need to drop or duplicate a video frame in order to maintain display synchronization, and it becomes especially obvious when the camera frame rate is close to the display refresh rate.
Even the slightest discrepancy between the incoming frame rate and the display refresh rate will eventually result in dropped or duplicated frames. If, for example, a 60-Hz monitor were used with a 60.2-Hz camera, the system would display only 60 frames/s, dropping approximately one frame every five seconds.
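The interval between dropped (or duplicated) frames follows directly from the difference between the two rates; the following back-of-the-envelope check reproduces the figure quoted above.

// Roughly one frame is dropped (camera faster) or duplicated (camera slower)
// every 1 / |f_camera - f_monitor| seconds.
#include <cmath>
#include <cstdio>

int main()
{
    const double cameraHz  = 60.2;
    const double monitorHz = 60.0;
    const double interval  = 1.0 / std::fabs(cameraHz - monitorHz);
    std::printf("One frame dropped roughly every %.1f s\n", interval); // prints 5.0 s
    return 0;
}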
“In many machine-vision applications not involving visual analysis by an operator,” says Nalibotski, “this is not significant. However, in applications where images simply need to be displayed and monitored by the user, it can present a problem. For many customers who are used to the smoothness of analog video, an uneven display rate from digital cameras is unacceptable.”
One of A&B’s clients, North Star Imaging (Rogers, MN, USA; www.4nsi.com), recently deployed a 1k × 1k, 60-frames/s camera from Prosilica (Burnaby, BC, Canada; www.prosilica.com) and A&B’s ImageWarp software in a quality-control system. “Because the camera and the monitor were not properly synchronized,” says Brian Ruether, vice president of NSI, “the smooth motion of a product on the display would appear to stutter approximately every 5–6 s.”
To overcome this problem, A&B Software has devised a technique to genlock the camera and the monitor by using the monitor’s vertical blanking signal as an input trigger to the camera. Although this locks the camera’s frame rate to the monitor’s refresh rate, slightly constraining how fast the camera can run, it eliminates the instability of the display.
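On the camera side, this amounts to switching the camera into externally triggered mode and routing the monitor’s vertical blanking signal to its trigger input. The sketch below is purely conceptual: the setFeature() wrapper is hypothetical, and while the feature names follow the GenICam SFNC convention used by GigE Vision cameras, the exact names, input line, and polarity depend on the camera model.

// Conceptual camera-side configuration for hardware triggering.
// Camera::setFeature() is a hypothetical stand-in for a vendor SDK call;
// the monitor's vertical blanking signal is assumed to be wired to the
// camera's external trigger input, so each exposure starts once per refresh.
#include <cstdio>
#include <string>

struct Camera {
    // Placeholder implementation that simply echoes the setting.
    void setFeature(const std::string& name, const std::string& value) {
        std::printf("%s = %s\n", name.c_str(), value.c_str());
    }
};

void configureMonitorGenlock(Camera& camera)
{
    camera.setFeature("TriggerSelector",   "FrameStart"); // trigger each frame
    camera.setFeature("TriggerMode",       "On");         // switch to triggered operation
    camera.setFeature("TriggerSource",     "Line1");      // external input carrying the vblank signal
    camera.setFeature("TriggerActivation", "RisingEdge"); // fire on the blanking edge
}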
This idea can also be extended to higher-speed cameras by deriving their trigger from a fraction of the monitor’s vertical blanking rate. The patent-pending monitor-synchronization method is now available in all A&B Software products, including the ImageWarp image-analysis software as well as the ActiveDcam and ActiveGigE SDKs.