Beyond Camera Link

Oct. 1, 2007
Looking forward to a new camera/frame-grabber interface standard

By Andrew Wilson, Editor

Since its introduction seven years ago, the Camera Link standard has been endorsed by many camera and frame-grabber vendors. By replacing the custom camera-to-frame-grabber interfaces that preceded it with a standard means of interconnection, Camera Link has eased the burden on system integrators and reduced the cost of cabling. At the same time, the use of standard off-the-shelf transceivers has allowed hardware developers to offer modular products in Base, Medium, and Full Camera Link configurations with scalable bandwidths of up to 680 Mbytes/s. In 2000, this was faster than the data rate of any commercial camera then available.
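
As a quick sanity check on those figures, the short sketch below (Python, purely illustrative) derives each configuration's peak rate from its tap count and the 85-MHz maximum pixel clock defined by the standard.

```python
# Back-of-envelope check of the Camera Link bandwidth figures quoted above:
# each configuration carries a fixed number of 8-bit taps per pixel clock,
# and the pixel clock tops out at 85 MHz.

PIXEL_CLOCK_MHZ = 85          # maximum Camera Link pixel clock
TAPS_PER_CONFIG = {           # 8-bit taps carried per clock cycle
    "Base": 3,                # 24 data bits
    "Medium": 6,              # 48 data bits
    "Full": 8,                # 64 data bits
}

for config, taps in TAPS_PER_CONFIG.items():
    mbytes_per_s = PIXEL_CLOCK_MHZ * taps    # MHz x bytes/clock = Mbytes/s
    print(f"{config:6s}: {mbytes_per_s} Mbytes/s")

# Base  : 255 Mbytes/s
# Medium: 510 Mbytes/s
# Full  : 680 Mbytes/s
```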

Today, Camera Link dominates applications requiring high-speed and/or high-resolution image acquisition. With the ability to handle imaging data rates up to 680 Mbytes/s, Camera Link-compatible cameras and frame grabbers are the only standardized solution that can maintain data rates greater than the 125 Mbytes/s provided by other standards such as GigE Vision, FireWire, and USB.

Despite the standard's numerous advantages, user demands for higher data rates, longer camera-to-computer cables, and an integrated power interface have led developers to modify and improve it. For example, when Basler Vision Components introduced its A504 series of cameras in 2004, the company used a Micron MV13 CMOS sensor, a 1280 × 1024 imager running at 500 frames/s.

To run the camera over the Full configuration of Camera Link, Basler redefined pins such as TX26 to carry video data instead of the standard's data valid (DVAL) signal. By doing so, the company could still use two Camera Link connectors to offer a pseudo-Full implementation of the standard (see Vision Systems Design, June 2004, p. 10).
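
A rough estimate of the A504's raw output shows why it pressed against the ceiling of the standard Full configuration. The sketch below assumes 8 bits per pixel, which is an assumption for illustration rather than a figure from the article.

```python
# Approximate raw data rate of the Basler A504's Micron MV13 sensor,
# assuming 8-bit pixels (an assumption for illustration).

width, height = 1280, 1024     # sensor resolution
frames_per_s = 500             # quoted frame rate
bytes_per_pixel = 1            # assumed 8-bit output

mbytes_per_s = width * height * frames_per_s * bytes_per_pixel / 1e6
print(f"Raw sensor output: about {mbytes_per_s:.0f} Mbytes/s")  # ~655 Mbytes/s
print("Camera Link Full ceiling: 680 Mbytes/s")
```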

FASTER CAMERAS

Faced with the large number of multitap CCD and CMOS imagers emerging on the market, developers are finding the Camera Link standard's maximum bandwidth of 680 Mbytes/s inadequate. Indeed, just last year, Cypress developed a custom multitap CMOS imager for the holographic data-retrieval market. Because the imager's 1696 × 1710-pixel array runs at a few hundred frames per second, the company used LVDS outputs to obtain the required bandwidth (see Vision Systems Design, March 2007, p. 15).

Limited bandwidth is not the only technical limitation of the current Camera Link standard. At its highest data rate, the standard is recommended only for camera-to-frame-grabber cable runs of up to 10 m. To extend this reach, a number of companies build both copper-cable and fiber repeaters that claim to stretch this distance to as much as 10 km (see Vision Systems Design, June 2006, p. 44). Although expensive, these fiberoptic-based repeaters are especially useful in military and harsh industrial environments, where cables must be shielded from the effects of electromagnetic interference.


Although extending the camera-to-computer connection through repeaters is fairly straightforward, the Camera Link development committee also faced demands, mainly from Japanese systems developers, for cameras powered over the same cable. In the initial Camera Link specification, external power was supplied to each camera through a separate interface. To further reduce cabling and systems-integration costs, the Camera Link committee developed the Power over Camera Link (PoCL) standard.

In the original 26-pin Camera Link design, pins 1, 13, 14, and 26 were assigned as ground. To maintain backward compatibility with this connector, PoCL reassigns pins 1 and 26 as power lines that deliver up to 333 mA at 12 V, or 400 mA at the lowest allowable 10 V, according to Steve Kinney of JAI. Of course, this limited power budget suits only a small class of cameras, and only a few vendors have endorsed the PoCL standard thus far (see Vision Systems Design, Feb. 2006, p. 23).
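
Both of the quoted current limits work out to roughly the same 4-W envelope; the sketch below (Python, using only the figures in the quote) makes that arithmetic explicit.

```python
# The PoCL current limits quoted above correspond to roughly a 4-W budget
# delivered over the repurposed pins, regardless of supply voltage.

POWER_BUDGET_W = 4.0                       # approximate per-cable power budget

for volts in (12.0, 10.0):                 # nominal and lowest allowable supply
    max_current_ma = POWER_BUDGET_W / volts * 1000
    print(f"{volts:.0f} V supply -> about {max_current_ma:.0f} mA available")

# 12 V supply -> about 333 mA available
# 10 V supply -> about 400 mA available
```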

INCREASED BANDWIDTH

The need to detect smaller objects over larger fields of view and to acquire images at faster rates has driven market demand for higher-resolution and higher-frame-rate systems. In fact, today’s sensors are already capable of generating image data at rates in excess of 1.3 Gbytes/s, with the trend continuing to push image data rates to beyond 5 Gbytes/s. Current system interconnects like Camera Link are becoming bottlenecks between these high-performance sensors and the PC.

In other application areas, such as PC-to-server communication, high-data-rate interface technologies, including 10GigE, PCI Express, and InfiniBand, are already in use. While a camera-to-frame-grabber interface standard could be built on top of these existing technologies, limitations such as the lack of a real-time trigger, excessive host-CPU use for image depacketization, suboptimal bandwidth utilization, heavyweight protocols, and a lack of determinism make adapting them to machine-vision applications difficult, if not impossible.

Given the vast amounts of incoming sensor data, it is critical for the machine-vision industry to create its own higher-speed camera-to-frame-grabber interface standard, one that addresses the unique requirements of real-time, high-speed, high-data-rate machine-vision applications.

While the Full configuration of Camera Link is limited to a bandwidth of 680 Mbytes/s, higher-throughput protocols would need to offer configurations with peak data rates from 1 Gbyte/s up to 6 Gbytes/s. Future protocols would also need to provide an efficient way to support the camera controls required by the most demanding machine-vision acquisition applications: reliable, low-jitter camera triggering; reliable, low-jitter strobe synchronization; reliable, time-accurate exposure control; and reliable line- and frame-rate control.
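
One way to picture such controls is as time-stamped messages carried in-band on the link. The sketch below is a hypothetical illustration in Python; none of the message names, fields, or encodings come from any published specification.

```python
# A hypothetical sketch of deterministic, in-band camera control messages.
# All names, fields, and encodings here are invented for illustration only.
from dataclasses import dataclass
from enum import Enum


class ControlType(Enum):
    TRIGGER = 1        # start a frame acquisition
    STROBE = 2         # synchronize an external strobe
    EXPOSURE = 3       # set exposure duration
    LINE_RATE = 4      # set line rate for line-scan cameras


@dataclass
class ControlPacket:
    control: ControlType
    timestamp_ns: int  # when the action takes effect, on the device clock
    value: int = 0     # e.g., exposure time in microseconds

    def encode(self) -> bytes:
        # A fixed-size, little-endian layout keeps parsing time constant,
        # which helps bound trigger jitter on the receiving side.
        return (self.control.value.to_bytes(1, "little")
                + self.timestamp_ns.to_bytes(8, "little")
                + self.value.to_bytes(4, "little"))


# Schedule a 500-microsecond exposure 1 ms ahead on the device clock.
pkt = ControlPacket(ControlType.EXPOSURE, timestamp_ns=1_000_000, value=500)
print(pkt.encode().hex())
```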

With host-based protocols such as GigE Vision, CPU use increases with data rate. Because of this, acquiring image data without a frame grabber at speeds faster than those offered by Camera Link would render today's CPUs unusable for image processing. Even with a frame grabber managing the protocol overhead, future high-speed data rates are still capable of pushing standard host systems beyond their processing capabilities, requiring support for offloading technologies.
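
A back-of-envelope count of the packets a host would have to service makes the scaling problem concrete; the packet size below is an assumption for illustration, not a figure from the article.

```python
# Rough packet rates a host-managed protocol forces the CPU to service,
# assuming jumbo frames of 9000 bytes (an illustrative assumption).

def packets_per_second(data_rate_mbytes_s: float, mtu_bytes: int) -> float:
    return data_rate_mbytes_s * 1e6 / mtu_bytes

for rate in (125, 680, 1000, 6000):   # GigE, Camera Link Full, 1 and 6 Gbytes/s
    pps = packets_per_second(rate, mtu_bytes=9000)
    print(f"{rate:5d} Mbytes/s -> roughly {pps/1e3:.0f}k packets/s to depacketize")
```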

WHAT’S NEXT

“I have heard that a new standard will be open and released via a standards group (for example, the Automated Imaging Association or the ISO) once the first revision is finished,” says Reynold Dodson, president of BitFlow. “And BitFlow welcomes any genuine standard that advances the capabilities of the products in our market.”

Dodson has his own ideas about what the next-generation standard should encompass. “The standard should remain simple and the implementation cost low,” he says. “Camera Link is still one of the lowest-cost interconnect standards for high-speed cameras available today, and it should be kept that way.”

So how will the current bandwidth be extended? The option of simply using more transceivers and cables is relatively straightforward, but the number of connectors required limits how small cameras can be made and may force a frame-grabber design to occupy more than one PCI slot. Although this would be a reasonable way to double the current bandwidth, it provides neither the significant bandwidth improvement nor the future scalability needed for a long-term solution.

New high-speed serializer chipsets are available as off-the-shelf silicon that could achieve almost nine times the maximum Camera Link data rate. New implementations could use physical-layer hardware similar to that found in 10 Gigabit Ethernet or InfiniBand products.

According to industry insiders, it may not be possible for the next digital camera-to-computer interface to be compatible with any existing standard. Because of the limitations of available silicon, the new standard may have to stand alone rather than remain backward compatible with Camera Link. Moreover, since reaching 6 Gbytes/s will demand more power and tolerate less noise, support for power delivery over the data cable is highly unlikely.

To keep system integration simple, new connectors and cables should be specified as part of the standard. Should InfiniBand connectivity be adopted, cable distances similar to those in the InfiniBand specification could be expected. Learning from Camera Link, the future standard should provide at least 10 m of copper-cable reach at the maximum data rate, with room for future extensions over optical fiber.

ERROR DETECTION

To ensure reliable data transmission, error detection, with provisions for future forward error correction, is also recommended for any future interface. A layered protocol model for camera-to-frame-grabber communication, inspired by the OSI model, would greatly extend the protocol's life by allowing the physical layer to be updated as new technologies emerge without creating an entirely new protocol. Unlike the existing Camera Link standard, the communications protocol should be an integral part of the new standard to provide broader interoperability between cameras and frame grabbers.
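
To make the error-detection idea concrete, here is a minimal sketch using a CRC-32 appended to each payload; it illustrates the principle only and does not reflect any specific standard's framing.

```python
# Minimal per-packet error detection: append a CRC-32 to each payload and
# verify it on the receiving side. Illustrative only.
import zlib


def frame_packet(payload: bytes) -> bytes:
    crc = zlib.crc32(payload).to_bytes(4, "big")
    return payload + crc


def check_packet(packet: bytes) -> bool:
    payload, crc = packet[:-4], packet[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == crc


pkt = frame_packet(b"\x00" * 4096)       # a dummy 4-KB image payload
assert check_packet(pkt)                 # intact packet verifies
corrupted = b"\x01" + pkt[1:]            # flip a payload byte in transit
assert not check_packet(corrupted)       # ...and the receiver catches it
```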

Plug-and-play support without vendor-specific interface files would be an added benefit to developers. Describing camera capabilities to the frame grabber in an XML file has proven an effective approach in GigE Vision. Mechanisms for establishing communication between a camera and a frame grabber should be an integral part of any new protocol, and a minimum set of functions would need to be defined to provide a simple means of verifying the connection and its quality.
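
A minimal sketch of the XML-description idea follows; the schema here is invented for illustration and is not the actual GigE Vision/GenICam description format.

```python
# A camera publishes its capabilities as XML; the frame grabber (or host
# library) reads them without a vendor-specific interface file. The schema
# below is invented for illustration.
import xml.etree.ElementTree as ET

CAMERA_XML = """
<camera model="ExampleCam">
  <feature name="Width"        type="int"   max="2048"/>
  <feature name="Height"       type="int"   max="1088"/>
  <feature name="PixelFormat"  type="enum"  values="Mono8,Mono10"/>
  <feature name="ExposureTime" type="float" min="10.0" max="100000.0"/>
</camera>
"""

root = ET.fromstring(CAMERA_XML)
print("Discovered:", root.get("model"))
for feature in root.findall("feature"):
    print(" ", feature.get("name"), feature.attrib)
```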

The success of Camera Link and GigE Vision is evidence that the machine-vision industry is evolving and maturing. Even so, emerging requirements from high-speed, high-bandwidth imaging applications are not presently being met. To meet them, a new, autonomous standard can offer camera, frame-grabber, and repeater vendors a basis for new product development.
