As camera resolutions and speeds increase, newer scalable high-speed interfaces such as HSLINK will allow image data to be transferred to host PCs at speeds up to 6 Gbytes/s. For machine-vision systems to process image data at these rates, software vendors are tailoring their products to support FPGA-, multicore-, and GPU-based processors. Partitioning images spatially and/or temporally across these processors can then increase system throughput.
Systems have been developed that employ multicore or multiprocessor architectures distributed across networked PCs. When deploying such systems, developers may need to interface single or multiple cameras with disparate interfaces to host PCs and then partition the captured data across multiple cores. Thus, although software manufacturers may support multiprocessing, developers may find that third-party camera interfaces or software development kits (SDKs) are required to realize these systems.
Software vendors are now offering GigE Vision simulators that can be used to speed the creation of multiprocessor systems while at the same time easing camera interface development. By deploying these simulators, system developers can use any camera/frame grabber combination to capture images in different formats to a host PC. The host PC running the simulator then appears on the network as a GigE Vision- and GenICam-compliant camera. Because of this, image data can be streamed over the network to other multicore or GPU-enabled PCs for temporal or spatial multiprocessing.
At VISION 2008 in Stuttgart, Stemmer Imaging (Puchheim, Germany; www.stemmer-imaging.de) debuted its GigE Vision Server, part of the company’s Common Vision Blox (CVB) software (see “Imaging system combines Camera Link and GigE Vision,” Vision Systems Design, February 2009).
Images of a plasma ball captured by a FireWire camera were processed using a host PC to clean the images, remove reflections, and extract red and blue features (top). This image was then streamed to another PC over the network and further processed to complete the blob analysis, perform measurements, and generate reports and charts in real time (bottom).
To demonstrate the capability of its GigE Vision Server, Stemmer used a 12k × 1-pixel P3-80-12k40 linescan camera from DALSA (Waterloo, ON, Canada; www.dalsa.com) to image banknotes on a rotating scanner. These images were captured at 650-Mbyte/s Full Camera Link rates using a DALSA X64 Xcelera-CL PX4 Full PCI Express-based frame grabber, divided into blocks of 4k pixels × 20 lines, and distributed to three separate client PCs through three Intel PRO/1000 GigE network interface cards.
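The spatial split Stemmer demonstrated — carving each 12k-pixel-wide line-scan acquisition into three 4k-wide strips delivered in 20-line blocks — can be sketched in a few lines. This is a NumPy illustration, not Stemmer's code; the function name is invented, and only the block dimensions come from the demonstration described above:

```python
import numpy as np

def partition_lines(frame, num_clients=3, lines_per_block=20):
    """Split a line-scan frame (lines x pixels) into per-client strips.

    Each client receives a (width / num_clients)-wide column strip,
    delivered in blocks of `lines_per_block` lines.
    """
    h, w = frame.shape
    strip_w = w // num_clients          # e.g. 12288 / 3 = 4096 pixels
    blocks = {c: [] for c in range(num_clients)}
    for row in range(0, h, lines_per_block):
        for c in range(num_clients):
            block = frame[row:row + lines_per_block,
                          c * strip_w:(c + 1) * strip_w]
            blocks[c].append(block)     # in practice: sent over GigE
    return blocks

# 100 lines accumulated from a simulated 12k-pixel line-scan camera
frame = np.zeros((100, 12288), dtype=np.uint8)
blocks = partition_lines(frame)
print(len(blocks[0]), blocks[0][0].shape)   # 5 blocks of 20 x 4096 each
```

Because each client works on an independent column strip, the three PCs can process their shares of every acquisition in parallel.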
Recognizing the advantages of the approach, A&B Software (New London, CT, USA; www.ab-soft.com) has developed a standalone package known as GigESim, which can be used with any third-party imaging software that supports the GigE Vision/GenICam standard.
According to Boris Nalibotski, general partner of A&B, the $500 software package consists of an integrated simulator, converter, and server/SDK. Like Stemmer’s GigE Vision Server, A&B’s GigESim can convert cameras with different interfaces such as Camera Link, FireWire, and USB into virtual GigE Vision cameras. GigESim can also be used as a universal GigE Vision camera simulator that can inject artificial error conditions related to network latency and the lack of determinism in the UDP protocol.
This makes it an ideal testing tool for GigE Vision application developers. The SDK—part of the GigESim software package—allows developers to present images generated by third-party applications as a simulated GigE Vision camera and stream them using a network interface card (NIC).
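The two ideas behind such a simulator — presenting generated images as packetized UDP streams and deliberately injecting loss and jitter to exercise a receiver's error handling — can be illustrated with a deliberately simplified sketch. This uses plain Python sockets and is not A&B's SDK or the real GVSP packet format; every name and parameter here is invented for illustration:

```python
import random
import socket
import struct
import time

def stream_frame(sock, addr, frame_bytes, packet_size=1400,
                 drop_prob=0.0, max_delay=0.0):
    """Stream one frame as sequence-numbered UDP packets (GVSP-like,
    heavily simplified).

    drop_prob and max_delay inject the kind of packet loss and jitter
    a simulator can use to test a receiver's resend/recovery logic.
    """
    for seq, off in enumerate(range(0, len(frame_bytes), packet_size)):
        if random.random() < drop_prob:
            continue                      # simulate a lost UDP datagram
        if max_delay:
            time.sleep(random.uniform(0, max_delay))  # simulate jitter
        header = struct.pack("!I", seq)   # 32-bit sequence number
        sock.sendto(header + frame_bytes[off:off + packet_size], addr)

# Throwaway local receiver so the demo is self-contained
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
stream_frame(sender, recv.getsockname(), b"\x00" * 8192, drop_prob=0.1)
```

A receiver that tracks the sequence numbers can detect the gaps such a simulator introduces, which is exactly the behavior a GigE Vision application developer needs to verify.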
At The Vision Show held in Boston in May, for example, the company showed a FireWire-based camera from IMI Technology (Encinitas, CA, USA; www.imi-tech.com) pointed at a plasma ball and interfaced to a host PC running A&B’s ImageWarp software. A series of image-processing functions was applied at frame rate to clean the images, remove reflections, and extract red and blue features. Using an integrated GigESim module, ImageWarp streamed the resulting video to the network as if it were coming from a GigE Vision camera. Another copy of ImageWarp running on a second PC received the video from the virtual camera, completed the blob analysis, performed measurements, and generated reports and charts in real time (see figure).
“The advantage of this approach,” says Nalibotski, “is that images generated by GigESim can be received by any GigE Vision-compliant software without adaptation.” In a typical configuration, the NIC could be interfaced to a Gigabit Ethernet switch to distribute image data to multiple PCs in the multicast mode. To increase data transfer throughput, several NICs could be teamed into one logical NIC using the link aggregation technique. This both simplifies distributed parallel and temporal processing and allows multiple multicore-based processors to be located at distances up to 100 m from image-capture systems.
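The multicast configuration Nalibotski describes — one stream leaving the server's NIC, replicated by the switch to every subscribed processing PC — looks roughly like this with plain Python sockets. The group address and port are arbitrary illustrative values, and loopback is used as the interface so the sketch is self-contained:

```python
import socket

GROUP, PORT = "239.1.1.1", 5004          # illustrative multicast group

def open_multicast_sender(iface="127.0.0.1"):
    """One sender socket; the switch replicates packets to all subscribers."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Route multicast out a specific interface (loopback for this demo)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_IF,
                    socket.inet_aton(iface))
    return sock

def open_multicast_receiver(iface="127.0.0.1"):
    """Each processing PC joins the group to receive the same image stream."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = socket.inet_aton(GROUP) + socket.inet_aton(iface)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

receiver = open_multicast_receiver()
receiver.settimeout(2.0)
sender = open_multicast_sender()
sender.sendto(b"image-block", (GROUP, PORT))
data, _ = receiver.recvfrom(2048)        # every joined receiver gets a copy
```

In a real deployment each client PC would run its own receiver on the switched network, so a single transmission from the image-capture system feeds all of the parallel processors at once.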