METROLOGY: Japanese companies display ingenuity with vision offerings at NIWeek

Oct. 1, 2012

Every August, National Instruments (NI) holds NIWeek, its annual conference and exhibition, to demonstrate the latest products and applications based on the company's graphical system design approach and NI LabVIEW system design software. On the show floor this year, the organizers highlighted some of the latest vision products being developed in Japan. These included a range of rugged programmable smart cameras, a test system to measure the brightness and chromaticity of LEDs, and a system to allow operators to visually locate the origin of a sound source.

For integrators wishing to deploy rugged, explosion-proof cameras in their systems, ORIENT BRAINS (www.orientbrains.com) showed three cameras capable of daylight, near-infrared (NIR), and IR imaging (see Fig. 1). ORIENT BRAINS has chosen to integrate off-the-shelf camera modules from Basler -- for the visible and NIR versions -- and the A35 microbolometer-based camera from FLIR, embedded in rugged housings.

FIGURE 1. ORIENT BRAINS' Next-Eye programmable smart cameras use an embedded processor from NI to perform vision tasks. The camera is available in commercial (left), explosion-proof (middle), and IR versions (right).

Known as Next-Eye programmable smart cameras, these devices also embed a custom-built, Atom-based compact vision board built by NI. Running real-time NI LabVIEW-based vision programs, the board is capable of processing images, triggering events through digital I/O, and outputting results over an Ethernet interface. Designed for security and rugged industrial applications, these smart cameras are now in operation at the plants of Kansai Electric Power Company (www.kepco.co.jp), where they are being used to monitor the cooling pool for unused nuclear fuel.
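The camera's on-board program is written in LabVIEW, but the acquire-process-trigger-report pattern it follows can be sketched in a few lines of Python with OpenCV. In the sketch below, the camera index, the mean-intensity check, the UDP host and port, and the set_digital_out() helper are all illustrative placeholders rather than documented ORIENT BRAINS or NI interfaces.

```python
# Minimal sketch of a smart-camera loop: grab a frame, run a simple vision
# check, raise a digital-output trigger, and publish the result over Ethernet.
import socket
import cv2

def set_digital_out(line, state):
    """Placeholder for the camera's digital I/O call (hardware-specific)."""
    print(f"DIO line {line} -> {state}")

cap = cv2.VideoCapture(0)                      # stand-in for the camera module
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Toy "inspection": flag the frame if its mean intensity drifts too far.
    mean_level = float(gray.mean())
    alarm = mean_level < 40.0 or mean_level > 220.0
    set_digital_out(line=0, state=alarm)       # trigger an event via digital I/O
    # Publish the result over Ethernet (UDP used here purely for illustration).
    sock.sendto(f"mean={mean_level:.1f} alarm={alarm}".encode(),
                ("192.168.0.100", 5005))
```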

Display-panel testing was the focus of the Matsuura Denkosha (http://denkosha.net) demonstration, in which company president Takahiro Matsuura showed how a GigE-based camera system can be used to calibrate the LED panels found in flat-panel displays.

Many display systems comprise smaller 1 × 1-m panels with up to 4096 × 4096 white LEDs. To ensure that each LED produces the correct brightness and color, its drive must be accurately controlled. In the system shown at NIWeek, a small flat-panel array was first digitized using a GigE-based camera. Images from the camera were then fed to an NI PXIe-8133 controller. In the same chassis, an NI PXI-7813R digital I/O board was interfaced to the flat panel over an I2C interface, which was used to control the drive current and pulse width of each LED in the panel, and hence its brightness and color.
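The panel-side protocol is not detailed in the demonstration, but the drive-side idea can be sketched as follows: each LED's output is set by writing a drive-current code and a pulse-width code to the panel over I2C. The device address, register layout, and i2c_write() helper below are purely hypothetical stand-ins for the bit-banged I2C traffic generated by the PXI-7813R.

```python
def i2c_write(device_addr, register, payload):
    """Hypothetical stand-in for the PXI-7813R's bit-banged I2C write."""
    print(f"I2C dev 0x{device_addr:02X}, reg {register}: {payload}")

def set_led_drive(row, col, current_code, pulse_width_code):
    # Assumed mapping: one register per LED (flattened row-major index),
    # carrying an 8-bit current code and an 8-bit pulse-width code.
    register = row * 4096 + col
    i2c_write(device_addr=0x40, register=register,
              payload=[current_code & 0xFF, pulse_width_code & 0xFF])

# Example: reduce the drive of LED (10, 20) to dim it and shift its color point.
set_led_drive(row=10, col=20, current_code=180, pulse_width_code=200)
```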

After images are digitized by the camera and transferred to the host controller, a program written using NI-IMAQ Vision software isolates a region of interest around each LED and computes its chromaticity. Based on the white point each manufacturer requires for the panel, brightness and chromaticity adjustments are then computed for every LED and transferred over the system's I2C interface, resulting in a uniformly bright, color-corrected LED panel.
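The measurement step can be illustrated with standard colorimetry: average the pixels in each LED's region of interest, convert to CIE XYZ, and compare the resulting xy chromaticity and luminance against the target white point. The sketch below uses NumPy rather than NI-IMAQ Vision, and the target values and gain heuristic are assumptions for illustration only.

```python
import numpy as np

# Standard sRGB (linear) to CIE XYZ conversion matrix.
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

def roi_chromaticity(roi_rgb):
    """roi_rgb: H x W x 3 array of linear RGB values in [0, 1]."""
    mean_rgb = roi_rgb.reshape(-1, 3).mean(axis=0)
    X, Y, Z = SRGB_TO_XYZ @ mean_rgb
    s = X + Y + Z
    return (X / s, Y / s), Y            # (x, y) chromaticity and luminance Y

target_xy, target_Y = (0.3127, 0.3290), 0.85   # e.g. a D65-like target white point

roi = np.random.rand(16, 16, 3)                # stand-in for one LED's ROI
(x, y), Y = roi_chromaticity(roi)
luma_gain = target_Y / Y                       # naive brightness correction
print(f"measured xy=({x:.4f},{y:.4f}) Y={Y:.3f}, apply gain {luma_gain:.2f}")
```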

According to Takahiro Matsuura, the system is already being used by Samik Electronic (www.samikdisplay.co.kr) and Sharp Electronics (http://sharp-world.com) in the production of LED display panels.

A different kind of smart testing was shown at the booth of E.I.SOL Co. Ltd. (www.ei-sol.co.jp), where general manager Kei Hirasawa demonstrated a system that combines sound and vision. Such systems are useful when the source of a sound needs to be accurately pinpointed within an image. In testing print engines for laser printers, for example, it may be necessary to automatically determine where any anomalous sounds originate.

The company demonstrated a system that used a 6 × 6 array of low-cost microphones interfaced to an NI 9234 data acquisition (DAQ) module designed for sound and vibration applications. Positioned at the center of the array, a low-cost USB 2.0 camera transferred images at 25 frames/sec to the processor and field-programmable gate array (FPGA) in the NI cRIO-9076 integrated controller and chassis, which also holds the NI 9234 DAQ module. A notch filter programmed in the FPGA isolated the sound frequencies of interest, and a sound map was created and overlaid onto the visual image (see Fig. 2). The overlay was then transferred to a host PC over NI's MXI Express PC interface, allowing specific sounds to be localized within the image.
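The FPGA implementation is not described beyond the notch filter, but the underlying acoustic-camera idea can be sketched as frequency-domain delay-and-sum beamforming: isolate one frequency of interest, steer the 6 × 6 array across a grid of candidate directions, and map the beamformer output power for overlay on the camera image. The array pitch, frequency, and direction grid in the sketch below are assumptions.

```python
import numpy as np

C = 343.0      # speed of sound (m/s)
F0 = 2000.0    # frequency of interest isolated by the filter (assumed)

# Assumed geometry: 6 x 6 microphone array on a 5-cm pitch, centred on the camera.
pitch = 0.05
mx, my = np.meshgrid(np.arange(6) - 2.5, np.arange(6) - 2.5)
mic_xy = np.stack([mx.ravel(), my.ravel()], axis=1) * pitch      # (36, 2)

def sound_map(snapshot, n_grid=32, extent=0.5):
    """snapshot: complex amplitude of each of the 36 mics at F0 (one FFT bin)."""
    k = 2 * np.pi * F0 / C
    grid = np.linspace(-extent, extent, n_grid)    # candidate direction cosines
    power = np.zeros((n_grid, n_grid))
    for j, uy in enumerate(grid):
        for i, ux in enumerate(grid):
            # Expected phase at each mic for a plane wave from (ux, uy).
            steer = np.exp(1j * k * (mic_xy[:, 0] * ux + mic_xy[:, 1] * uy))
            power[j, i] = np.abs(np.vdot(steer, snapshot)) ** 2   # delay-and-sum
    return power / power.max()

# Synthetic check: a tone arriving from direction cosines (0.2, -0.1) should
# produce a peak near that point in the map, which would then be colour-mapped
# and overlaid on the 25-frame/s camera image.
k0 = 2 * np.pi * F0 / C
snapshot = np.exp(1j * k0 * (mic_xy[:, 0] * 0.2 + mic_xy[:, 1] * -0.1))
m = sound_map(snapshot)
print("peak at grid index", np.unravel_index(m.argmax(), m.shape))
```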

FIGURE 2. Naoya Gomi and Kamalina Srikant of NI demonstrated how E.I.SOL Co. Ltd.'s sound visualization system can localize sound within an image.
