When developing wind turbine blades, manufacturers such as GE Energy use sophisticated CAD software to both design and predict how the blades will behave under various conditions. After designs are simulated, full-scale prototypes must then be built and tested.
“Because of the complexity of the materials used and the designs,” says Carlos Jorquera, CEO of Boulder Imaging (Louisville, CO, USA; www.boulderimaging.com), “software alone cannot predict exactly how the turbine blades will perform when subjected to various forces exerted by the wind.”
Jorquera and his colleagues have developed a machine-vision system known as the Quazar Vision Inspector that is capable of real-time analysis of turbine blade motion. Using the system, designers are better equipped to understand how blade oscillations affect the performance of the wind turbine.
To image the 50-m blade as it rotates, a 2352 × 1728-pixel Falcon 4M60 CMOS Camera Link camera from DALSA (Waterloo, ON, Canada; www.dalsa.com) was positioned close to the main rotor shaft to look down the length of the rotating blade.
Blob detection algorithms were used to detect known markers on a wind turbine blade as it rotates. By measuring these markers and comparing them to a known reference model of the blade at rest, deflection and twist can be automatically determined.
“Because of the size of the wind turbine,” says Jorquera, “it was necessary to transmit images from the 60-frame/s Falcon camera over a fiberoptic interface to a remote host PC.” To accomplish this, Camera Link signals from the camera were first encoded using the RCX C-Link, a Camera Link-to-fiber adapter from EDT (Beaverton, OR, USA; www.edt.com). At the host PC, the signals were then reformatted as Camera Link signals using a second RCX adapter and captured into host memory using an X64 Xcelera-CL PX4 dual frame grabber from DALSA.
Before any measurements could be made, the system had to be calibrated. A series of black-and-white markers was placed at strategic positions along the turbine blade. An image of these markers, captured with the blade at rest in the downward position so that no load is placed on it, serves as the reference. Any deviation from this reference image can then be used to measure the twist and deflection of the blade as it rotates.
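The calibration step described above amounts to storing the rest-position marker centroids and later differencing observed centroids against them. The following is a minimal sketch of that idea; the function names and data layout are assumptions for illustration, not Boulder Imaging's actual code.

```python
import numpy as np

def build_reference_model(marker_centroids):
    """Store the rest-position (x, y) centroid of each marker, imaged
    with the blade hanging unloaded in the downward position. These
    positions form the reference model for all later measurements."""
    return np.asarray(marker_centroids, dtype=float)  # shape (n_markers, 2)

def deviation_from_reference(ref, observed):
    """Per-marker (dx, dy) displacement, in pixels, of the observed
    centroids relative to the rest-position reference model."""
    return np.asarray(observed, dtype=float) - ref

# Usage: three markers along the blade, at rest and then under load.
ref = build_reference_model([(100, 50), (300, 52), (500, 55)])
dev = deviation_from_reference(ref, [(101, 60), (303, 65), (506, 72)])
```

Converting these pixel displacements into millimeters of deflection would additionally require the camera calibration (pixel scale at each marker's distance), which the article does not detail.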
Consistent illumination was required to obtain quality images as the turbine rotates. “Because natural light varies greatly through the day, day to day, and over seasons,” says Jorquera, “camera gain and exposure settings were constantly and consistently adjusted in real time, allowing measurable images to be captured over varying lighting conditions.”
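One simple way to realize the real-time adjustment Jorquera describes is a proportional control loop that nudges exposure toward a target mean image brightness each frame. The target value, gain constant, and limits below are assumptions for illustration; the vendor's actual algorithm is not published.

```python
import numpy as np

TARGET_MEAN = 128.0  # assumed target: mid-gray average intensity (8-bit)
KP = 0.05            # assumed proportional gain for the control loop

def adjust_exposure(frame, exposure_us, min_us=50, max_us=20000):
    """Proportionally nudge the exposure time (microseconds) so the
    frame's mean brightness drifts toward TARGET_MEAN, clamped to the
    camera's valid exposure range. A sketch, not DALSA's actual API."""
    error = TARGET_MEAN - frame.mean()
    new_exposure = exposure_us * (1.0 + KP * error / TARGET_MEAN)
    return float(np.clip(new_exposure, min_us, max_us))
```

Run per frame, such a loop lengthens exposure as clouds dim the scene and shortens it in direct sun; the same structure applies to camera gain.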
To determine the amount of deflection and twist occurring as the turbine moves the blades, individual images were first captured and stored in host PC memory. Using blob detection, the position of each marker along different regions of the blade was then determined (see figure). Any markers lost to the effects of bright sunlight were discarded and the remaining points fitted to the calibrated image model. Deviations from this calibrated model were then plotted at different positions along the turbine blade. The resulting data show the deflection in both the x and y directions along the blade and any twisting in those directions. The data provided engineers at GE Wind with measurements of blade flex to an accuracy of 1 mm and of twist to 1/10°.
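Given a pair of markers across the blade chord (e.g. near the leading and trailing edges), deflection can be taken as the pair's mean displacement and twist as the change in the angle of the line joining them. The sketch below illustrates that geometry; the pairing scheme and chord length are hypothetical, and the article does not describe Boulder Imaging's actual fitting method.

```python
import math

def deflection_and_twist(ref_pairs, obs_pairs, chord_px):
    """For each chordwise marker pair (leading, trailing), return
    (deflection, twist): deflection is the pair's mean vertical
    displacement in pixels; twist is the change, in degrees, of the
    angle of the line joining the pair, for a chord of chord_px pixels."""
    results = []
    for (rl, rt), (ol, ot) in zip(ref_pairs, obs_pairs):
        # Deflection: average y-displacement of the two markers.
        defl = ((ol[1] - rl[1]) + (ot[1] - rt[1])) / 2.0
        # Twist: difference between observed and reference pair angles.
        ref_ang = math.atan2(rt[1] - rl[1], chord_px)
        obs_ang = math.atan2(ot[1] - ol[1], chord_px)
        twist = math.degrees(obs_ang - ref_ang)
        results.append((defl, twist))
    return results
```

A pure translation of a pair yields deflection with zero twist, while unequal displacement of the two markers registers as twist; repeating this along the blade produces the per-position plots the article describes.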
Precisely measuring the kinematics of the wind turbine blade and how it correlates to wind speed, direction, energy production, and efficiency provided the feedback needed to develop optimally designed wind turbine blades. According to Jorquera, GE Wind engineers measured unexpected results immediately, which allowed them to make adjustments before wind turbine designs were approved.