SERVICE ROBOTS: Smart car lets the blind drive

Oct. 1, 2010
Virginia Tech’s Blind Driver Challenge team received the 2010 NI Application of the Year Award at NIWeek for developing the world’s first vehicle that can be safely and independently operated by a blind driver.

At a dinner reception held at NIWeek 2010 in August, David Wilson, director of academic marketing with National Instruments (NI; Austin, TX, USA; www.ni.com), presented the company’s Graphical System Design Achievement Awards. Designed to highlight the use of the company’s hardware and software by researchers, academics, and engineers worldwide, this year’s Application of the Year Award was presented to Dr. Dennis Hong and the undergraduate students of the Blind Driver Challenge team at Virginia Tech (Blacksburg, VA, USA; www.vt.edu) for their work in developing the world’s first vehicle capable of being safely and independently operated by a blind driver (see figure).

Researchers led by Dennis Hong at Virginia Tech have developed the world’s first and only automobile that can be driven by a blind person.

After accepting a challenge from the National Federation of the Blind, undergraduate students at the university’s Robotics and Mechanisms Laboratory (RoMeLa) developed the system in just two semesters with $3000 in seed funding.

“Several challenges needed to be met when developing the system,” says Hong. “These included allowing the driver to navigate a driving course, regulate the speed of the vehicle, and avoid obstacles on the road.”

In the design of the system, various sensory data are captured, processed, and presented to the operator through a number of novel nonvisual driver interfaces controlled by NI CompactRIO (cRIO) hardware and LabVIEW FPGA software. Deployed aboard a modified dune buggy, the system uses a UTM-30LX single-plane laser rangefinder (LRF) from Hokuyo (Osaka, Japan; www.hokuyo-aut.jp) to scan the environment for the cones that define the course and for obstacles on the road. Data from the LRF are then transferred over a USB interface and processed on the cRIO’s real-time controller.
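The article does not describe the scan-processing step in detail, but the core idea of picking out cone-sized objects from a single-plane LRF sweep can be sketched briefly. The Python fragment below is a minimal illustration, assuming the scan arrives as (angle, range) pairs; the team’s actual processing runs in LabVIEW on the cRIO, and every threshold here is a placeholder.

```python
import math

def detect_cones(scan, max_range=10.0, gap=0.3, max_width=0.5):
    """Group adjacent laser returns into clusters and keep the small,
    cone-sized ones. `scan` is a list of (angle_rad, range_m) tuples
    from one sweep of a single-plane rangefinder."""
    # Convert polar readings to Cartesian points, ignoring invalid or far returns.
    points = [(r * math.cos(a), r * math.sin(a))
              for a, r in scan if 0.1 < r < max_range]

    # Split the ordered points into clusters wherever a large gap appears.
    clusters, current = [], []
    for p in points:
        if current and math.dist(current[-1], p) > gap:
            clusters.append(current)
            current = []
        current.append(p)
    if current:
        clusters.append(current)

    # Keep only clusters narrow enough to be traffic cones; return centroids.
    cones = []
    for c in clusters:
        if math.dist(c[0], c[-1]) < max_width:
            cx = sum(x for x, _ in c) / len(c)
            cy = sum(y for _, y in c) / len(c)
            cones.append((cx, cy))
    return cones
```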

“Using the cRIO allowed us to connect to the wide variety of sensors and actuators in our system,” says Hong, “as well as leverage the benefits of the on-board FPGA to provide the high-speed acquisition, processing, and control necessary to relay time-critical information to the driver.” Incorporating a laptop also allowed a sighted passenger to monitor the operation of the on-board hardware and software and perform on-the-fly adjustments to the embedded design throughout the development process.

Additional sensors such as a Hall effect sensor and a string potentiometer were used to monitor the vehicle’s speed and steering angle, respectively. Data from these sensors were captured by various I/O modules connected to the cRIO: specifically, an NI 9401 bidirectional digital I/O module for the Hall effect sensor and an NI 9221 analog input module for the string potentiometer. All of the data are acquired and processed using the FPGA on the cRIO-9072.
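As a rough illustration of how those raw signals become engineering units, the sketch below converts a Hall-effect pulse count into a speed estimate and a string-potentiometer voltage into a steering angle by linear interpolation. The calibration constants are invented for the example, not values from the Virginia Tech vehicle, where this conversion runs on the cRIO’s FPGA.

```python
def speed_from_hall(pulse_count, dt, pulses_per_rev=4, wheel_circumference_m=1.8):
    """Estimate vehicle speed in m/s from Hall-effect pulses counted over
    an interval dt (seconds). The pulses-per-revolution and wheel
    circumference are illustrative, not the team's actual calibration."""
    revolutions = pulse_count / pulses_per_rev
    return revolutions * wheel_circumference_m / dt

def steering_angle_from_pot(voltage, v_min=0.5, v_max=4.5,
                            angle_min=-35.0, angle_max=35.0):
    """Map the string potentiometer's analog voltage to a steering angle
    in degrees by interpolating between calibrated endpoints."""
    ratio = (voltage - v_min) / (v_max - v_min)
    return angle_min + ratio * (angle_max - angle_min)
```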

Once these data were collected, the next challenge was to communicate the information to the driver through an array of nonvisual cues.

To allow each driver to properly regulate the speed of the vehicle, Hong and his student team developed a vibro-tactile vest attached to the seatbelt of the vehicle. Speed data from the Hall effect sensor were used to drive an array of motor-actuated vibrators embedded in the vest at varying intensities through an NI 9485 eight-channel relay module in the cRIO chassis. Should the vehicle detect an unavoidable collision with an obstacle, the vest cues the driver to stop the vehicle immediately.
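The article does not spell out how speed maps to vibration, but one simple scheme, consistent with an on/off relay module, is to switch on more of the vest’s motors the further the vehicle drifts from its target speed. The sketch below is an assumption about that mapping, not the team’s actual logic.

```python
def vest_pattern(speed, target_speed, channels=8, deadband=0.5):
    """Return an on/off state for each relay channel driving the vest's
    vibration motors. The mapping is illustrative: the larger the speed
    error, the more motors switch on; the sign of the error picks which
    end of the vest vibrates."""
    error = speed - target_speed
    if abs(error) < deadband:
        return [False] * channels            # close enough: no vibration
    n_active = max(1, min(channels, round(abs(error))))
    pattern = [i < n_active for i in range(channels)]
    if error < 0:
        pattern.reverse()                    # e.g. lower motors mean "speed up"
    return pattern
```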

For steering guidance, a potential field algorithm was developed that uses data from the rangefinder to assist with path planning. After calculating a path, the system suggests to the driver where to steer to stay in the lane and avoid obstacles. A mechanism attached to the steering column clicks every 5° to provide precise audible feedback to the driver. Using LabVIEW text-to-speech software, the driver is then instructed over a pair of headphones how far to turn the steering wheel. Steering instructions are also conveyed through another interface called DriveGrip, which uses a pair of gloves with small vibrating motors on each finger to cue the driver how far to turn the wheel.
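The article does not give the algorithm itself, but a textbook potential-field step attracts the vehicle toward a goal point ahead, repels it from detected obstacles, and turns the resulting force vector into a heading suggestion. The sketch below also snaps that suggestion to the 5° increments used for the audible clicks; the gains, influence radius, and goal point are all illustrative assumptions.

```python
import math

def suggest_steering(cones, goal=(5.0, 0.0), k_att=1.0, k_rep=0.5, influence=2.0):
    """Combine an attractive force toward `goal` (vehicle frame: x forward,
    y left, in metres) with repulsive forces from each detected cone, and
    return a suggested heading change in degrees, quantized to 5-degree steps."""
    fx, fy = k_att * goal[0], k_att * goal[1]        # attraction toward the goal

    for cx, cy in cones:                             # repulsion from nearby cones
        d = math.hypot(cx, cy)
        if 0.01 < d < influence:
            scale = k_rep * (1.0 / d - 1.0 / influence) / (d * d)
            fx -= scale * cx / d
            fy -= scale * cy / d

    heading = math.degrees(math.atan2(fy, fx))       # direction of net force
    return 5 * round(heading / 5)                    # snap to 5-degree clicks
```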

To allow the driver to explore the environment and make informed driving decisions, Hong and his team developed an interface that places the processed rangefinder data directly in the driver’s hands through a physical mapping device called AirPix. This interface comprises a grid of orifices that use compressed air to generate a physical representation of the vehicle’s surroundings from the laser rangefinder data. An array of solenoids opens and closes in response to digital signals from the NI 9401 module to quickly update the map based on the most recent information from the sensor.
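One way to picture the AirPix mapping is as a quantization of the rangefinder points onto a coarse grid of on/off cells, each cell corresponding to one solenoid-driven orifice. The sketch below is a guess at that step; the grid dimensions and resolution are illustrative, not the device’s actual geometry.

```python
def airpix_grid(points, rows=8, cols=8, cell_m=0.5):
    """Quantize obstacle points (vehicle frame, metres: x forward, y left)
    onto a rows x cols grid of booleans. True means the corresponding
    solenoid opens so the driver feels a puff of air at that cell."""
    grid = [[False] * cols for _ in range(rows)]
    for x, y in points:
        row = int(x / cell_m)                 # distance ahead of the vehicle
        col = int(y / cell_m) + cols // 2     # lateral offset, centred on the grid
        if 0 <= row < rows and 0 <= col < cols:
            grid[row][col] = True
    return grid
```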

In the months following its development, the system has provided more than 30 blind and visually impaired people the opportunity to drive a vehicle. “Whether it was their first time behind the wheel or a long-awaited reunion with an automobile, their reactions were overwhelmingly positive and filled with hope,” says Hong.
