Researchers to develop intelligent wheelchair

May 24, 2004
MAY 24--Computer scientists at the University of Essex (Colchester, England; www.essex.ac.uk) have been awarded a grant to develop an intelligent robotic wheelchair. Researchers from the department of computer science will work alongside scientists from the Institute of Automation (Beijing, China) to develop the advanced technology needed for a RoboChair that will enable the elderly and disabled to gain increased mobility and live more independently.

The RoboChair will have a man-machine interface and the capability to navigate autonomously, avoiding collisions and planning its own path. It will be equipped with a vision system and a 3G wireless communication system so that caregivers or relatives can monitor the chair and communicate with its user remotely when necessary.
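Collision-free path planning of the kind described is commonly done by searching an occupancy grid of free and blocked cells. The sketch below is illustrative only, not the RoboChair's actual algorithm; it uses breadth-first search, which finds a shortest path on a uniform-cost grid.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over a 2-D occupancy grid.

    grid: list of lists, 0 = free cell, 1 = obstacle.
    start, goal: (row, col) tuples.
    Returns a list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # maps each visited cell to its predecessor
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (r + dr, c + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0
                    and nxt not in came_from):
                came_from[nxt] = cell
                frontier.append(nxt)
    return None  # goal unreachable
```

A real wheelchair planner would run over a grid built from sonar or vision data and replan as obstacles move, but the search structure is the same.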

Professor Huosheng Hu will lead Essex's human-centered robotics team in developing algorithms for sensor fusion, map-building, intelligent decision-making, and tele-operation through the Internet using 3G mobile phones. Professor Kui Yuan of the Institute of Automation will develop prototype hardware and control software, including servo drivers, DSP-based control systems, sensor systems, and motion-control algorithms.
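Sensor fusion, one of the algorithm areas named above, typically combines a smooth but drifting sensor with a noisy but drift-free one. A minimal illustrative sketch (the sensor choice, alpha value, and class name are assumptions, not details from the project) is a complementary filter fusing a rate gyro with a compass:

```python
class HeadingFilter:
    """Minimal complementary filter fusing a rate gyro with a compass.

    The gyro is smooth but drifts over time; the compass is noisy but
    drift-free. Blending the two yields a stable heading estimate.
    """

    def __init__(self, alpha=0.98, heading=0.0):
        self.alpha = alpha      # weight given to the integrated gyro
        self.heading = heading  # estimated heading in degrees

    def update(self, gyro_rate, compass_heading, dt):
        # Integrate the gyro's angular rate (deg/s) over the timestep,
        # then pull the estimate toward the absolute compass reading.
        predicted = self.heading + gyro_rate * dt
        self.heading = (self.alpha * predicted
                        + (1.0 - self.alpha) * compass_heading)
        return self.heading
```

Production systems often use a Kalman filter for the same job, but the complementary filter shows the core idea in a few lines and runs comfortably on a DSP.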

Professor Hu explained why a RoboChair will be beneficial in today's society: "Although traditional wheelchairs are widely used by the elderly and disabled, they have rather limited functions and flexibility. Support from relatives and caregivers is often required, but this can be impractical as the involvement of relatives is becoming more difficult and the cost of running care and health services is very high. Technology has now reached a stage where we can develop solutions that give the elderly and disabled the mobility they need, both to stay at home and to go out independently, with monitoring and services provided from remote sites.

"We will focus on the development of two levels of complexity. One is a DSP-based control system that is used to achieve good control stability, image-processing capability, and autonomous navigation performance. The other is based on pervasive computing technology and is used to implement an interactive user interface, with features such as voice control, emotion and gesture detection, and remote wireless communication with relatives and caregivers."
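The lower level Professor Hu describes, a DSP-based loop maintaining control stability, is classically built around a PID controller. The sketch below is a generic illustration of that kind of loop; the gains, names, and plant model are assumptions, not details of the RoboChair hardware.

```python
class PID:
    """Minimal discrete PID controller of the sort a DSP motion-control
    loop might run for wheel-speed regulation (illustrative only)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, setpoint, measured, dt):
        # Standard PID law: output = kp*e + ki*integral(e) + kd*de/dt.
        error = setpoint - measured
        self.integral += error * dt
        derivative = (0.0 if self.prev_error is None
                      else (error - self.prev_error) / dt)
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

On a real chair this loop would run at a fixed rate on the DSP, reading wheel encoders and driving the servo amplifiers mentioned above.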
