The wearable device, called "CU", pairs an Intel RealSense 3D depth camera with algorithms that translate visual information into haptic and audio feedback. The camera is the RealSense D415, which features a rolling-shutter sensor with a maximum resolution of 1920 x 1080, a 1.4 µm pixel size, and frame rates of up to 60 fps over its USB 3.0 interface.
In addition to the camera, the wearable features bone conduction speakers for audio feedback. The device is controlled by a processing hub that includes a processing unit, a GPS sensor, and a GSM module for LTE connectivity. According to FRAMOS, the exact location and movement of the vibration on the arm informs the visually impaired user about the position and distance of objects in the surroundings. The voice-controlled glasses connect to a haptic feedback wristband via Bluetooth, where a microprocessing unit translates the incoming data into haptic feedback through a 2D array of vibration motors. The device is powered by two rechargeable batteries, one on the glasses and one in the wristband, which enable a full day of use.
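FRAMOS does not disclose how the CU maps depth data onto the wristband's motor array, but the idea of converting a depth frame into a 2D grid of vibration intensities can be sketched as follows. This is a minimal illustrative example, not the actual CU firmware: the function name, grid size, and range cutoff are assumptions, and closer obstacles simply produce stronger vibration in the corresponding cell.

```python
import numpy as np

def depth_to_vibration(depth_mm, rows=4, cols=4, max_range_mm=4000):
    """Downsample a depth frame (millimetres, 0 = no data) to a
    rows x cols grid of vibration intensities in [0, 1], where the
    nearest valid pixel in each cell sets that cell's intensity."""
    h, w = depth_mm.shape
    grid = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            # Slice out the image region covered by this motor cell.
            cell = depth_mm[r * h // rows:(r + 1) * h // rows,
                            c * w // cols:(c + 1) * w // cols]
            valid = cell[cell > 0]  # drop pixels with no depth reading
            if valid.size:
                nearest = valid.min()
                # Nearer obstacle -> intensity closer to 1.0;
                # anything at or beyond max_range_mm -> 0.0.
                grid[r, c] = 1.0 - min(nearest, max_range_mm) / max_range_mm
    return grid
```

For example, a frame with an obstacle half a metre away in the upper-left quadrant would drive the upper-left motors far harder than the rest of the array; a real device would additionally smooth the output over time to avoid jitter.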
Additionally, FRAMOS notes that the CU glasses come with a "Smart Assistant" that provides facial, text, and object recognition. This new way of sensing, according to FRAMOS, enables "visually impaired people to fully understand their environment and to have advanced guidance for safe navigation."
Dr. Christopher Scheubel, FRAMOS Business Development, commented:
"We are proud having found a way to bring state-of-art technology into an application, which provides a huge impact on the daily life of the visually impaired," he said. "This project hits the sense of innovation by really supporting humans and improving their lives. The exceptional beauty of this technology is the ability to provide visual information normally given by the human eye. Our technology creates a new way of sensing."