NASA’s JPL researchers operate a robotic arm with Kinect 2 and Oculus Rift
NASA’s Jet Propulsion Laboratory has paired a Microsoft Kinect 2 with an Oculus Rift virtual reality headset to control a robotic arm, a pairing that could see both technologies used in future space missions.
"We're able for the first time, with [a] consumer-grade sensor, [to] control the entire orientation rotation of a robotic limb. Plus we're able to really immerse someone in the environment so that it feels like an extension of your own body -- you're able to look at the scene from a human-like perspective with full stereo vision," he said. "All the visual input is properly mapped to where your limbs are in the real world."
The current version of the paired technology suffers from latency, since commands to the robot travel over a long time delay. To compensate, the display shows a "ghosted state" indicating where your arm is, while a solid color shows where the robot actually is, so the latency is visible on screen. Victor Luo, another JPL engineer, suggests that these early limitations stem from the fact that the JPL team is using technologies that weren’t designed for space exploration, and that the end goal is not just to control robot arms, but space robots in general.
"We want to integrate this work to eventually extend that to controlling robots like the Robonaut 2," Luo told Engadget. "There are tasks that are too boring, too menial or even too dangerous for an astronaut to do the task, but fundamentally we still want to be in control of the robot ... If we can make it more efficient for us to control them, we can get more done in less time."
View the Engadget article.
Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design