NASA’s JPL researchers operate a robotic arm with Kinect 2 and Oculus Rift

Dec. 31, 2013

As part of Microsoft Kinect 2’s Windows developer program, NASA’s Jet Propulsion Laboratory has paired the motion sensor with an Oculus Rift virtual reality headset to control a robotic arm, a sign that may lead to the technologies being used in future space missions.


This past summer we reported on some initial reactions to Inition’s Vertigo Simulator, which paired an Oculus Rift with a Kinect to create a “frighteningly real” virtual reality environment. While the system was touted as a gaming revolution, we speculated at the time that the intersection of virtual reality and vision systems would eventually find its way into other non-industrial vision applications, including robot vision. More than six months later, JPL has paired the two technologies to manipulate a robotic arm in a test environment.

By now, most readers are likely familiar with the Kinect and the multitude of novel uses it has spawned. As quick background, the Oculus Rift is a headset with a 7-in. screen (24 bits per pixel) that mimics natural human vision by letting the user’s left eye see extra area to the left and the right eye see extra area to the right. Its field of view (FOV) of more than 90° horizontal and 110° diagonal is more than double that of similar devices, creating a sense of immersion in gameplay or simulation.
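To make the FOV figure concrete, the short sketch below shows the arithmetic behind a binocular field of view built from two overlapping per-eye views; the per-eye FOV and overlap values are assumptions chosen purely for illustration, not published Rift optics data.

```python
def binocular_fov(per_eye_fov_deg, overlap_deg):
    """Combined horizontal FOV when each eye's view overlaps the other's
    in the centre by overlap_deg and extends outward on its own side."""
    return 2 * per_eye_fov_deg - overlap_deg

# Assumed, illustrative numbers -- not the Rift's actual optical parameters.
print(binocular_fov(per_eye_fov_deg=75, overlap_deg=60))  # 90 (degrees)
```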

JPL is intimately familiar with the Kinect, having participated in the initial developer program. According to JPL Human Interface Engineer Alex Menzies, the new Kinect 2 offers more accuracy and a more immersive experience than the first; Menzies told Engadget that the combined technologies are nothing short of revolutionary.
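As a rough sketch of how skeletal-tracking data of this kind might drive a robot arm (an illustrative assumption, not JPL’s actual control software), the snippet below converts Kinect-style 3D joint positions for a shoulder, elbow, and wrist into a single commanded elbow angle. The joint coordinates are placeholder values standing in for live tracker output.

```python
import math

# Placeholder 3D joint positions (metres, camera coordinates) standing in for
# the per-joint skeletal data a Kinect-style tracker reports. Real sensor
# output would replace these values.
shoulder = (0.00, 0.40, 2.00)
elbow    = (0.25, 0.30, 1.90)
wrist    = (0.45, 0.35, 1.75)

def vec(a, b):
    """Vector pointing from point a to point b."""
    return tuple(bi - ai for ai, bi in zip(a, b))

def angle_deg(u, v):
    """Angle in degrees between two 3D vectors."""
    dot = sum(ui * vi for ui, vi in zip(u, v))
    nu = math.sqrt(sum(c * c for c in u))
    nv = math.sqrt(sum(c * c for c in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# Interior elbow angle: 180 deg when the arm is straight, smaller as it bends.
upper_arm = vec(shoulder, elbow)
forearm = vec(elbow, wrist)
commanded_elbow_deg = 180.0 - angle_deg(upper_arm, forearm)

print(f"Commanded robot elbow angle: {commanded_elbow_deg:.1f} deg")
```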


About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
