NASA’s JPL researchers operate a robotic arm with Kinect 2 and Oculus Rift
As part of Microsoft Kinect 2’s Windows developer program, NASA’s Jet Propulsion Laboratory has paired the motion sensor with an Oculus Rift virtual reality headset to control a robotic arm, a sign that may lead to the technologies being used in future space missions.
This past summer we reported on initial reactions to Inition’s Vertigo Simulator, which paired an Oculus Rift with a Kinect to create a “frighteningly real” virtual reality environment. While that system was touted as a gaming revolution, we speculated at the time that the intersection of virtual reality and vision systems would eventually find its way into other applications, including robot vision. More than six months later, JPL has paired the two technologies to manipulate a robotic arm in a test environment.
By now, most of you are likely familiar with the Kinect and the multitude of novel uses it has spawned. The Oculus Rift, in brief, is a headset with a 7” screen at 24 bits per pixel that mimics normal human vision: the user’s left eye sees extra area to the left, and the right eye sees extra area to the right. Its field of view (FOV), more than 90° horizontal and 110° diagonal, is more than double that of similar devices, creating a sense of immersion in gameplay or simulation.
JPL is intimately familiar with the Kinect, having participated in the initial developer program, but the new Kinect 2 offers more accuracy and a more immersive experience than the first, according to JPL Human Interface Engineer Alex Menzies, who told Engadget that the combined technologies are nothing short of revolutionary.
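JPL has not published how its demo maps tracked hand motion onto the arm, but any such system needs some mapping from a sensed hand position to joint angles. As a purely illustrative sketch (not JPL's method), the snippet below solves inverse kinematics for a hypothetical two-link planar arm: given a target point such as a tracked wrist position projected into the arm's plane, it returns shoulder and elbow angles, with forward kinematics to verify the result.

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Inverse kinematics for a planar two-link arm (elbow-down solution).

    (x, y) is the target end-effector position, e.g. a tracked hand
    position projected into the arm's plane; l1 and l2 are link lengths.
    Returns (shoulder, elbow) joint angles in radians.
    """
    r2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))  # clamp for numerical safety
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to target minus the offset the elbow introduces.
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow

def forward(shoulder, elbow, l1=1.0, l2=1.0):
    """Forward kinematics: end-effector position from the two joint angles."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

In a real teleoperation loop, the target point would be refreshed each frame from the skeleton stream, and the resulting angles smoothed before being sent to the arm's motor controllers; those details, like the arm geometry above, are assumptions for the sake of illustration.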