Virtual reality environment enables users to instruct, collaborate with industrial robots

March 10, 2014
A team from Johns Hopkins’ Computational and Interactive Robotics Laboratory (CIRL) has developed an immersive virtual robotics environment (IVRE) that enables users to instruct, collaborate, and interact with a robotic system in simulation or in real-time by using a virtual proxy.

IVRE provides users with a range of virtual tools for manipulating the robot, displaying information, and interacting with the environment. The Oculus Rift-based environment was designed to allow for interaction with remote systems or systems that are either in hazardous surroundings or are unsafe for physical interaction with a human. The system, according to the CIRL, could be helpful in a number of different ways, including:

  • Improved presence: users have an experience similar to performing the tasks in the real world when interacting with the robot.
  • More efficient interaction with information about the robot or task via virtual displays, objects, and tools.
  • Improved safety: virtual reality provides the same "grab and move" programming capability without requiring any physical contact with the robot.

In addition, IVRE enables virtual interaction with objects called "actables," defined as any "user-created representation of geometry" in a scene. Actables can be created, resized, repositioned, and deleted by the user. For example, if a ball is on a nearby table and the user wants the robot to grab that ball, they can create an "actable sphere" to define the ball, allowing the robot to interact with it.
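The actable concept can be sketched as a simple data structure. The class and method names below are illustrative assumptions, not the CIRL implementation:

```python
from dataclasses import dataclass

@dataclass
class Actable:
    """Hypothetical sketch of an IVRE 'actable': a user-created
    representation of geometry that a robot can interact with."""
    name: str
    shape: str                             # e.g. "sphere", "box"
    position: tuple = (0.0, 0.0, 0.0)      # meters, in the scene frame
    scale: float = 1.0

    def reposition(self, x, y, z):
        self.position = (x, y, z)

    def resize(self, factor):
        self.scale *= factor

class Scene:
    """Holds the actables the user has created in the virtual scene."""
    def __init__(self):
        self.actables = {}

    def create(self, actable):
        self.actables[actable.name] = actable

    def delete(self, name):
        self.actables.pop(name, None)

# The user defines an actable sphere over a real ball so the robot
# has geometry to plan a grasp against.
scene = Scene()
ball = Actable(name="ball", shape="sphere",
               position=(0.5, 0.2, 0.9), scale=0.03)
scene.create(ball)
ball.resize(2.0)          # the user stretches the sphere to fit the ball
scene.delete("ball")      # or removes it entirely
```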

Virtual user interfaces for the IVRE also provide a way for the user to interact with the robot, including changing modes, creating or deleting resources, and changing views.
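At its core, such an interface dispatches user actions to the robot according to the current mode. A minimal sketch, with hypothetical mode names that are not taken from the IVRE system itself:

```python
from enum import Enum, auto

class Mode(Enum):
    """Hypothetical interaction modes for a virtual robot UI."""
    SIMULATION = auto()   # commands drive a simulated robot only
    LIVE = auto()         # commands are sent to the real robot

class VirtualUI:
    def __init__(self):
        self.mode = Mode.SIMULATION
        self.log = []

    def set_mode(self, mode):
        self.mode = mode

    def send_command(self, command):
        # Route the command according to the current mode.
        target = "sim" if self.mode is Mode.SIMULATION else "robot"
        self.log.append((target, command))
        return target

ui = VirtualUI()
ui.send_command("open_gripper")   # routed to the simulator
ui.set_mode(Mode.LIVE)
ui.send_command("open_gripper")   # routed to the real robot
```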

View more information on IVRE


Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design


About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
