Astronauts control planetary rover from space

Aug. 5, 2013
While aboard the International Space Station, NASA flight engineers remotely operated the K10 planetary rover hundreds of miles below in California. The concept, which was tested to determine how well an astronaut in an orbiting spacecraft could remotely operate a robot on a planetary surface, could be used in future missions.

The Surface Telerobotics exploration concept test, performed on June 17, was designed to determine how well an astronaut in an orbiting spacecraft could remotely operate a robot on a planetary surface. NASA suggests this approach could be used in future missions.

For more than three hours, Flight Engineer Chris Cassidy controlled NASA’s K10 robot in the Roverscape, an outdoor robotic test area at Moffett Field, Calif. with lunar-like terrain the size of two football fields. A little more than a month later, fellow Expedition 36 Flight Engineer Luca Parmitano of the European Space Agency remotely controlled the rover and began deploying a simulated Kapton film-based radio antenna, marking the first time NASA’s open-source Robot Application Programming Interface Delegate (RAPID) robot data messaging system was used to control a robot from space.
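RAPID is a real NASA system, but the article does not describe its message formats, so the following is only an illustrative sketch of the general idea behind robot data messaging: a command issued aboard a crewed spacecraft is serialized and sent over a space link to a rover. The message fields and command names here are hypothetical, not RAPID’s actual API.

```python
# Hypothetical sketch of a serialized robot-command message; this is NOT
# the real RAPID message schema, which the article does not describe.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class RoverCommand:
    robot_id: str      # target robot, e.g. "K10"
    command: str       # e.g. "DRIVE_TO", "DEPLOY_ANTENNA" (made-up names)
    params: dict       # command-specific arguments
    timestamp: float   # issue time; latency matters over orbital links

# Serialize the command to bytes, as it might cross the space-to-ground link,
# then decode it on the rover side.
cmd = RoverCommand("K10", "DEPLOY_ANTENNA", {"segment": 1}, time.time())
wire_bytes = json.dumps(asdict(cmd)).encode()
decoded = json.loads(wire_bytes)
print(decoded["command"])  # -> DEPLOY_ANTENNA
```

The round trip (serialize, transmit, deserialize) is the essential pattern; the real system layers this over a publish/subscribe middleware rather than raw JSON.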

K10’s inspection camera comprises a stereo pair of Point Grey Scorpion cameras mounted atop a Directed Perception pan-tilt unit. The cameras feature Sony ICX CCD image sensors, 1600 x 1200 pixel resolution, synchronized shutters, and 8-bit or 16-bit digital video data output. A third camera is mounted at the center of the camera rig for taking HDR panoramic photos, according to a research paper on the K10. To resolve millimeter-scale features at distances up to 5 m, the panorama camera uses a 35mm Schneider Optics Xenoplan telecentric lens, which is designed to work with all standard 2/3” CCD cameras.
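The claim that a 35mm lens on a 2/3” sensor resolves millimeter-scale features at 5 m can be sanity-checked with a simple ground-sample-distance calculation. The pixel pitch below is an assumption derived from the 2/3” format width (about 8.8 mm) and the 1600-pixel horizontal resolution; it is not stated in the article.

```python
# Back-of-envelope check of the panorama camera's resolving power:
# ground sample distance (GSD) = pixel_pitch * distance / focal_length.

PIXEL_PITCH_M = 8.8e-3 / 1600   # ~5.5 um; ASSUMED from 2/3" sensor width
FOCAL_LENGTH_M = 35e-3          # 35 mm Schneider Optics Xenoplan lens
DISTANCE_M = 5.0                # maximum stated working distance

gsd_mm = PIXEL_PITCH_M * DISTANCE_M / FOCAL_LENGTH_M * 1000
print(f"Ground sample distance at {DISTANCE_M} m: {gsd_mm:.2f} mm per pixel")
# -> roughly 0.79 mm per pixel
```

A footprint under 1 mm per pixel at 5 m is consistent with resolving millimeter-scale surface features.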

The Surface Telerobotics tests performed this year simulate a possible future mission involving astronauts aboard NASA’s Orion spacecraft traveling to the Earth-moon L2 Lagrange point, a spot roughly 40,000 miles above the far side of the moon where the combined gravity of the Earth and moon allows a spacecraft to maintain a stationary orbit. This idea was developed by the Lunar University Network for Astrophysics Research (LUNAR), based at the University of Colorado, Boulder.
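The roughly 40,000-mile figure can be checked with the standard Hill-sphere approximation for the L2 distance, r ≈ R(m/3M)^(1/3), where R is the mean Earth-moon distance and m/M is the moon-to-Earth mass ratio. The constants below are commonly published values, not figures from the article.

```python
# Rough check of the L2 distance beyond the moon using the Hill-sphere
# approximation r = R * (m / (3 M))**(1/3).

R_KM = 384_400            # mean Earth-moon distance, km
MASS_RATIO = 0.0123       # moon mass / Earth mass

r_km = R_KM * (MASS_RATIO / 3) ** (1 / 3)
r_miles = r_km / 1.60934
print(f"L2 distance beyond the moon: ~{r_km:,.0f} km (~{r_miles:,.0f} miles)")
# -> about 61,500 km, i.e. roughly 38,000 miles
```

The approximation lands within a few percent of the article’s round 40,000-mile figure.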

“Deploying a radio telescope on the farside (sic) of the moon would allow us to make observations of the early universe free from the radio noise of Earth,” said Jack Burns, a professor at CU, director of LUNAR, and co-investigator at NASA's Lunar Science Institute, in a press release. “The Surface Telerobotics test represents a next step in new modes of exploration that will bring together humans and robots, as well as science and exploration. Such telerobotics technology will be needed for exploration of the moon, asteroids and eventually the surface of Mars.”

View the NASA press release.

Also check out:
NASA releases image of Saturn’s view of Earth

Largest camera in the world to create 3D map of Milky Way
Multispectral imaging captures views of Mars

Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design


About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
