October 2015 Snapshots: NASA imaging, computer vision, vision-guided robots

Oct. 5, 2015
In the snapshots section of the October 2015 issue, learn about some of the latest and most innovative developments in imaging, including NASA imaging, computer vision, and vision-guided robots.

NASA's New Horizons spacecraft reaches Pluto, transmits images

Launched in 2006 as part of the New Frontiers program, NASA's (Washington, D.C., USA; www.nasa.gov) New Horizons spacecraft made its closest approach to Pluto, about 7,750 miles above the surface, on July 14, making it the first-ever space mission to explore a world so far from Earth.

The New Horizons mission involves a five-month reconnaissance flyby study of Pluto and its moons. Recent images captured by the spacecraft show Pluto's distinct surface features, including an immense dark surface band known as the "whale." The images, captured on July 13 when the spacecraft was approximately 476,000 miles from the surface, are the most detailed the spacecraft has returned to date.

"Among the structures tentatively identified are what appear to be polygonal features; a complex band of terrain stretching east-northeast across the planet, approximately 1,000 miles long; and a complex region where bright terrains meet the dark terrains of the whale," said Alan Stern, New Horizons principal investigator.

The color image combines low-resolution color information from the spacecraft's Ralph telescope with images captured by the spacecraft's main imager, the Long-Range Reconnaissance Imager (LORRI). Ralph is a telescope with a 2.4in aperture that features a CCD imager with broadband and color channels, as well as a near-infrared (NIR) imaging spectrometer.

LORRI is a panchromatic high-magnification imager consisting of a telescope with an 8.2in aperture that focuses visible light onto a 1024 x 1024-pixel monochromatic, back-illuminated, thinned CCD imager from e2v (Chelmsford, England; www.e2v.com). The instrument's silicon carbide construction keeps its mirrors focused through the extreme temperature changes the spacecraft experiences.
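NASA has not detailed its exact fusion pipeline here, but combining low-resolution color data with a high-resolution panchromatic image is a standard pan-sharpening problem. Below is a minimal Python sketch of one common approach, the Brovey band-ratio transform, with random toy arrays standing in for Ralph and LORRI frames; the function name and array sizes are illustrative only.

```python
import numpy as np

def brovey_pansharpen(rgb_lowres, pan, eps=1e-6):
    """Fuse low-resolution color with high-resolution panchromatic
    data using the Brovey (band-ratio) transform."""
    sy = pan.shape[0] // rgb_lowres.shape[0]   # integer upsampling factors
    sx = pan.shape[1] // rgb_lowres.shape[1]
    # Nearest-neighbor upsample each color band to the pan resolution.
    rgb_up = np.kron(rgb_lowres, np.ones((sy, sx, 1)))
    # Rescale each band so the per-pixel intensity matches the pan image.
    intensity = rgb_up.mean(axis=2, keepdims=True)
    return rgb_up * pan[..., None] / (intensity + eps)

# Toy arrays standing in for a Ralph color frame and a LORRI pan frame.
rgb = np.random.rand(64, 64, 3)            # low-resolution color
pan = np.random.rand(1024, 1024)           # 1024 x 1024 panchromatic
print(brovey_pansharpen(rgb, pan).shape)   # -> (1024, 1024, 3)
```

The band-ratio approach keeps the hue of the color bands while taking fine spatial detail from the panchromatic channel, which is broadly the effect a composite like this aims for.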

The LORRI system was developed by the Johns Hopkins University (Baltimore, MD, USA; www.jhu.edu) Applied Physics Laboratory. Additionally, LORRI will provide images of Kuiper Belt objects.

Computer vision platform taking shape

Funded by the European Commission's Horizon 2020 (H2020) framework, Eyes of Things (EoT) is a computer vision project in which a consortium of eight European partners aims to build a tiny intelligent camera targeted at OEMs. The consortium consists of VISILAB (Project Coordinator; Ciudad Real, Spain; http://visilab.etsii.uclm.es/?page_id=39), Movidius (Dublin, Ireland), Awaiba (Madeira, Portugal; www.awaiba.com), DFKI (Kaiserslautern, Germany; www.dfki.de/web?set_language=en&cl=en), Thales (Nanterre, France; www.thalesgroup.com), Fluxguide (Vienna, Austria; www.fluxguide.com), nViso (Lausanne, Switzerland; www.nviso.ch) and Evercam (Dublin, Ireland; www.evercam.io).

The idea of the project is to fit a camera, processor and WiFi into a pen-drive-sized device. As it stands, the bill of materials for the EoT device is estimated at only $12. In the coming months, new versions that more closely resemble the final design will be released. The EoT device will target surveillance, portable cameras and consumer goods such as toys.

The first public demonstration of the concept was shown at the 9th International Conference on Distributed Smart Cameras, held last month in Seville, Spain. The final device is expected to be completed by September 2016.

Astronaut controls vision-guided robot from space

In September, Danish astronaut Andreas Mogensen of the European Space Agency (ESA; Paris, France; www.esa.int) performed force-feedback-based teleoperation of a vision-guided robotic arm system on Earth from the International Space Station. From nearly 250 miles above Earth, Mogensen took control of the Interact Centaur rover, which features a pair of robotic arms from KUKA (Augsburg, Germany; www.kuka.com). Equipped with force sensors, the arms can flex and adapt in a manner similar to human arms during remote control.

For the Interact Centaur's vision system, a head pan-tilt camera provides a general scene overview while a camera mounted on the right robotic arm is used to aid tool manipulation. Two cameras located on the front and back of the rover enable the operator to view an area otherwise occluded by the chassis.

Mogensen used haptic control, which provided force feedback so that he could sense when the robotic arms encountered resistance. This allowed him to perform dexterous mechanical assembly tasks remotely from space.

"When humans perform precision operations, they rely largely on tactile feedback," says André Schiele, principal investigator of the experiment, head of ESA's Telerobotics and Haptics Laboratory and Associate of the Delft Robotics Institute (Delft, The Netherlands; http://robotics.tudelft.nl/).

"Without using haptic feedback, the operator must be careful not to damage something while the robot is in contact with its environment. As a result, simple tasks can take a very long time. Moreover, the tactile sensation derived from any task contains important information about the geometric relationship of the objects involved and therefore allows tasks to be executed more intuitively and significantly faster," he says.

Signals between the crew and the robot must travel a distance of approximately 56,000 miles via a satellite in geostationary orbit. Despite this distance, Mogensen could feel what the robot felt on the ground, with only a slight lag. During the test, Mogensen attempted to guide the robot to locate an "operations task board," and to remove a metal pin and plug it into the board.
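The quoted distance sets a hard lower bound on that lag from the speed of light alone. A quick Python estimate, ignoring satellite routing and processing overhead (which add further latency):

```python
# Back-of-envelope propagation delay over the ~56,000-mile relay path.
SPEED_OF_LIGHT = 299_792_458             # m/s
path_m = 56_000 * 1_609.344              # miles -> meters
one_way = path_m / SPEED_OF_LIGHT        # seconds, one direction
print(f"one-way: {one_way * 1e3:.0f} ms, "
      f"round trip: {2 * one_way * 1e3:.0f} ms")
# -> one-way: 301 ms, round trip: 601 ms
```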

Humanoid personal robot sells rapidly

Aldebaran (Paris, France; www.aldebaran.com), part of the SoftBank Robotic Holdings Group (Tokyo, Japan; www.softbank.jp/en/corp/group/sbr/), sold 1,000 of its Pepper vision-guided humanoid robots within one minute of the robot's initial launch. Alibaba Group Holding (Hangzhou, China; www.alibabagroup.com) and Foxconn Technology (Taipei, Taiwan; www.foxconn.com) have each invested $118 million for a combined 40% share of the SoftBank Robotic Holdings Group.

Pepper is a humanoid robot designed to interact in four languages: English, French, Japanese and Spanish. The robot's head is equipped with four microphones, two HD RGB cameras (in the mouth and forehead) and a 3D depth sensor behind the eyes. It also has a gyroscope in the torso and touch sensors in the head. Pepper's mobile base has two sonars, six lasers, three bumper sensors and a gyroscope.

Pepper costs $1,600, which does not include a $120 monthly internet connection fee or an $80 monthly maintenance contract.
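For scale, a short calculation of the first-year cost of ownership implied by those published figures:

```python
# First-year cost implied by the published Pepper pricing.
unit_price = 1_600     # USD, one-time
internet = 120         # USD per month
maintenance = 80       # USD per month

first_year = unit_price + 12 * (internet + maintenance)
print(first_year)      # -> 4000
```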
