IMAGE CAPTURE: Smart spheres target consumer, surveillance applications

Developers of consumer products such as mobile telephones and portable gaming devices have been quick to use the latest low-cost CMOS imagers, wireless transmitters, and accelerometers in their products. Today, this technology is also being used in a number of smart spheres that embed cameras in ball-sized devices that, while in flight, can capture images at custom time intervals or on impact with a surface.

Notable products include the Flee, a camera conceived by Turkish designer Hakan Bogazpinar that combines a digital camera and a Bluetooth receiver. Appearing somewhat like a shuttlecock, the ball can be programmed to take images at custom time intervals and transmit them to a mobile telephone.

Similar cameras, such as the Satugo camera conceived by Danish designers Eschel Jacobsen and Mads Ny Larsen, use an internal timer to trigger the ball’s camera in flight. Alternatively, the camera can be triggered upon impact with a surface.

“In the past, such technologies targeted consumer applications,” says Steven Hollinger, president of S.H. Pierce and Co. (Boston, MA, USA). “Everything has changed with the advent of microchip accelerometers, gyros, and GPS modules capable of being packaged in an impact-resistant housing.” Hollinger has filed for patent protection and is developing a smarter sphere that packs more functionality and capability into his product (see figure).

Conceptual rendering of the proposed smart sphere patented by S.H. Pierce and Co. By incorporating a CMOS imager and a GPS, the ball could capture a sequence of images at specific points in its trajectory, such as those available at its apogee.

Like other ball-based camera systems, Hollinger’s first product will include a CMOS imager to capture images as the ball moves through its trajectory. This, however, is where the similarity to other commercially available products ends.

“Commercially available products must capture images from a random orientation since the position of the ball in space is not known,” he says. To overcome this, Hollinger plans to incorporate a trajectory trigger within the ball that senses its in-flight position, acceleration, and rotational velocity.

“This trajectory trigger,” says Hollinger, “will be implemented using a combination of accelerometers and GPS technology to determine the ball’s precise location and orientation while in flight.”

By incorporating a GPS, the ball could track its own absolute position relative to the Earth, make point-to-point calculations, and extrapolate valuable in-flight information. Using the information, for example, a ball in flight would be able to capture a sequence of images at specific points in its trajectory, such as those available from its apogee.
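As a minimal sketch of the apogee-trigger idea, the ball could fire its camera where its GPS-derived vertical velocity changes sign from positive to negative. The sampling layout and function below are hypothetical illustrations, not details from Hollinger's design.

```python
def apogee_frames(samples):
    """Return indices of GPS samples at the apogee of a trajectory,
    i.e. where vertical velocity flips from ascent to descent.
    samples: list of (time_s, altitude_m) pairs (hypothetical format)."""
    triggers = []
    for i in range(1, len(samples) - 1):
        t0, h0 = samples[i - 1]
        t1, h1 = samples[i]
        t2, h2 = samples[i + 1]
        v_before = (h1 - h0) / (t1 - t0)  # climb rate entering sample i
        v_after = (h2 - h1) / (t2 - t1)   # climb rate leaving sample i
        if v_before >= 0 and v_after < 0:  # ascent turns into descent
            triggers.append(i)
    return triggers

# Synthetic ballistic arc: h(t) = v0*t - g*t^2/2, sampled every 0.1 s
g, v0 = 9.81, 20.0
samples = [(i * 0.1, v0 * (i * 0.1) - 0.5 * g * (i * 0.1) ** 2)
           for i in range(41)]
idx = apogee_frames(samples)  # sample nearest the peak of the arc
```

In practice a real trigger would fuse accelerometer data with GPS to smooth out receiver noise, but the sign-change test captures the core idea.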

In addition to processing location and orientation data, the trajectory trigger’s image-processing capabilities enable the ball to make real-time decisions based on the content of images captured while in flight—information which could be used to maintain focus on a subject of interest, a known topographical feature, or a recognizable pattern.

“By culling and compiling a series of discrete frames taken exclusively from a desired perspective,” says Hollinger, “the operator could be presented with a continuous video flyover of a ground-based subject irrespective of the ball camera’s orientation in space.”
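The culling step Hollinger describes could work by keeping only frames captured while the camera axis pointed near the ground-based subject. The data layout below (timestamped unit vectors in a world frame) is an assumption for illustration, not the patented method.

```python
import math

def cull_frames(frames, max_angle_deg=15.0):
    """Keep timestamps of frames shot while the camera axis pointed
    within max_angle_deg of straight down, so a spinning ball yields a
    steady ground-facing sequence. Each frame is (timestamp, axis),
    where axis is a unit vector in world coordinates (hypothetical)."""
    down = (0.0, 0.0, -1.0)
    cos_limit = math.cos(math.radians(max_angle_deg))
    kept = []
    for t, axis in frames:
        # dot product with "down" gives the cosine of the off-axis angle
        dot = sum(a * d for a, d in zip(axis, down))
        if dot >= cos_limit:
            kept.append(t)
    return kept

frames = [(0, (0.0, 0.0, -1.0)),    # pointing straight down: keep
          (1, (1.0, 0.0, 0.0)),     # pointing at the horizon: cull
          (2, (0.1, 0.0, -0.995))]  # a few degrees off nadir: keep
steady = cull_frames(frames)
```

Compiling the kept frames in timestamp order would then approximate the "continuous video flyover" regardless of the ball's rotation.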

While developing the product, Hollinger realized that balls traveling at higher velocities offered greater potential for reconnaissance applications, which in turn created the need to dynamically control and change the trajectory of the smart sphere while in flight.


The product will incorporate a means for deforming the exterior surface of the ball from within, thereby altering its aerodynamic profile to change trajectory. Because the ball is in free rotation, an on-board exterior surface-deformation trajectory changer, accessing information from the GPS, will control a series of mechanical actuators to create a ripple of changes from point to point along the exterior surface.

“Just as a golf ball’s dimples reduce drag by creating a thin layer of turbulence, dimple contours can be rippled according to logic that considers the ball’s position, rotation, and other in-flight parameters. These dimples act as microspoilers, providing a means to change direction and lift,” Hollinger says.
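Because the ball spins freely, a dimple that should stay fixed relative to the ground must ripple backward through the ring of actuators as the ball rotates. The sketch below illustrates that phase calculation under assumed parameters (a single ring of actuators and a known spin rate); the actual control logic in Hollinger's design is not described in the article.

```python
def active_actuator(n_actuators, spin_rate_hz, t, target_bearing_deg):
    """Select which of n_actuators (evenly spaced on a ring) to deform
    at time t so the dimple appears stationary at target_bearing_deg in
    the world frame despite the ball's spin. Hypothetical sketch."""
    # how far the ball has rotated since t = 0, in degrees
    ball_angle = (360.0 * spin_rate_hz * t) % 360.0
    # actuator angle, in the ball's own frame, that currently faces the
    # desired world bearing
    local = (target_bearing_deg - ball_angle) % 360.0
    return round(local / (360.0 / n_actuators)) % n_actuators

# 8 actuators, 1 revolution per second, dimple held at world bearing 0°
a0 = active_actuator(8, 1.0, 0.0, 0.0)   # at t=0, actuator 0 faces 0°
a1 = active_actuator(8, 1.0, 0.25, 0.0)  # a quarter-turn later, a different one
```

Stepping the active actuator this way produces the "ripple of changes from point to point along the exterior surface" that the trajectory changer would coordinate with GPS data.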

Although Hollinger’s first prototype may not include a trajectory control module, the applications for a device with intelligent reconnaissance capabilities extend well beyond the consumer market. If produced at a low enough cost, the devices could be used by the military for surveillance, providing troops with multiple views across battlefield terrain.

