Experimental Camera with Omnidirectional and Amphibious Imaging Capabilities

Oct. 13, 2022
The camera system is based on the fiddler crab, which has an amphibious imaging ability and a 360° field of view (FOV).

Researchers in the United States and the Republic of Korea have designed an experimental camera with an omnidirectional imaging ability that can operate in both aquatic and terrestrial environments.

“This approach to artificial vision could be used to develop imaging systems for panoramic motion detection and obstacle avoidance in variable environments,” the researchers wrote in an article they published in Nature Electronics (https://go.nature.com/3rHAHG6). 

For example, the system could be useful for unmanned vehicles that require "novel imaging systems with a small form factor, low power consumption, wide field of view, and no optical distortion," says Gil Ju Lee, PhD, Assistant Professor in the Advanced Photonics and Optoelectronics Laboratory in the Department of Electronics Engineering at Pusan National University (Busan, Republic of Korea; www.pusan.ac.kr/kor/Main.do) and co-first author on the study.

The crab's ellipsoidal eye stalk enables panoramic imaging, while its flat-faced lenses allow amphibious imaging, the researchers say. The experimental camera, which the researchers refer to as an "artificial vision system," does not have the imaging resolution of a commercial industrial camera; they use the term "artificial vision system" to distinguish their experimental device from those other cameras, Lee says.

How It Works

To mimic the fiddler crab’s vision, the researchers developed a camera that comprises an array of flat micro-lenses with a graded refractive index (RI) profile integrated into a flexible comb-shaped silicon photodiode array, which converts photons to electrical current.

The researchers created four of these comb-shaped image sensors with microlens arrays, which they then placed inside the wedged grooves of a spherical structure, with two on the top hemisphere and two on the bottom hemisphere. They created the sphere, which was 2 cm in diameter, using a 3D printer.

Each image sensor array has eight comb-like subunits, and each subunit consists of eight pixels for a total of 128 pixels per hemisphere, or 256 pixels for the entire sphere, according to the journal article.  
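The pixel counts reported above can be checked with a few lines of arithmetic. This is only a sketch of the layout as described in the article; the constant names are my own:

```python
# Sensor geometry as reported: 2 comb-shaped arrays per hemisphere,
# 8 comb-like subunits per array, 8 pixels per subunit.
ARRAYS_PER_HEMISPHERE = 2
SUBUNITS_PER_ARRAY = 8
PIXELS_PER_SUBUNIT = 8

pixels_per_array = SUBUNITS_PER_ARRAY * PIXELS_PER_SUBUNIT          # 64
pixels_per_hemisphere = ARRAYS_PER_HEMISPHERE * pixels_per_array    # 128
pixels_total = 2 * pixels_per_hemisphere                            # 256

print(pixels_per_hemisphere, pixels_total)  # 128 256
```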

The researchers conducted a series of experiments to test the artificial vision system in both air and water. To mimic the fiddler crab's aquatic environment, they suspended the spherical camera in a clear resin container, also created with the 3D printer and filled halfway with water.

To test the imaging abilities of their system, the researchers projected simple, line-drawn images at different angles and distances from the spherical camera, using shadow masks placed in front of light sources with wavelengths of 450, 532, and 635 nm. The illumination sources were MetaBright Thin Backlight LEDs from Metaphase Lighting Technologies (Bristol, PA, USA; www.metaphase-tech.com).

“Since our device can focus the incident light by flat-top/graded RI micro-lens and comb-shaped photodiodes, a commercial object lens and image sensor were not required,” explains Lee.

For an experiment comparing imaging in aquatic versus terrestrial environments, the researchers projected two images: one toward the top (dry) half of the spherical camera and one toward the bottom (wet) half, which sat below the waterline in the container. Both images were 9 mm away from the camera. The top image was a simple line drawing of a square, while the bottom image was a line drawing of a circle.

To test the omnidirectional imaging capability, the researchers projected five line-drawn images (dolphin, plane, boat, fish, and submarine) from angles ranging from -90° to 90° at 30° intervals.

The system demonstrated consistent imaging results in both air and water, without image distortion and with a wide FOV, according to the researchers.

This is a significant improvement over conventional micro-lenses, which lose their ability to focus incoming light when the surrounding medium changes between air and water, Lee says.
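A simple thin-lens calculation illustrates the effect Lee describes. This is a sketch with illustrative values; the lens material, radius, and function name are assumptions, not figures from the article:

```python
# Why a conventional curved micro-lens loses focus underwater: its focal
# length depends on the refractive-index contrast with the surrounding medium.
def focal_length_mm(n_lens, n_medium, radius_mm):
    """Plano-convex thin lens: 1/f = (n_lens / n_medium - 1) / R."""
    return radius_mm / (n_lens / n_medium - 1.0)

N_LENS = 1.5  # a typical polymer lens material (assumed value)
f_air = focal_length_mm(N_LENS, 1.000, radius_mm=1.0)    # ~2.0 mm
f_water = focal_length_mm(N_LENS, 1.333, radius_mm=1.0)  # ~8.0 mm
print(f_air, f_water)  # the focal length roughly quadruples in water
```

The flat micro-lenses with a graded internal refractive index described above instead refract light progressively inside the lens itself, which keeps their focal behavior largely independent of whether the external medium is air or water.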

About the Author

Linda Wilson | Editor in Chief

Linda Wilson joined the team at Vision Systems Design in 2022. She has more than 25 years of experience in B2B publishing and has written for numerous publications, including Modern Healthcare, InformationWeek, Computerworld, Health Data Management, and many others. Before joining VSD, she was the senior editor at Medical Laboratory Observer, a sister publication to VSD.         
