Google Clips camera uses machine learning to capture spontaneous moments in everyday life

Oct. 4, 2017
Google has announced the release of Google Clips, a small, hands-free camera that uses a machine learning algorithm to look for good moments to capture in everyday life.

Google has not named the image sensor size or model, though The Verge reports that a 12 Mpixel sensor is used. The camera features a 1.55 µm pixel size, autofocus, a 130° field of view, a 15 fps frame rate, automatic low-light and night modes, and 16 GB of storage. It captures motion photos (JPEGs with embedded MP4s), MP4s, GIFs, and JPEGs, with no audio. Additionally, the camera has Gorilla Glass 3 for durability, as well as USB-C, Wi-Fi Direct, and Bluetooth LE for connectivity.
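
To make the motion photo format above more concrete, here is a minimal Python sketch that pulls the embedded MP4 out of a motion-photo JPEG. It assumes, as is common for these files, that the MP4 container is simply appended after the JPEG data and can be located by searching for its "ftyp" box; the function name extract_embedded_mp4 is introduced here for illustration and is not part of any Google tooling.

```python
def extract_embedded_mp4(jpeg_path, mp4_path):
    """Heuristic sketch: copy the MP4 stream appended after a motion-photo JPEG."""
    with open(jpeg_path, "rb") as f:
        data = f.read()
    # An MP4 (ISO-BMFF) stream begins with a box whose 4-byte type is "ftyp",
    # preceded by a 4-byte size field. Searching for "ftyp" is a heuristic;
    # a false match inside the JPEG data is possible but unlikely.
    idx = data.find(b"ftyp")
    if idx < 4:
        raise ValueError("no embedded MP4 found in %s" % jpeg_path)
    with open(mp4_path, "wb") as out:
        out.write(data[idx - 4:])  # include the 4-byte size field before "ftyp"
```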

Running on Android, Clips also features Moment IQ, a machine learning algorithm that Google says can recognize great expressions, lighting, and framing, and that keeps learning over time. As more images are captured with Clips, the camera learns to recognize the faces of people who matter to you and helps capture more moments with them, according to Juston Payne, Google Clips Product Manager.
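
To illustrate the general idea of on-device moment selection (this is not Google's Moment IQ, only a toy sketch), the Python below keeps a rolling buffer of frames and saves a clip whenever a hypothetical score_fn, standing in for a learned quality model, rates the current frame above a threshold. The names capture_loop and score_fn, the threshold, and the clip length are all assumptions made for illustration.

```python
from collections import deque

def capture_loop(frames, score_fn, threshold=0.8, clip_len=15 * 7):
    """Buffer recent frames and save a short clip whenever the score is high."""
    buffer = deque(maxlen=clip_len)          # roughly 7 s of frames at 15 fps
    saved_clips = []
    for frame in frames:
        buffer.append(frame)
        # Hypothetical quality model: higher score = better expression/lighting/framing.
        if len(buffer) == clip_len and score_fn(frame) >= threshold:
            saved_clips.append(list(buffer))  # persist the buffered moment
            buffer.clear()
    return saved_clips
```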

Google Clips’ algorithm runs in real time directly on the camera with the Movidius Myriad 2 vision processing unit (VPU), which the company calls the industry's first "always-on vision processor." The VPU contains hybrid processing elements, including 12 128-bit VLIW vector processors called SHAVEs and an intelligent memory fabric that ties the processing resources together to enable power-efficient processing. It also has two CPUs and a software development kit for incorporating proprietary functions.
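
The Movidius SDK is proprietary, so the sketch below is only a generic Python illustration of the idea behind the 12 parallel SHAVE cores: a frame is split into strips that are processed concurrently by 12 workers. The functions process_frame and process_strip and the thread-based parallelism are assumptions for illustration, not Movidius APIs.

```python
from concurrent.futures import ThreadPoolExecutor

def process_strip(strip):
    # Stand-in per-strip kernel; a real VPU would run a vision filter here.
    return sum(sum(row) for row in strip)

def process_frame(frame_rows, num_workers=12):
    """Split a frame (list of pixel rows) into strips and process them in parallel."""
    n = max(1, len(frame_rows) // num_workers)
    strips = [frame_rows[i:i + n] for i in range(0, len(frame_rows), n)]
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(process_strip, strips))
```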

"To bring our vision for Clips to life, we’ve been working on integrating Intel’s Movidius technology within Clips to give people many of the benefits of machine learning directly on their device," said Google Clips product lead Juston Payne."On-device processing gives people a lengthy battery life, speedy access to their clips, and the ability to use the camera without an internet connection. We can’t wait for parents and pet lovers to start effortlessly capturing spontaneous moments, while getting to stay in the moment."

Remi El-Ouazzane, vice president and general manager of Movidius, Intel New Technology Group, also commented: "In our collaboration with the Clips team, it has been remarkable to see how much intelligence Google has been able to put right into a small device like Clips," he said. "This intelligent camera truly represents the level of onboard intelligence we dreamed of when developing our Myriad VPU technology."

Clips sync wirelessly from the camera to the Google Clips app for Android or iOS in seconds. Privacy issues are addressed in the blog post written by Payne, in which he explains that the camera lights up when it is on and capturing, making it clear to people nearby when it is being used.

Google Clips will be available soon in the U.S. for $249, with the first edition being designed specifically with parents and pet owners in mind.

View more information on Google Clips.
View the blog post on Google Clips.

About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
