Deep learning accelerator and neural network software framework introduced by Movidius

May 16, 2016
Movidius has announced the release of the Fathom Neural Compute Stick—a deep learning acceleration module—as well as the Fathom deep learning software framework, both of which will enable neural networks to be moved out of the cloud and deployed natively in end-user devices.

The Fathom Neural Compute Stick, a neural network accelerator embedded in a USB stick, features the Myriad 2 vision processor and can run fully trained neural networks at under 1 Watt of power. Targeted at deep learning product developers and researchers, the Fathom Neural Compute Stick accepts networks defined in Caffe or TensorFlow, along with their accompanying datasets. It then uses the Movidius Fathom Tool to prepare and execute the convolutional neural network on Myriad 2.
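For TensorFlow users, the input to that workflow is an ordinary trained model exported as a self-contained graph. The sketch below, assuming the TensorFlow 1.x graph-freezing API, shows how such a model file might be produced; the compile command in the final comment is purely illustrative, since the article does not document the Fathom Tool's actual command-line interface.

```python
import tensorflow as tf  # TensorFlow 1.x-style API assumed

# A tiny stand-in graph: one convolution plus ReLU (illustrative only;
# a real deployment would load an already-trained network instead).
x = tf.placeholder(tf.float32, shape=[1, 224, 224, 3], name="input")
w = tf.Variable(tf.truncated_normal([3, 3, 3, 8], stddev=0.1))
conv = tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME")
out = tf.nn.relu(conv, name="output")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Fold variables into constants so the graph file carries its weights.
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ["output"])
    tf.train.write_graph(frozen, ".", "model.pb", as_text=False)

# A frozen model.pb plus the input/output node names is the kind of artifact
# an offline compiler such as the Fathom Tool would consume. Hypothetical
# invocation (not the documented interface):
#   fathom-compile model.pb --input-node input --output-node output
```

Caffe users would start instead from a prototxt/caffemodel pair; in either case the starting point is a fully trained network, which is converted for the Myriad 2 rather than retrained.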

When connected to a PC, the Fathom Neural Compute Stick behaves as a neural network profiling and evaluation tool, meaning companies will be able to prototype faster and more efficiently, reducing time to market for products requiring artificial intelligence.

"As a participant in the deep learning ecosystem, I have been hoping for a long time that something like Fathom would become available," said Founding Director of New York University Data Science Center, Dr. Yann LeCun. "The Fathom Neural Compute Stick is a compact, low-power convolutional net accelerator for embedded applications that is quite unique. As a tinkerer and builder of various robots and flying contraptions, I've been dreaming of getting my hands on something like the Fathom Neural Compute Stick for a long time. With Fathom, every robot, big and small, can now have state-of-the-art vision capabilities."

With the Fathom software framework, trained neural networks are translated from a PC environment to an embedded environment.

"Deep learning has tremendous potential -- it's exciting to see this kind of intelligence working directly in the low-power mobile environment of consumer devices," said Pete Warden, lead for Google's TensorFlow mobile team. "With TensorFlow supported from the outset, Fathom goes a long way towards helping tune and run these complex neural networks inside devices."

Movidius CEO Remi El-Ouazzane also commented on the product release: "It’s going to mean that very soon, consumers are going to be introduced to surprisingly smart applications and products. It means the same level of surprise and delight we saw at the beginning of the smartphone revolution; we’re going to see again with the machine intelligence revolution. With more than 1 million units of Myriad 2 already ordered, we want to make our VPU the de-facto standard when it comes to embedded deep neural network."

View more information on Fathom.

About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
