Ergo: Perceive’s Chip – Data Center-Class Inference in Edge Devices at Ultra-Low Power

May 17, 2021
Steve Teig, CEO of Perceive, presents the “Ergo: Perceive’s Chip – Data Center-Class Inference in Edge Devices at Ultra-Low Power” tutorial at the September 2020 Embedded Vision Summit.

To date, people seeking to deploy machine learning-based inference within consumer electronics have had only two choices, both unattractive. The first option entails transmitting voluminous raw data, such as video, to the cloud, potentially violating customers’ privacy, tempting hackers, and incurring substantial energy, cost, and latency penalties. The second option runs inference at the edge, but on severely limited hardware that can implement only tiny, inaccurate neural networks (e.g., MobileNet) and runs even those tiny networks at low frame rates.

Solving this dilemma, Perceive’s new chip, Ergo, runs large, advanced neural networks at high speed for imaging, audio, language, and other applications inside edge devices without any off-chip RAM. Even large networks, such as YOLOv3 with more than 64 million weights, can run at ~250 fps (with batch size 1). Moreover, Ergo can run YOLOv3 at 30 fps while consuming about 20 mW (i.e., more than 50x more power-efficiently than competing devices).
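For context, those figures imply an energy budget well under a millijoule per inference. Below is a minimal back-of-the-envelope sketch, assuming only the numbers quoted above (about 20 mW sustained at 30 fps on YOLOv3, batch size 1); it is an illustration of the arithmetic, not a measurement.

```python
# Back-of-the-envelope energy-per-inference estimate from the figures quoted above.
# Assumption: Ergo draws ~20 mW while sustaining 30 fps on YOLOv3 (batch size 1).

power_w = 0.020        # ~20 mW, as stated in the talk
frame_rate_hz = 30     # 30 fps YOLOv3 operating point

# Energy per frame = power / frame rate
energy_per_frame_j = power_w / frame_rate_hz
print(f"Energy per inference: {energy_per_frame_j * 1e3:.2f} mJ")  # ~0.67 mJ per frame
```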

Click here for a PDF of the slides.

Register HERE for the 2021 Embedded Vision Summit.
