A preview of Microsoft’s Project Brainwave, a deep learning acceleration platform built with three layers and designed for real-time artificial intelligence processing, has been released on the Azure cloud platform and on the edge.
Project Brainwave, according to Microsoft, makes Azure the fastest cloud for running real-time artificial intelligence (AI) and is now fully integrated with Azure Machine Learning. The system supports Intel FPGAs and ResNet50-based neural networks, and its hardware is built with three main layers:
- A distributed system architecture
- A hardware deep neural network (DNN) engine synthesized onto FPGAs
- A compiler and runtime for low-friction deployment of trained models
Mark Russinovich, chief technical officer for Microsoft’s Azure cloud computing platform, said the preview of Project Brainwave marks the start of Microsoft’s efforts to bring the power of FPGAs to customers for a variety of purposes.
"I think this is a first step in making the FPGAs more of a general-purpose platform for customers," he said.
Microsoft is working with manufacturing solutions provider Jabil to see how Project Brainwave could be used to "quickly and accurately use AI to scan images and flag false positives," which would free up the people who manually check for defects to focus on more complex cases. The preview version of Project Brainwave now available lets customers perform similar "AI-based computations in real time, instead of batching it into smaller groups of separate computations."
These computations would run in Google’s TensorFlow framework for deep neural network calculations. In addition, Microsoft says it is building support for Microsoft Cognitive Toolkit, another popular deep learning framework. Microsoft is also offering a limited preview that brings Project Brainwave to the edge, which would let users take advantage of the computing speed offered by the framework in their own businesses and facilities, even if their systems aren’t connected to a network or the Internet.
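The real-time-versus-batched distinction the article describes can be sketched in toy form. The snippet below is an illustration only, not Brainwave’s or Azure’s API: `model` is a hypothetical stand-in for a neural-network forward pass, and the two serving functions show how batching makes early requests wait for the group while per-request serving handles each one as it arrives.

```python
import time

def model(inputs):
    # Hypothetical stand-in for a DNN forward pass; one call per batch.
    # Pretend the cost grows with batch size.
    time.sleep(0.001 * len(inputs))
    return [x * 2 for x in inputs]

def realtime_serve(requests):
    # Real-time serving: each request is processed as soon as it
    # arrives, so no request waits for a batch to fill.
    return [model([r])[0] for r in requests]

def batched_serve(requests, batch_size=4):
    # Batched serving: requests are grouped before the model runs,
    # which can improve throughput but raises per-request latency,
    # since early arrivals sit idle until the batch is dispatched.
    results = []
    for i in range(0, len(requests), batch_size):
        results.extend(model(requests[i:i + batch_size]))
    return results

print(realtime_serve([1, 2, 3]))  # [2, 4, 6]
print(batched_serve([1, 2, 3]))   # [2, 4, 6]
```

Both paths compute the same answers; the difference is when each request’s work starts, which is what "ultra-low latency" serving is about.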
"We’re making real-time AI available to customers both on the cloud and on the edge," said Doug Burger, a distinguished engineer at Microsoft.
The public preview of Project Brainwave comes about five years after Burger, a former academic, first began talking about the idea of using FPGAs for more efficient computer processing. As he refined his idea, the current AI revolution kicked into full gear, according to Microsoft, which created a massive need for systems that can process the large amounts of data required for AI systems to do things such as scan documents and images for information, recognize speech, and translate conversations.
Additionally, Burger says that the framework is perfect for the demands of AI computing, and that the hardware design can evolve rapidly and be remapped to the FPGA after each improvement, "keeping pace with new discoveries and staying current with the requirements of the rapidly changing AI algorithms."
Burger wrote in a 2017 blog post that the system was for "real-time AI, which means the system processes requests as fast as it receives them, with ultra-low latency," and that "Real-time AI is becoming increasingly important as cloud infrastructures process live data streams, whether they be search queries, videos, sensor streams, or interactions with users."
Pictured: Doug Burger holds an example of the hardware used for Project Brainwave.