PARIS—Prophesee SA has announced that the newest release of its Metavision® Intelligence suite will be offered, in its entirety and for all modules, at no cost, delivering an accelerated path to explore and implement differentiated machine vision applications that leverage the performance and efficiency of event-based vision. The suite of software tools and code samples is free from initial adoption and evaluation through commercial development and the release of market-ready products.
With this advanced toolkit, engineers can develop computer vision applications on a PC for a wide range of markets, including industrial automation, IoT, surveillance, mobile, medical, and automotive.
The free modules in Metavision Intelligence 3.0 are available through C++ and Python APIs and include a comprehensive machine learning toolkit. The suite also offers a no-code option through the Metavision Studio tool, which lets users replay free pre-recorded datasets without owning an event camera; with a camera connected, they can stream or record live events within seconds.
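To make the data model concrete: an event camera does not output frames but a sparse stream of per-pixel change events, each carrying pixel coordinates, a polarity, and a timestamp. The sketch below is a minimal, hypothetical illustration in plain Python; the class and function names are assumptions for illustration only and are not part of the Metavision API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    """A single change event: pixel coordinates, polarity, microsecond timestamp."""
    x: int
    y: int
    polarity: int  # +1 for a brightness increase, -1 for a decrease
    t: int         # timestamp in microseconds

def events_in_window(events: List[Event], t_start: int, t_end: int) -> List[Event]:
    """Select the events whose timestamps fall inside [t_start, t_end)."""
    return [e for e in events if t_start <= e.t < t_end]

# A tiny synthetic stream: three pixels changing at different times.
stream = [Event(10, 20, +1, 1000), Event(11, 20, -1, 1500), Event(10, 21, +1, 2600)]
window = events_in_window(stream, 1000, 2000)  # the first two events
```

Slicing the stream into time windows like this is the usual first step before feeding events to downstream algorithms such as tracking or optical flow.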
In total, the suite consists of 95 algorithms, 67 code samples, and 11 ready-to-use applications. Provided plug-and-play algorithms include high-speed counting, vibration monitoring, spatter monitoring, object tracking, optical flow, ultra-slow-motion, machine learning, and others. The suite offers both C++ and Python APIs, extensive documentation, and a wide range of samples organized by implementation level to incrementally introduce the concepts of event-based machine vision.
The latest release includes enhancements that help speed up time to production, allowing developers to stream their first events in minutes, or even build their own event camera from scratch using the provided open-source-licensed camera plugins as a base.
They now also have the tools to port their developments to Windows or Ubuntu operating systems. Metavision Intelligence 3.0 also unlocks the full potential of advanced sensor features (e.g., anti-flickering, bias adjustment) by providing source code access to key sensor plugins.
The Metavision Studio tool also offers an enhanced user experience, with improvements to onboarding guidance, the UI, and the ROI and bias setup process.
The core ML modules include an open-source event-to-video converter as well as a video-to-event simulator. The event-to-video converter uses a pretrained neural network to reconstruct grayscale images from events, letting users apply their existing frame-based development resources to event-based data and build algorithms on top of it.
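The idea behind event-to-image conversion can be shown with a much simpler stand-in: naively integrating event polarities into a grayscale frame. This is not the learned reconstruction the suite ships (that uses a neural network, as stated above); it is only a toy sketch of the event-to-frame direction, with all names and parameters chosen for illustration.

```python
def accumulate_events(events, width, height, mid=128, step=25):
    """Naively integrate event polarities into a grayscale frame.

    Each pixel starts at mid-gray; every +1 event brightens it by `step`
    and every -1 event darkens it, clamped to [0, 255]. A simplistic
    stand-in for the learned reconstruction described in the text.
    """
    frame = [[mid] * width for _ in range(height)]
    for x, y, polarity, _t in events:
        value = frame[y][x] + polarity * step
        frame[y][x] = max(0, min(255, value))
    return frame

# Two ON events at pixel (0, 0) and one OFF event at pixel (1, 0).
events = [(0, 0, +1, 10), (0, 0, +1, 20), (1, 0, -1, 30)]
frame = accumulate_events(events, width=2, height=1)
# frame[0][0] == 178 (128 + 2*25), frame[0][1] == 103 (128 - 25)
```

A learned converter replaces this crude integration with a network trained to infer plausible absolute intensities, but the input/output shapes are the same: a batch of events in, a grayscale frame out.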
The video-to-event pipeline breaks down the barrier of data scarcity in the event-based domain by enabling the conversion of conventional frame-based datasets to event-based datasets.
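The reverse direction can likewise be sketched in a few lines. The toy function below emits an event whenever a pixel's log-intensity changes by more than a contrast threshold between consecutive frames, which is the basic principle behind video-to-event simulators; the actual Metavision pipeline is more sophisticated, and every name and parameter here is an illustrative assumption.

```python
import math

def frames_to_events(frames, threshold=0.2, dt_us=1000):
    """Emit (x, y, polarity, t) events where the per-pixel log-intensity
    changes by more than `threshold` between consecutive frames.

    `frames` is a list of 2-D lists of intensities in (0, 255]; frames are
    assumed to be `dt_us` microseconds apart. A much-simplified version of
    the video-to-event idea described in the text.
    """
    events = []
    eps = 1e-3  # avoid log(0)
    ref = [[math.log(p + eps) for p in row] for row in frames[0]]
    for i, frame in enumerate(frames[1:], start=1):
        t = i * dt_us
        for y, row in enumerate(frame):
            for x, p in enumerate(row):
                logp = math.log(p + eps)
                # Fire one event per threshold crossing, updating the
                # pixel's reference level each time, as a real sensor would.
                while logp - ref[y][x] > threshold:
                    events.append((x, y, +1, t))
                    ref[y][x] += threshold
                while ref[y][x] - logp > threshold:
                    events.append((x, y, -1, t))
                    ref[y][x] -= threshold
    return events

# A single pixel brightening from 100 to 200 crosses the 0.2 threshold
# three times (log 2 ≈ 0.693), producing three ON events.
evts = frames_to_events([[[100]], [[200]]], threshold=0.2)
```

Applied to an entire labeled video dataset, this kind of conversion yields an event-based dataset with the original annotations intact, which is exactly how the pipeline addresses data scarcity.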
Developers can download the Metavision Intelligence Suite and begin building products leveraging Prophesee sensing technologies for free.
Visit https://www.prophesee.ai/metavision-intelligence for more information.
Prophesee is the inventor of neuromorphic vision systems. The company developed an Event-Based Vision approach to machine vision. This new vision category allows for significant reductions in power, latency, and data processing requirements, revealing what was until now invisible to traditional frame-based sensors. Prophesee’s patented Metavision® sensors and algorithms mimic how the human eye and brain work to dramatically improve efficiency in areas such as autonomous vehicles, industrial automation, IoT, mobile and AR/VR. Prophesee is based in Paris, with local offices in Grenoble, Shanghai, Tokyo and Silicon Valley.
Learn more at www.prophesee.ai.