Mobile robotic development kit enables rapid development and testing

Aug. 5, 2014
SentiBotics Development Kit is a mobile robotic platform with a 3D vision system, modular robotic arm, and proprietary software that enables the rapid development and testing of mobile robots.

The development kit from biometrics and artificial intelligence company Neurotechnology comes with a number of standard features, including a two-camera 3D vision system. The first camera, a SoftKinetic DS325, is a 3D time-of-flight camera mounted on the robot's arm. The DS325 features a 320 x 240 pixel CMOS sensor (the DepthSense) for capturing 3D scene data, RGB and depth lenses that focus reflected light onto the sensor, diffused laser illumination, a maximum frame rate of 60 fps, and a USB interface.

The second camera, an ASUS Xtion Pro Live similar to the Microsoft Kinect, is used for navigation. Its color and range sensor captures 640 x 480 images at 30 fps and 320 x 240 images at 60 fps, with an effective range of 0.8 to 4 meters. The Xtion Pro Live also offers USB 2.0 and USB 3.0 interfaces and has a field of view of 58° horizontal, 45° vertical, and 70° diagonal.
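
As a rough illustration of how that effective range comes into play, the sketch below subscribes to a depth stream and discards readings outside 0.8 to 4 meters. It assumes the stock ROS OpenNI2 driver's default topic name and millimeter depth encoding rather than anything documented for the SentiBotics kit itself.

```cpp
// Hedged sketch: masking depth readings to the Xtion's stated 0.8-4.0 m
// effective range. The topic name and millimeter encoding are assumptions
// based on the standard ROS openni2 driver, not SentiBotics documentation.
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/image_encodings.h>
#include <cv_bridge/cv_bridge.h>
#include <opencv2/core.hpp>

void depthCallback(const sensor_msgs::ImageConstPtr& msg)
{
  // Raw depth from the openni2 driver is typically 16-bit, in millimeters.
  cv_bridge::CvImagePtr cv =
      cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::TYPE_16UC1);

  // Zero out readings outside the quoted 0.8-4.0 m effective range.
  cv::Mat& depth = cv->image;
  cv::Mat valid = (depth >= 800) & (depth <= 4000);
  depth.setTo(0, ~valid);

  ROS_INFO_THROTTLE(1.0, "Valid depth pixels: %d", cv::countNonZero(valid));
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "depth_range_filter");
  ros::NodeHandle nh;
  // Assumed topic name from a stock openni2_launch configuration.
  ros::Subscriber sub = nh.subscribe("/camera/depth/image_raw", 1, depthCallback);
  ros::spin();
  return 0;
}
```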

In addition to the 3D vision system, the SentiBotics kit features Neurotechnology-developed software based on the Robot Operating System (ROS). The ROS-based infrastructure enables users to integrate third-party hardware components or robotic algorithms. The kit's algorithms include autonomous navigation, object recognition, and object manipulation; the source code for each algorithm, written in C++ and designed to run on the specified robotic hardware, is included in the kit.
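
To give a sense of what plugging a third-party algorithm into ROS-based infrastructure looks like in practice, here is a minimal C++ node sketch that consumes camera images and publishes its own result topic. The topic names, message types, and placeholder recognition step are illustrative assumptions, not the SentiBotics API.

```cpp
// Hedged sketch of a third-party node in a ROS system: subscribe to a
// camera stream, run a user-supplied algorithm, publish a result.
// Topic names and message types are assumptions for illustration only.
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <std_msgs/String.h>

class RecognitionNode
{
public:
  explicit RecognitionNode(ros::NodeHandle& nh)
  {
    // Assumed camera topic; the actual SentiBotics topic names may differ.
    sub_ = nh.subscribe("/camera/rgb/image_raw", 1, &RecognitionNode::onImage, this);
    pub_ = nh.advertise<std_msgs::String>("recognized_object", 1);
  }

private:
  void onImage(const sensor_msgs::ImageConstPtr& img)
  {
    // A user's own recognition algorithm would run here on img->data.
    std_msgs::String label;
    label.data = "unknown";  // placeholder result
    pub_.publish(label);
  }

  ros::Subscriber sub_;
  ros::Publisher pub_;
};

int main(int argc, char** argv)
{
  ros::init(argc, argv, "third_party_recognition");
  ros::NodeHandle nh;
  RecognitionNode node(nh);
  ros::spin();
  return 0;
}
```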

Neurotechnology’s kit also comes with programming samples that demonstrate a number of capabilities, including how to teach the robot to recognize and grasp objects, and how to use the robot to create a map of an environment that can be used for autonomous navigation. Also included in the robot’s hardware are an Intel NUC i5 on-board computer and a control pad for manual control of the robot and its arm.
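
As a hedged sketch of the mapping-and-navigation workflow described above, the following snippet commands a goal pose on a previously built map using the standard ROS navigation-stack convention (a geometry_msgs/PoseStamped published to move_base_simple/goal). The SentiBotics samples may expose navigation through their own interfaces; the topic, frame name, and goal coordinates here are assumptions.

```cpp
// Hedged sketch: sending an autonomous-navigation goal on an existing map
// via the conventional ROS navigation-stack topic. Not the SentiBotics API;
// topic, frame, and coordinates are assumptions for illustration.
#include <ros/ros.h>
#include <geometry_msgs/PoseStamped.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "send_nav_goal");
  ros::NodeHandle nh;
  ros::Publisher goal_pub =
      nh.advertise<geometry_msgs::PoseStamped>("/move_base_simple/goal", 1, true /* latched */);

  geometry_msgs::PoseStamped goal;
  goal.header.stamp = ros::Time::now();
  goal.header.frame_id = "map";   // assumed map frame name
  goal.pose.position.x = 2.0;     // example target: 2 m along the map's x axis
  goal.pose.position.y = 0.0;
  goal.pose.orientation.w = 1.0;  // face along +x

  goal_pub.publish(goal);
  ros::spinOnce();
  ros::Duration(1.0).sleep();     // give the latched message time to go out
  return 0;
}
```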

"We designed SentiBotics to be a compact, integral and computationally capable robotics system that allows our customers to rapidly test their ideas in real world environments," said Dr. Povilas Daniusis, leader of the Neurotechnology robotics team in a press release. "With the included software, SentiBotics not only provides working examples of autonomous navigation, object recognition and grasping algorithms, it also allows users to immediately concentrate on their own algorithm development."

View more information on the SentiBotics Development Kit.


About the Author

James Carroll

Since joining the team in 2013, James has covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles for each issue of the magazine, James managed the Innovators Awards program and webcasts.

