Unmanned systems: National drone safety, new research into UAV flight systems, and improving AI decision-making

June 28, 2019
A new partnership for national drone safety, increasing the sophistication of autonomous flight systems in difficult environments, and new research into AI perception and decision-making.

In this week’s roundup from the Association for Unmanned Vehicle Systems International, which highlights some of the latest news and headlines in unmanned vehicles and robotics, a new partnership for national drone safety, increasing the sophistication of autonomous flight systems in difficult environments, and new research into AI perception and decision-making.

Raytheon, AirMap collaborating to safely integrate UAS into national airspace system

Raytheon and AirMap have announced that they will collaborate on future projects to safely integrate UAS into the national airspace system, and “unlock the positive economic and social benefits of expanded commercial drone operations.”

The agreement combines the expertise of each company. Air traffic controllers across the U.S. use Raytheon's Standard Terminal Automation Replacement System (STARS) to provide safe and efficient aircraft spacing and sequencing guidance for more than 40,000 departing and arriving aircraft daily at both civilian and military airports.

Meanwhile, AirMap is the leading global provider of airspace intelligence for UAS operations with more than 250,000 registered users. Last year, the majority of U.S. registered commercial UAS pilots used AirMap to request more than 45,000 automated authorizations to fly in controlled airspace.

“AirMap is ushering in a new era in drone aviation,” says Matt Gilligan, vice president of Raytheon's Intelligence, Information and Services business.

“Drones must safely operate in an already complex ecosystem, which is where our experience matters.”

To help ensure the overall safety of the airspace, Raytheon and AirMap are working toward an integrated demonstration that will showcase how AirMap's UAS traffic management platform can increase air traffic controllers' awareness of potential conflicts between UAS and manned aircraft near airports.
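The companies have not described the conflict-detection logic behind the planned demonstration, but a minimal proximity-based check along the following lines illustrates the basic idea of flagging a UAS that strays too close to a manned aircraft. The class, field names, and separation thresholds here are hypothetical, chosen only for this sketch.

```python
from dataclasses import dataclass
from math import sqrt

# Hypothetical illustration only: a minimal proximity-based conflict check.
# The Track class, its fields, and the thresholds are assumptions for this
# sketch, not details of the Raytheon/AirMap demonstration.

@dataclass
class Track:
    x_m: float    # east position in meters (local frame)
    y_m: float    # north position in meters (local frame)
    alt_m: float  # altitude in meters

def potential_conflict(uas: Track, aircraft: Track,
                       horizontal_limit_m: float = 900.0,
                       vertical_limit_m: float = 150.0) -> bool:
    """Flag a potential conflict when a UAS and a manned aircraft are
    inside both a horizontal and a vertical separation threshold."""
    horizontal = sqrt((uas.x_m - aircraft.x_m) ** 2 + (uas.y_m - aircraft.y_m) ** 2)
    vertical = abs(uas.alt_m - aircraft.alt_m)
    return horizontal < horizontal_limit_m and vertical < vertical_limit_m

# Example: a drone 500 m away and 60 m below an arriving aircraft is flagged.
print(potential_conflict(Track(0, 0, 100), Track(400, 300, 160)))  # True
```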

“Raytheon technology has helped safely and effectively manage airspace in the most complex, dense controlled airspace in the world for decades,” says Ben Marcus, AirMap co-founder and chairman.

“They are an ideal partner to join AirMap on the path toward enabling safe, efficient, and scalable drone operations in U.S. low-altitude airspace between 0 and 400 feet.”

Scientists investigate framework for self-guided drone navigation in ‘cluttered’ unknown environments

Recently, scientists at Intel Labs and Mexico's Center for Research and Advanced Studies of the National Polytechnic Institute investigated a framework for self-guided UAS navigation in "cluttered" unknown environments.

According to the team of scientists, its real-time, on-device family of algorithms achieved "state-of-the-art" performance during both qualitative and quantitative tests involving Intel's Ready to Fly drone kit.

“Autonomous navigation in unknown cluttered environments is one of the fundamental problems in robotics with applications in search and rescue, information gathering and inspection of industrial and civil structures, among others,” the coauthors wrote in a paper entitled “Autonomous Navigation of MAVs in Unknown Cluttered Environments,” which describes their work.

“Although mapping, planning, and trajectory generation can be considered mature fields considering certain combinations of robotic platforms and environments, a framework combining elements from all these fields for [drone] navigation in general environments is still missing.”

The algorithmic framework developed by the team is designed for UAS equipped with 3D sensors and odometry modules. It comprises a mapping algorithm that builds maps from the disparity measurements produced by the drone's depth sensor, a path-generation model that accounts for field-of-view constraints on the space assumed to be safe for navigation, and a model that generates robust motion plans.

At the mapping stage, the algorithms “compute a point cloud from the disparity depth images and the odometry” and add it to a map representation of the occupied space around the UAS. During path planning, an exploration action is generated, and in the next phase the framework creates a trajectory that drives the robot from its current state to the next planned action.
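As a rough illustration of that map-plan-track loop, the sketch below back-projects a depth image into a point cloud, inserts it into a simple voxel map, and picks the next waypoint toward a goal. The function names, pinhole-camera parameters, and grid-based map are assumptions made for this sketch; the authors' actual implementation differs.

```python
import numpy as np

# Illustrative sketch of the three stages described above: mapping,
# exploration-style path planning, and trajectory generation. All names
# and parameters are assumptions, not the paper's implementation.

def depth_to_points(depth_m: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Back-project a depth image (meters) into a 3D point cloud in the
    camera frame using a pinhole model."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def update_occupancy(occupied: set, points_world: np.ndarray,
                     voxel_m: float = 0.2) -> set:
    """Insert points (already transformed into the world frame via odometry)
    into a sparse voxel set that stands in for the occupancy map."""
    voxels = np.floor(points_world / voxel_m).astype(int)
    occupied.update(map(tuple, voxels))
    return occupied

def next_waypoint(current_pos: np.ndarray, goal: np.ndarray,
                  step_m: float = 0.5) -> np.ndarray:
    """Pick the next waypoint toward the goal; a real planner would also
    respect the field-of-view and safety constraints from the map."""
    direction = goal - current_pos
    distance = np.linalg.norm(direction)
    if distance < step_m:
        return goal
    return current_pos + step_m * direction / distance

# Example usage with a flat synthetic depth image (identity pose assumed):
depth = np.full((48, 64), 3.0)  # everything 3 m away
cloud = depth_to_points(depth, fx=60, fy=60, cx=32, cy=24)
occupied = update_occupancy(set(), cloud)
waypoint = next_waypoint(np.zeros(3), np.array([5.0, 0.0, 1.5]))
```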

While this loop runs, the models attempt to ensure that the drone's yaw orientation (the way it twists or oscillates around a vertical axis) stays aligned with the direction of motion, primarily by using a velocity-tracking yaw approach.
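A velocity-tracking yaw policy can be sketched in a few lines: the commanded yaw is taken from the horizontal direction of the velocity vector, so the onboard sensors keep facing the direction of travel. The names and the speed threshold below are assumptions for illustration, not details from the paper.

```python
import math

def velocity_tracking_yaw(vx: float, vy: float, previous_yaw: float,
                          min_speed: float = 0.1) -> float:
    """Return the yaw (radians) aligned with the horizontal velocity.
    Below a small speed threshold, hold the previous yaw so noisy
    velocity estimates near hover do not spin the vehicle in place."""
    if math.hypot(vx, vy) < min_speed:
        return previous_yaw
    return math.atan2(vy, vx)

# Example: flying due north (+y) yields a yaw of pi/2.
print(velocity_tracking_yaw(0.0, 1.0, previous_yaw=0.0))  # ~1.5708
```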

The researchers performed experiments in four real-world environments and in simulation to test the robustness of their framework. To do this, they used the open source robotics middleware Robot Operating System (ROS) Kinetic.

The researchers say that in one of the tests, the framework achieved a motion time of 3.37 ms, compared with 103.2 ms and 35.5 ms for the benchmark algorithms. Its average mapping time was 0.256 ms, against the benchmarks' 700.7 ms and 2.035 ms.

The team points out that its algorithm tended to generate slightly longer paths than the benchmarks against which it was tested. The team also notes that the algorithm was unable to reach goal destinations in a maze simulation with very tight spaces, which it attributes to the planning stage not accounting for yaw dynamic constraints.

The team adds, though, that its work could lead to systems that integrate trajectory tracking and prediction of dynamic obstacles, which could help future UAS navigate more effectively in crowded environments.

Carnegie Mellon University, Argo AI establish center for autonomous vehicle research

Carnegie Mellon University (CMU) and Argo AI have announced a five-year, $15 million sponsored research partnership that will result in Argo AI funding research into advanced perception and next-generation decision-making algorithms for autonomous vehicles.

CMU and Argo AI will establish the Carnegie Mellon University Argo AI Center for Autonomous Vehicle Research. Through advanced research projects, the center will seek to help overcome the hurdles associated with enabling self-driving vehicles to operate in various real-world conditions.

“We are thrilled to deepen our partnership with Argo AI to shape the future of self-driving technologies,” says CMU President Farnam Jahanian.

“Together, Argo AI and CMU will accelerate critical research in autonomous vehicles while building on the momentum of CMU's culture of innovation.”

The principal investigator for the center will be Deva Ramanan, an associate professor in CMU's Robotics Institute who also serves as machine learning lead at Argo AI. Research will be conducted at the Robotics Institute, with faculty members and students from across CMU participating.

Through the center, students will have access to fleet-scale data sets, vehicles, and large-scale infrastructure that are key to the advancement of self-driving technologies and would otherwise be difficult to obtain.

The research will address several technical topics, including smart sensor fusion, 3D scene understanding, and behavioral prediction. Research findings will be reported in the open scientific literature so that the entire field can build on them.

CMU has more than 30 years of experience developing autonomous driving technology, and its expertise and graduates have attracted several self-driving car companies to establish operations in Pittsburgh. Argo AI was founded in 2016 by a team of CMU alumni and experts from across the industry.

“Argo AI, Pittsburgh and the entire autonomous vehicle industry have benefited from Carnegie Mellon's leadership,” says Bryan Salesky, CEO and co-founder of Argo AI.

“CMU and now Argo AI are two big reasons why Pittsburgh will remain the center of the universe for self-driving technology.”

