In this week’s roundup from the Association for Unmanned Vehicle Systems International, which highlights some of the latest news and headlines in unmanned vehicles and robotics, an AUV makes a major scientific discovery in the Antarctic Ocean, Argo AI and Ford's third-generation autonomous test vehicle hits the road, and veterans get training to become UAS operators.
Boaty McBoatface makes significant climate change discovery
During its maiden voyage, Boaty McBoatface, the lead boat of the Autosub Long Range-class of AUVs used for scientific research, made a major discovery regarding climate change.
Over the course of three days in April 2017, Boaty discovered a significant link between Antarctic winds and rising sea temperatures.
Traveling more than 111 miles through mountainous underwater valleys, Boaty measured the temperature, salinity and turbulence of the water at the bottom of the ocean. The AUV used an echo sounder to navigate at depths of up to 2.49 miles before reaching a programmed destination point, where it was recovered.
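For readers who think in metric, a quick conversion of Boaty's reported figures (a simple illustrative calculation, using the standard miles-to-kilometers factor):

```python
# Convert Boaty McBoatface's reported mission figures to metric units.
MILES_TO_KM = 1.609344         # exact statute-mile-to-kilometer factor

track_km = 111 * MILES_TO_KM   # distance traveled through the valleys
depth_km = 2.49 * MILES_TO_KM  # maximum depth reached

print(f"{track_km:.0f} km traveled, {depth_km:.1f} km deep")
# → 179 km traveled, 4.0 km deep
```

That puts the mission at roughly 180 km of track and about 4 km of depth, in line with the scale of deep Southern Ocean surveys.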
The goal of the research was to study the changing temperatures at the bottom of the Southern Ocean. The findings of the research were published in the Proceedings of the National Academy of Sciences of the United States of America, a multidisciplinary journal.
“The data from Boaty McBoatface gave us a completely new way of looking at the deep ocean — the path taken by Boaty created a spatial view of the turbulence near the seafloor,” says Dr. Eleanor Frajka-Williams of the National Oceanography Centre in Southampton, England.
With the data, experts will be better able to predict how climate change will affect the rise in sea levels.
"This study is a great example of how exciting new technology such as the unmanned submarine 'Boaty McBoatface' can be used along with ship-based measurements and cutting-edge ocean models to discover and explain previously unknown processes affecting heat transport within the ocean," says Dr. Povl Abrahamsen of the British Antarctic Survey in Cambridge, England.
Argo AI, Ford launch third-generation self-driving test vehicle
Argo AI and Ford have announced the launch of their third-generation self-driving test vehicle, the new Ford Fusion Hybrid.
The vehicles will be deployed in all five cities where Argo AI and Ford operate, which now include Detroit.
The vehicles are equipped with new technology that’s a step closer to production specification, as well as modifications that are designed to make sure that they operate safely in a variety of conditions.
The vehicles are also equipped with a significantly upgraded sensor suite, which includes new sets of radar and cameras with higher resolution and higher dynamic range.
The new fleet also features a brand-new computing system that offers a great deal more processing power than the companies’ previous cars had, with improved thermal management systems that generate less heat and noise inside the vehicle. This results in a smarter vehicle that is also quieter and more comfortable to ride in, according to Peter Rander, president of Argo AI.
To ensure the vehicles continue operating safely when something unexpected occurs, they now feature redundant braking and steering systems, which help maintain vehicle motion control if one of the units stops functioning.
“These types of redundant systems are included to help ensure the safe deployment of self-driving vehicles, granting them the ability to detect faults and preserve their ability to safely stop or pull over as needed,” Rander says.
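The fault-detection-and-fallback pattern Rander describes can be sketched in a few lines. This is purely an illustrative sketch of the general redundancy concept, not Argo AI's actual software; the class, function, and field names here are all hypothetical:

```python
# Illustrative sketch of redundant-actuator failover (hypothetical names,
# not Argo AI's implementation): prefer the primary unit, fall back to
# the backup on a detected fault, and escalate if both fail.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Actuator:
    name: str
    healthy: bool

    def self_test(self) -> bool:
        # A real vehicle would run hardware diagnostics here.
        return self.healthy


def select_actuator(primary: Actuator, backup: Actuator) -> Optional[Actuator]:
    """Return a working actuator, or None to trigger a safe stop."""
    if primary.self_test():
        return primary
    if backup.self_test():
        return backup
    return None  # no working unit: the vehicle should pull over and stop


brake = select_actuator(
    Actuator("primary_brake", healthy=False),  # simulated fault
    Actuator("backup_brake", healthy=True),
)
print(brake.name)  # → backup_brake
```

The key design point is that the fault check happens before every actuation decision, so control authority is preserved the moment a unit fails rather than after the fact.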
By testing the vehicles in Detroit, Argo AI and Ford will have the opportunity to learn how the vehicles operate in a new type of environment.
“Every city represents a unique opportunity to make our self-driving system smarter because of the exposure to different road infrastructure design, driving behavior and even traffic light placement,” Rander says. “The collective knowledge we’re gaining by operating in five very different locales is a big part of the reason why we’re making great progress.”
For Rander, one of the benefits of testing the vehicles in Detroit is that the city’s roads lack a single defining feature. For instance, some Detroit streets are wide and often have unmarked lanes, so the vehicles must reason through how to navigate while predicting what other drivers may do, without causing unnecessary congestion.
Other residential streets are narrow two-lane roads with cars parked on either side. This, combined with overhanging tree branches, “which we don’t often see in other urban environments,” Rander notes, creates a “very dynamic situation.” Throw in pop-up construction that’s occurring across the city and “you’ve got a diverse, condensed training ground that really informs our development efforts,” Rander adds.
Wounded Eagle UAS uses UAS to map VA West Los Angeles Medical Center campus
Wounded Eagle (WE) UAS Inc., a veteran-run nonprofit organization that trains disabled veterans to become experienced and skilled FAA Part 107 UAS operators, has announced that, with help from its student UAS operators, it has completed aerial mapping of VA Greater Los Angeles Healthcare System's (VAGLAHS) 388-acre VA West Los Angeles Medical Center (WLA) campus.
WE used a UAS and UAS mapping software to capture thousands of images of WLA. The images were stitched together to make a three-zone orthomosaic map. WE also created 3D renderings of several historic buildings, including the Wadsworth Chapel and Brentwood Theater.
“A truly great opportunity was given to Wounded Eagle UAS by the VA,” says Joseph Dorando, the team lead and remote pilot in charge of UAS operations.
“Our student teams are all disabled veterans and to provide them the capability to learn how to make maps using UAS was a real experience for them and gave them the confidence to do more and go beyond.”
Together, WE and the VAGLAHS Emergency Management Office documented existing facilities and created high-resolution imagery of the entire campus for historical purposes.
The image collections were divided into three zones, with the option to merge all three collections together to make one large orthomosaic photogrammetric map of the campus. The resolution is 0.5 cm per pixel, which allows users to zoom in and view fine details of the buildings, such as individual rooftop tiles.
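At that resolution, the pixel counts get large quickly. A back-of-the-envelope sketch (assuming, for illustration, that the full 388 acres were mapped at a uniform 0.5 cm per pixel) shows why the map had to be split into zones:

```python
# Rough estimate of the merged orthomosaic's size at 0.5 cm/pixel.
# Illustrative arithmetic only; assumes uniform coverage of all 388 acres.
ACRE_M2 = 4046.86            # square meters per acre
gsd_m = 0.005                # ground sample distance: 0.5 cm per pixel

area_m2 = 388 * ACRE_M2      # campus area in square meters
pixels = area_m2 / gsd_m**2  # each pixel covers gsd_m x gsd_m of ground

print(f"{pixels:.2e} pixels")  # → 6.28e+10 pixels (tens of gigapixels)
```

A single image of tens of gigapixels is unwieldy for most software, which is one practical reason mapping workflows break such a campus into separately processed zones.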