Military Embedded Systems

Navy teams demo enhanced situational awareness for UAVs at innovation challenge

News

November 29, 2016

Mariana Iriarte

Technology Editor

Military Embedded Systems

Photo by U.S. Navy

NAVAL AIR WARFARE CENTER WEAPONS DIVISION, CHINA LAKE, Calif. For the Naval Air Systems Command's second innovation challenge, two teams presented their unmanned aerial vehicle (UAV)-focused projects designed to enhance the warfighter's situational awareness.

The Naval Air Warfare Center Weapons Division teams – the Urban Reconnaissance and Situational Awareness (URSA) team at Point Mugu and the Swarm View team at China Lake – wrapped up their projects within a six-month period and on a $25,000 budget.

Point Mugu's URSA team – composed of Michael McConnehey, Christopher Gudea, Shivneil Prasad, Georgi Ivanov, Cory Eighan, and Christopher Smith – worked to reduce or entirely eliminate urban combat casualties using a camera-equipped UAV to gain situational awareness in GPS-denied environments. The UAV then used algorithms to provide the user with a real-time 3-D virtual reality map.
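
The article does not name the algorithms URSA used, but real-time 3-D mapping from a camera during GPS-denied flight is typically built on visual odometry or SLAM. The sketch below is a hypothetical, minimal illustration of one such step – estimating camera motion between two frames with OpenCV – not the team's code; the function name and parameters are assumptions.

```python
import cv2
import numpy as np

def estimate_relative_pose(prev_frame, curr_frame, K):
    """Estimate relative camera motion between two grayscale frames.

    A minimal visual-odometry step: detect ORB features, match them,
    and recover the relative pose from the essential matrix. K is the
    3x3 camera intrinsic matrix, assumed known from calibration.
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(prev_frame, None)
    kp2, des2 = orb.detectAndCompute(curr_frame, None)

    # Brute-force Hamming matching is the standard pairing for binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # RANSAC rejects bad matches while fitting the essential matrix.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # rotation matrix and unit-length translation direction
```

Chaining these relative poses over successive frames, and triangulating the matched features, is what would let a mapping pipeline build the kind of real-time 3-D model the team described.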

“Innovation is really just pushing the envelope and trying to do new things in new ways,” says Michael McConnehey of the URSA team. “Sometimes, it’s taking something that feels old and familiar, repackaging it and retooling it for a completely new use. The Innovation Challenge presented an opportunity to try some of our ideas that hadn’t been fully funded and we saw a problem that we thought that we could offer an innovative solution to.”

China Lake's Swarm View team – Jeremy Siedschlag, Melvin Deberry, Joshua Elroy, Christopher Yelton, Clayton Anderson, and Nicholas LeBlanc – built a UAV platform to simulate autonomous swarm coordination, using a mesh network to perform object registration within the platforms' environment. The camera-equipped swarm enables the warfighter to review updated images containing objects of interest and provides information about the distance to each object, even while both the sensors and the object of interest are moving through a cluttered environment.
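
The article does not describe how Swarm View computes range. One common approach, shown as a hypothetical sketch below, is to triangulate an object's position from bearings reported by two drones over the mesh network and then measure distance from any platform; the Detection message format and triangulate function are assumptions for illustration, not the team's implementation.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Detection:
    """One drone's sighting of an object, as it might be shared over the mesh."""
    drone_position: np.ndarray   # drone's own position estimate in metres (x, y)
    bearing_rad: float           # direction to the object in a shared frame
    timestamp: float

def triangulate(det_a: Detection, det_b: Detection) -> np.ndarray:
    """Intersect two bearing rays to localize the object; any drone can then
    compute its own distance to that point. Assumes near-simultaneous
    detections expressed in a common coordinate frame."""
    # Each bearing defines a ray: position + s * direction, with unit direction.
    d_a = np.array([np.cos(det_a.bearing_rad), np.sin(det_a.bearing_rad)])
    d_b = np.array([np.cos(det_b.bearing_rad), np.sin(det_b.bearing_rad)])
    # Solve p_a + s*d_a = p_b + t*d_b for s and t; least squares tolerates
    # nearly-parallel bearings and measurement noise.
    A = np.column_stack((d_a, -d_b))
    rhs = det_b.drone_position - det_a.drone_position
    (s, _), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return det_a.drone_position + s * d_a

# Example: two drones 40 m apart both sight the same object.
obj = triangulate(
    Detection(np.array([0.0, 0.0]), np.deg2rad(45.0), 0.0),
    Detection(np.array([40.0, 0.0]), np.deg2rad(135.0), 0.1),
)
print("estimated object position:", obj)
print("distance from drone A:", np.linalg.norm(obj - np.array([0.0, 0.0])))
```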

“The main benefit is a hyper-agnostic sensor suite that allows the warfighter to focus their attention on something else while the swarm looks around a predefined area,” Siedschlag says. “Ideally, there would be one operator and six drones allowing the user to make better use of their time since a lot of money is spent on labor.”

For future participants, Gudea suggests: “Take a lot of risks, fail fast, and fail early. When we did our project, we tried several different cameras, onboard processors and frame sizes and ended up getting rid of a lot of them immediately because they didn’t work for us. Be willing to take risks and be willing to fail. The hard part is knowing when you should quit on the areas that are failing, but the challenges are what make you better in your craft.”
