Sensor payloads for military unmanned systems get smarter
May 06, 2020
Size, weight, power, and cost (SWaP-C) considerations are still driving military sensor payload designs for unmanned systems, but sensors are getting much “smarter” and processing tasks are now increasingly being performed right at the payload level.
Sensor payloads for military unmanned systems – both unmanned aircraft systems (UASs) and ground systems – are evolving rapidly as they embrace artificial intelligence (AI) and provide processing capabilities. The overall goal behind designing sensor payloads is the delivery of persistent situational awareness to the warfighter.
Military sensor payloads run the gamut from optical sensors to radar to signal intelligence to communications relay.
“Cameras, antennas, and field-programmable gate array (FPGA)/signal processing electronics are the hardware building blocks, with software and visualization being key to deliver a great user experience to the warfighter,” explains Satish Krishnan, vice president of Mission Payloads & Exploitation for General Atomics Aeronautical Systems Inc. (GA-ASI – Poway, California).
Along with the evolution of machine learning and artificial intelligence algorithms, there’s currently an emphasis on a high degree of automation in development as it applies to sensor interaction and information delivery to the user. “Robust datalinks play a critical role in information dissemination to where and when it’s needed, even within challenging, contested environments,” Krishnan says.
FLIR Systems (Arlington, Virginia) makes products that enhance perception and awareness; the company’s thermal-camera cores are at the heart of many sensor payloads to provide enhanced awareness and nighttime operation.
“We also make sensors for chemical, biological, radiation, nuclear, and explosives (CBRNE) detection,” says Tung Ng, director of engineering for FLIR Unmanned Ground Systems. “We’ve integrated them on our unmanned systems as well, and there’s a growing need for that kind of detection. But these days, it’s not enough for sensors to just send the data out. Customers are expecting sensors to be smarter, which means including processing as part of the sensor payload.” (Figure 1.)
Figure 1 | An example of a system with multiple CBRNE sensors is the FLIR Centaur, a Man Transportable Robotic System Increment II (MTRS Inc II) solution. Remotely operated, the system provides a standoff capability to detect, confirm, identify, and dispose of hazards. It is currently only available to the U.S. Department of Defense (DoD). Photo courtesy of FLIR Systems.
There’s also a trend toward military sensor payloads using standardized interfaces – interoperability profiles (IOPs) – for unmanned ground systems that require both more intelligence and processing at the sensor payload.
The trend toward shrinking size, weight, power, and cost (SWaP-C) – with a heavy emphasis on the reduction of cost – for military uses has been going on for years.
“Customers are increasingly requiring UAS to perform multiple roles efficiently,” Krishnan says. “Since endurance is arguably the most important attribute of a UAS, designing systems that are efficient in terms of size and weight is very important, so multiple sensors can be carried at the same time.”
As Moore’s Law continues to drive down the size of electronics, “applications that used to require racks of equipment are now becoming extremely small,” Krishnan points out. “This is enabling UASs to perform missions that in the past required large wide-bodied aircraft.”
Signal intelligence (SIGINT) and communications relay are examples in which the reduction in size and weight of electronics is leading to a new class of affordable applications on UASs from GA-ASI. “Even with the reduction in size, certain applications like electronic warfare (EW) jamming require lots of prime power from the platform to perform the mission effectively,” Krishnan says. “So it’s critical for UASs to have tens of kilowatts of prime power available to power all the different payloads.”
Another SWaP-C trend is to reuse the same hardware – for example, antenna apertures – to do more with different adaptive software loads dictating what the sensor does.
“The military is continuing to push the cost angle,” Ng notes. “We’re also seeing an ongoing trend toward weight and size reduction. Sensor payloads can do a lot more in the same size now than they could even just a few years ago, which makes the design more challenging.”
Sensor system design challenges
Sensor systems pose a wide variety of design challenges for both UASs and ground systems.
Within the UAS realm, “every sensor system brings unique design considerations into play – from environmental considerations including thermal management to the unique radio frequency/cosite compatibility considerations to how data is displayed and distributed,” Krishnan says.
While the integration aspects of sensor systems onto UASs pose challenges that need to be overcome, “an oft-ignored aspect that needs to be well-designed up front is the mechanism to display and disseminate the data to the users that need the information,” Krishnan points out. “Making the warfighter user experience seamless is probably the biggest design challenge that sensor systems pose.”
On the ground end of things, “packing sensors tightly is definitely a challenge, but primarily because of the thermal management, interference, and the power-management issues of getting into a small form factor and providing more capability than we’ve ever had before,” Ng says. “It’s definitely a challenge.” (Figure 2.)
Figure 2 | For reduced-SWaP applications, FLIR Systems engineers designed the rugged StormCaster-L, an ultra-low-light imaging payload with line-of-sight stabilization and range of motion. It supports the SkyRanger R70 and R80D SkyRaider platforms. Photo courtesy of FLIR Systems.
Modularity can also get a bit tricky, “because you can’t optimize for all of the parameters when you have a standard that you’re driving toward,” Ng adds. “But it’s something that’s really benefited the community and industry.”
Yet another challenge: The military has a fast design schedule, which often pushes designers to use COTS [commercial off-the-shelf] components and integrate those parts into sensor payloads to get to market faster.
New sensor capabilities
Sensor systems are rapidly heading in the direction of becoming versatile, software-defined systems with shared hardware.
“Multiple different applications like radar and SIGINT are using common processing boxes, with software personality modules dictating the application in real time,” Krishnan says. “Also, with the advent of artificial intelligence and machine learning, software is able to make important decisions in real time to adapt to the environment and deliver the capability needed. For example, a communications relay radio can now, on the fly, decide to switch to a different waveform when it detects jamming on the one in use. Agility is the name of the game.”
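The jamming-adaptive relay Krishnan describes can be reduced to a simple control loop: monitor link quality and hop to an alternate waveform when the current one degrades. The sketch below illustrates that idea only; the class name, waveform labels, and SNR threshold are all invented for illustration, and a real EW-resistant link would use far more sophisticated detection and selection logic.

```python
# Hypothetical sketch of an adaptive-waveform controller. All names and
# thresholds are invented for illustration.

JAM_SNR_THRESHOLD_DB = 5.0  # below this, assume interference or jamming


class AdaptiveRelayRadio:
    def __init__(self, waveforms):
        self.waveforms = list(waveforms)  # e.g. ["CDL", "FHSS", "LPI-burst"]
        self.active = 0                   # index of the waveform in use

    @property
    def current_waveform(self):
        return self.waveforms[self.active]

    def report_snr(self, snr_db):
        """Called with each link-quality measurement; switches to the next
        waveform on the fly if the current one looks jammed."""
        if snr_db < JAM_SNR_THRESHOLD_DB:
            self.active = (self.active + 1) % len(self.waveforms)
        return self.current_waveform
```

In use, the radio stays on its current waveform while link quality is good and rotates through alternatives as soon as a measurement drops below the threshold.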
Sensors are getting much smarter, with computing power at the edge. “Sensor payloads have more processing, and a network stack enables protocols that allow a payload to declare its own capabilities, but they’re also running neural networks for object detection and classification,” Ng says.
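One standard post-processing step in the kind of on-payload object-detection pipeline Ng describes is non-maximum suppression, which collapses overlapping detections of the same object into a single box. The sketch below is a generic, illustrative version (not FLIR's implementation); boxes are assumed to be `(x1, y1, x2, y2, score)` tuples.

```python
# Illustrative non-maximum suppression, a common post-processing step in
# object-detection pipelines. Box format assumed: (x1, y1, x2, y2, score).

def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0


def non_max_suppression(detections, iou_threshold=0.5):
    """Keep the highest-scoring box in each cluster of overlapping detections."""
    kept = []
    for det in sorted(detections, key=lambda d: d[4], reverse=True):
        if all(iou(det, k) < iou_threshold for k in kept):
            kept.append(det)
    return kept
```

Running this at the payload, rather than streaming raw detections off-board, is one concrete way "smarter" sensors cut downlink bandwidth and operator workload.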
FLIR is at the forefront of building this processing into its sensors to enable smarter capabilities.
Another new capability Ng is seeing emerge is sensor fusion, which “combines multiple sensors and sensor data – passing data between sensors and combining it – to get much more useful information than any individual sensor can provide,” he explains.
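A minimal sketch of the sensor-fusion idea Ng describes: when two independent sensors estimate the same quantity (say, range to a target from an EO camera and from a radar), inverse-variance weighting produces a fused estimate tighter than either sensor alone. The function and the example values below are invented for illustration; fielded fusion systems typically use full Kalman or particle filters over time.

```python
# Minimal inverse-variance sensor fusion, assuming independent sensors
# measuring the same quantity. Names and values are illustrative only.

def fuse(estimates):
    """estimates: list of (value, variance) pairs from independent sensors.
    Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total
```

The fused variance is always smaller than the best individual sensor's variance, which is the quantitative sense in which fusion yields "more useful information than any individual sensor can provide."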
Edge networking plays a key role here because that’s where things are heading with smart sensors and AI. “Sensors need to be more intelligent – in terms of learning from past data and using analytics to produce results that would normally require a dedicated team to analyze and figure out … but now it’s all built in at the edge with the sensors,” Ng says.
One of the most surprising things recently has been the general acceptance of unmanned systems, “because it took a while to get folks acclimated,” Ng says. “But the adoption and use of neural networks and AI is also surprising because it’s gaining traction quickly. I’m pleased, because this capability is very valuable. In terms of designs, it’s just another natural evolution of sensors becoming smarter.”
Advances on the horizon
Within the next 10 years, expect to see smarter sensors, more autonomous capabilities, and even more analytics.
FLIR’s object detection and recognition will become even more accurate, Ng says, which will “allow unmanned systems’ sensors to do a lot more together to relieve the burden of the soldiers and provide new analytics and capabilities. Sensors, payloads, and systems will predict things like when they’re going to fail and also detect threats. They’ll be able to provide capabilities that aren’t there now. We’re just starting to scratch the surface on those capabilities.”
Reducing the cognitive workload is another trend for unmanned sensor payloads: “Unmanned systems are supposed to be a force multiplier, and we’re starting to get there,” Ng says. “Systems are smart enough to relieve the burden so soldiers can focus on other things or control more systems at the same time, doing missions together, and there’s more interoperation between ground and air assets to perform a task that the operator can direct.”
The industry can also expect to see UASs start to assume roles that are performed by wide-bodied aircraft today. “Airborne early warning, communications relay, and EW are a few examples of applications that will be performed just as well by UASs, which will have the added benefit of not putting lives at risk,” Krishnan says.
The reduction in the size of electronics and the surge in the U.S. Department of Defense’s interest in and use of AI are really driving the art of the possible of what sensors can do on UASs, Krishnan points out.
“Machine-aided decision-making at the tactical edge will be required for the user base to keep up with the growing wealth of sensor payload data that is available,” Krishnan adds. “The ability to allow multiple sensors to collaborate in real time will change the paradigm of how UASs are used. It will allow ground operators to make quicker decisions based on fused information streams, rather than spending time taking multiple disparate data streams and piecing information together.”