Military Embedded Systems

Exploiting big data for defense

August 04, 2021

Emma Helfrich

Technology Editor

Military Embedded Systems

On the battlefield, time can be a dependable driver and a formidable adversary. For this reason, companies supplying military electronics are viewing the services’ all-encompassing decision-making processes through a data-powered lens in hopes of giving the military one thing: actionable information in the shortest possible time frame.

The “big data problem” – too much raw data coming too fast over too many channels – remains sizable in both the defense arena and the commercial realm, but entities on both sides are steadily chipping away at it. Whether a company builds technologies for the beginning or the end of the data-gathering and processing chain, trends driven by user needs are headed in a similar direction.

At the front end of a battle-management system, edge-processing advancements are changing how the U.S. military collects and processes sensor data before relaying anything of importance to ground-control stations. At the end of the data line, innovations in embedded electronics are keeping data at rest transportable and secure.

Industry officials agree that it will be smart handling not only at these start and end points, but at every stop raw data makes in between, that guides the direction of data’s exploitability in the Department of Defense (DoD). Ultimately, data-centric frameworks like Joint All Domain Command and Control (JADC2) and internal pushes for artificial intelligence (AI)-powered systems led by the Joint Artificial Intelligence Center (JAIC) will depend on it.

The first step in making sense of the enormous amounts of data at the DoD’s disposal is the implementation of quick and reliable processors. Whether these processors exist in the field on the edge or in a data center for analysis, without them the big data problem will only keep growing and the promise of cross-domain warfighting could continue to see delays.

Processing at the speed of relevance

Evolutions in the speed of field-programmable gate array (FPGA), graphics-processing unit (GPU), and data-processing unit (DPU) technologies are ongoing, but it is advancements in edge processing – processing performed where the data is collected – that could most dramatically change how the U.S. military uses data to its advantage.

“In the kind of conflicts and the kind of scenarios that the national defense strategy is really driving us to prepare for, what I think then becomes more critical is how you then take that data at an edge sensor where it was collected and ask how we can do as much processing as possible on the edge,” says Bill Conley, chief technology officer at Mercury Systems (Andover, Massachusetts). “So, instead of sending back raw data, you’re sending back insights, you’re sending back information that is much more distilled. And that dramatically reduces the dependency on the communications network that is going to be required for sending it.”
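In code, that edge-side distillation can be as simple as running a detector at the sensor and transmitting only the events it flags. The sketch below illustrates the idea in Python; the sensor feed, threshold rule, and report format are hypothetical stand-ins, not any vendor’s actual pipeline.

```python
# Minimal sketch of edge-side data reduction: instead of transmitting the
# raw sample stream, the edge node sends only distilled detections.
# The sensor feed, threshold, and message format here are hypothetical.
import json

import numpy as np

THRESHOLD = 3.0  # detection threshold in standard deviations (assumed)

def distill(samples: np.ndarray) -> bytes:
    """Reduce a raw sample window to a compact detection report."""
    mean, std = samples.mean(), samples.std()
    peaks = np.flatnonzero(np.abs(samples - mean) > THRESHOLD * std)
    report = {
        "n_samples": int(samples.size),
        "detections": peaks.tolist(),            # indices of anomalous samples
        "peak_values": samples[peaks].round(2).tolist(),
    }
    return json.dumps(report).encode()

raw = np.random.default_rng(0).normal(size=100_000)
raw[[5_000, 42_000]] += 10.0                      # inject two synthetic events
msg = distill(raw)
print(f"raw window: {raw.nbytes} bytes -> over the link: {len(msg)} bytes")
```

On this synthetic feed, an 800 KB raw window collapses to a report of under a hundred bytes – the kind of reduction that eases the communications-network dependency Conley describes.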

Observe, orient, decide, and act: These are the words that make up the acronym known in the DoD as the OODA loop. It is within this loop that warfighters use data to enable not only educated decision-making but timely orientation as well. Without robust data processing, at the edge or otherwise, that time can quickly run out. (Figure 1.)

[Figure 1 | A sailor monitors data units aboard the USS Dwight D. Eisenhower in the Atlantic Ocean. U.S. Department of Defense photo/Navy Petty Officer 3rd Class Gian Prabhudas.]

“The question becomes what to do when you get a piece of data that doesn’t jibe when you go through your orient phase, and what we can do when we go through all of our other intelligence assets and analyst capabilities to make sure that we understand the meaning of that data,” Conley notes. “And in that case, having the ability onboard in an edge platform to do a high-fidelity record storage of that sensor data and bring back that data at the highest possible fidelity to allow those insights to be mined out of it will ultimately allow, at some future point, a commander to make better decisions in that orient phase.”

It can be argued that over the last couple of decades, the U.S. military has been fortunate enough to conduct its operations in relatively permissive environments, which enabled the armed forces both to capture raw data and to access a network adequate for transfer and analysis. However, the DoD is preparing for this situation to change by researching and developing new technologies to mitigate the processing challenges brought on by insufficient networks.

“There have been some exciting new advances that increase the capacity of computing devices to process big data,” says Curt Wu, chief software engineer at Charles River Analytics (Cambridge, Massachusetts). “One is the emergence of highly parallel computational fabrics that are designed to efficiently process wide streams of data without the bottlenecks of a traditional CPU or CPU/GPU architecture. The second is the development of processors supporting lower-precision arithmetic, saving both time and energy.”
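The second point is easy to see in miniature: storing or moving the same signal at reduced precision costs a fraction of the bytes in exchange for a bounded quantization error. The toy comparison below uses made-up data and a deliberately simple int8 scaling scheme.

```python
# A rough illustration of the lower-precision trade-off Wu describes:
# the same array stored at reduced precision costs a fraction of the
# memory (and, on supporting hardware, time and energy) at the price of
# a small quantization error. Data and scale factors here are made up.
import numpy as np

rng = np.random.default_rng(1)
x32 = rng.normal(size=1_000_000).astype(np.float32)   # baseline precision

x16 = x32.astype(np.float16)                          # half precision
scale = np.abs(x32).max() / 127.0                     # simple int8 scheme
x8 = np.round(x32 / scale).astype(np.int8)            # quantize
x8_restored = x8.astype(np.float32) * scale           # dequantize

print(f"float32: {x32.nbytes:>9,d} bytes")
print(f"float16: {x16.nbytes:>9,d} bytes, max err {np.abs(x32 - x16).max():.2e}")
print(f"int8:    {x8.nbytes:>9,d} bytes, max err {np.abs(x32 - x8_restored).max():.2e}")
```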

Because the goal of the DoD’s JADC2 framework is to connect sensors from all military services across domains, transferring data efficiently will be just as critical as processing it. For that reason, tackling a concept as complex as a data-driven communications network shared by every military branch has inspired a multitude of modernization efforts.

Powering JADC2 with data

Passing data across domains will be a huge catalyst for the success of JADC2; the framework will ask a lot of communications, sensor, and processing systems throughout its implementation within the DoD. Legacy systems will be the first to feel the effects of this modernization effort, simply because of the sheer amount of complex data the platforms will have to take on.

“Legacy systems are typically difficult and expensive to upgrade, so the best way to enable efficient transfer of data across these systems is often to proactively reduce the demand on the network,” Wu says. “Intelligent data compression reduces the overhead of data transmission while retaining the information that is necessary for the task. For this to be effective, it is critical to understand mission-oriented data requirements.”
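A toy version of that idea appears below: a mission-aware filter strips the fields the downstream task does not need before a generic compressor runs. The track record format and the relevance rule are invented for illustration only.

```python
# Toy version of "intelligent compression": a mission-aware filter keeps
# only the fields the task needs before generic compression is applied.
# The track record format and relevance rule here are hypothetical.
import json
import zlib

tracks = [
    {"id": i, "lat": 34.0 + i * 1e-4, "lon": -117.0, "alt": 9_000 + i,
     "classification": "unknown", "raw_returns": [0.1] * 50}
    for i in range(200)
]

def mission_filter(track: dict) -> dict:
    """Drop bulky raw returns; keep only what the downstream task needs."""
    keep = ("id", "lat", "lon", "alt", "classification")
    return {k: track[k] for k in keep}

full = zlib.compress(json.dumps(tracks).encode())
lean = zlib.compress(json.dumps([mission_filter(t) for t in tracks]).encode())
print(f"generic compression only: {len(full):,d} bytes")
print(f"mission-aware + generic:  {len(lean):,d} bytes")
```

The design point is the one Wu makes: the biggest savings come not from a better compressor but from knowing, up front, which data the mission actually requires.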

As another possible solution, standards for interoperability are being discussed to better support JADC2’s data-centric requirements. What the military is calling a common data fabric could help define a set of standard services to better carry out JADC2 ambitions.

“It appears that we are going to end up with a collection of federated systems through the different services that are brought together to form the overall JADC2 framework,” Conley says. “With that in mind, I don’t know if it’s reasonable to expect that we will have one single standard for interoperability, but my hunch is that we may end up with a couple different standards and a couple of different tailored cross-domain solutions that will allow us to ensure that we can gateway data between the different environments as required.”

The faster network speeds and rapid gathering of sensor data soon to be enabled by JADC2 efforts will put pressure on companies that specialize in data encryption and storage technologies. Industry pushes for NVMe-type memory and National Security Agency (NSA)-approved solutions are consequently influencing design in this area.

“The challenge that I think we have when viewing JADC2 from a big data side is the ability to say all sensor data must be resident at every security level so that it can be run through some sort of processing or fusion engine,” Conley says. “That can make your big data problem substantially harder if you have to hold it at every security level. The ability to stream data into a necessary processor at the right security level and have those insights come out the far side and move that up and down the chain of command as required to ensure that the right people are getting access to the data that is required is going to be critical.”

Encryption and data at rest

While JADC2 is aiming to streamline the sharing of data across domains, vast amounts of it will remain unanalyzed and cached for later inspection. Where raw data is stored and how it’s kept secure when not in use is a facet of the big data problem that military electronics manufacturers are currently tackling.

“A lot of the data-at-rest side and the security side is largely driven by that capability to collect something during a mission that was unexpected, insightful, and interesting,” Conley says. “And to then have the ability to analyze it offline after the mission, and to be able to bring that back as a recorded file because you may not have access to a large communications network that would allow you to instantaneously stream it. And I think that’s really going to drive the security and the data at rest side for a tactical platform at the edge.”

Recent government mandates that call for the use of NSA-approved encryption devices in DoD initiatives and programs have also increased the demand for data-storage manufacturers to offer more robust storage and security options.

“An increased number of sensors and the amount of data being collected from those sensors require faster networks and larger storage capacity. That’s why there has been the advent of these quicker network speeds, NVMe technology, and then the removable memory cartridges that our devices have that use encryption to secure the data,” says Steve Petric, senior product manager for data storage solutions at Curtiss-Wright (Davidson, North Carolina). (Figure 2.) “And we’re using not only encryption, but certified encryption. Using CSfC or Type 1 certification for encryption allows us to meet NSA-approved solutions for that purpose, and they can transport that data to an offload station and analyze it.”

[Figure 2 | The DTS1 is a small, rugged NAS [network-attached storage] device with two CSfC Component Listed layers of encryption that is used to store large amounts of classified data on manned and unmanned vehicles. Curtiss-Wright photo.]

Commercial Solutions for Classified (CSfC) and the Type 1 certification programs were started by the NSA: Type 1 is a government off-the-shelf program typically used only by U.S. government entities, while CSfC is a two-layer security program used most often by commercial industry. Both are touted by electronics manufacturers as reliable methods for securing data on the move and data at rest.
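Conceptually, the CSfC approach wraps data in two independent layers of encryption so that compromise of either layer alone does not expose the plaintext. The sketch below illustrates only that layering, using the open-source `cryptography` package; it is in no way an NSA-approved or CSfC-compliant implementation, and the key handling shown is deliberately simplistic.

```python
# Conceptual sketch of the two-independent-layers idea behind CSfC.
# Illustrative only: real CSfC solutions pair specific, independently
# validated components with rigorous key management.
from cryptography.fernet import Fernet

inner_key, outer_key = Fernet.generate_key(), Fernet.generate_key()
inner, outer = Fernet(inner_key), Fernet(outer_key)

plaintext = b"mission sensor recording"

# Encrypt twice with independent keys: a compromise of one layer
# alone does not expose the data.
ciphertext = outer.encrypt(inner.encrypt(plaintext))

# Decrypt in reverse order to recover the original data.
recovered = inner.decrypt(outer.decrypt(ciphertext))
assert recovered == plaintext
```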

“The ability to transfer huge amounts of data with removable cartridges that are either single-encrypted with Type 1 or double-encrypted with CSfC, and to be able to move a minimum of 8 TB and upwards of 64 TB or greater, could facilitate the transfer of data back to the ground station for post-mission analysis,” says Paul Davis, former director of product management at Curtiss-Wright. “There they can do some more high-level big data and AI-related analysis to look for not just tactical information but maybe even strategic information.”

The difference between raw data and the actionable information it can become is critical to how that data is applied on the battlefield and across the DoD in general.

The future of democratized data

Data presents a wealth of opportunity not only for the military but for technology progress in general, as it has become clear that the symbiotic relationship between data and AI depends on data’s shareability. Transfer learning, for example – which can be defined as the application of knowledge gained from completing one task to help solve a different, but related, problem – has enabled data collected in the consumer space to be implemented in autonomy engines used in defense.
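In code, transfer learning often amounts to freezing a backbone pretrained on one dataset and training a small new head on the target task. The PyTorch sketch below shows that common pattern using torchvision’s ImageNet-pretrained ResNet-18; the target class count and training data are placeholders, not any defense program’s actual model.

```python
# Minimal transfer-learning sketch: a backbone pretrained on commercial
# imagery (torchvision's ImageNet ResNet-18) is frozen, and only a new
# task head is trained on the target domain. Class count and data are
# arbitrary placeholders.
import torch
import torch.nn as nn
from torchvision import models

NUM_TARGET_CLASSES = 5  # placeholder for the new task's label set

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():      # freeze knowledge learned on
    param.requires_grad = False       # the source task

# Replace the classifier head for the new, related task.
model.fc = nn.Linear(model.fc.in_features, NUM_TARGET_CLASSES)

# Only the new head's parameters are handed to the optimizer.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on dummy data.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_TARGET_CLASSES, (8,))
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()
```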

“Using large amounts of data to drive learning systems is having huge impacts in the commercial space, so there is a lot of interest in seeing how these techniques can be used for national defense,” says Scott Neal Reilly, senior vice president and principal scientist at Charles River Analytics. “There is clearly a lot of interest in AI and machine learning broadly, and deep learning more specifically, by countries like China; the U.S. doesn’t want to fall behind. Steps like standing up the Joint Artificial Intelligence Center (JAIC) and the DARPA AI Next campaign are two examples of how the DoD is trying to figure out how to stay ahead and how to leverage the current state of the art.”

These advancements in processing, cross-domain data sharing, and encryption are all operating under the acknowledgement that widespread access to data will be necessary for the military to, as Conley describes it, move at the “speed of relevance.”

“I agree that if we broadly democratize access to data, we are opening up a lot of security issues,” Conley says. “But at the same time, if we want to realize the sort of vision that the Air Force has laid out in terms of digital engineering, we need to end up with the flow of data and models going from the government to industry, but also vice versa. And that isn’t just a relationship between the government and a prime [contractor], but is a deeper set of relationships through the supply chain. All of these have to come together if we want to move at the speed of relevance that I think the DoD realizes that it has to move today.”

Featured Companies

Mercury Systems

50 Minuteman Road
Andover, Massachusetts 01810

Curtiss-Wright

20130 Lakeview Center Plaza
Ashburn, Virginia 20147

Charles River Analytics

625 Mt. Auburn St.
Cambridge, Massachusetts 02138