Military Embedded Systems

GUEST BLOG: How the military can speed data mobility for smart decisions on the move

Blog

February 03, 2023

Travis Steele

Red Hat

The military is producing ever-larger amounts of valuable data from embedded systems at the edge. Now it needs more effective ways to share, understand, and act on that data – at or near the user and as close as possible to real time. By leveraging available technologies like containerization and tools to manage them, warfighters have the opportunity to make faster, smarter decisions where and when they’re needed to help achieve battlespace dominance.

In the recent past, the U.S. achieved battlespace dominance through the technical superiority of its weapons. Going forward, U.S. forces will face more near-peer, if not actual peer, adversaries across multiple domains. Dominance will increasingly be attained through information advantage and the ability to gather, process, and act on data – and do it faster than the adversary.

To achieve that goal, data-driven analysis and decision-making must occur in real time, very close to where these decisions are needed. Even as more data flows in from mobile devices, cameras, sensors, satellites, unmanned equipment, artificial intelligence (AI) systems, and the like, mission operators still need ways to turn that data into insights and information that lead to the right actions. Just as important, they need to quickly share data across every domain – land, air, sea, space, and cyberspace.

Fortunately, data-mobilizing technologies exist today in the form of containerization based on microservices architectures, platforms to automatically configure applications and networks, and API interchanges to translate data to and from applications.

Quickly deploying applications

Most edge solutions face multiple limiting factors such as size, weight, and power (SWaP), as well as denied, disrupted, intermittent, and limited (D/DIL) communication. Often, the decrease in resources is accompanied by a proliferation of devices. Therefore, workloads need to be portable across a broad range of systems, from the source of the data all the way to the data center.

A microservices architecture can help address portability challenges while also improving capability at the mission-edge tiers. Microservices are modular components that together make up a robust software application. Such an approach enables the user to easily create services and spin them up or down on demand. If an application needs to scale or update because of mission needs or changes, microservices help you respond rapidly.
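
As a minimal sketch of what one such modular service might look like, consider the Python snippet below. It assumes the Flask web framework and a hypothetical "track status" service; the point is that the service owns one narrow capability and can be deployed, scaled, or replaced independently of every other component.

```python
# A minimal sketch of one microservice, assuming the Flask framework and a
# hypothetical "track status" service.
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory state; a fielded service would use a data store.
TRACKS = {"alpha": {"status": "active"}, "bravo": {"status": "stale"}}

@app.route("/tracks/<track_id>")
def get_track(track_id):
    """Return the status of a single track, or 404 if it is unknown."""
    track = TRACKS.get(track_id)
    if track is None:
        return jsonify({"error": "unknown track"}), 404
    return jsonify(track)

@app.route("/healthz")
def health():
    """Liveness endpoint an orchestrator can poll before routing traffic."""
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```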

Containerization is an effective way to employ your microservices architecture. Containers are lightweight and combine an application with the frameworks and libraries it requires to run on various platforms across the mission edge. This functionality is key when considering applications and systems that will support agile combat employment or agile ops, as it enables the user to move applications and their associated workloads around while ensuring consistency and interoperability across environments.
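
Because the container image carries the runtime, frameworks, and libraries with it, the only thing that should change between environments is configuration. One common pattern, sketched below with hypothetical variable names, is to read that configuration from environment variables injected by the container runtime, so the same image runs unchanged from a forward-deployed node to the data center.

```python
# A minimal sketch of environment-driven configuration (variable names are
# hypothetical). Only these injected values differ per deployment.
import os

def load_config() -> dict:
    """Read per-environment settings injected by the container runtime."""
    return {
        # Port this instance should listen on.
        "port": int(os.environ.get("SERVICE_PORT", "8080")),
        # Upstream data source: a local sensor gateway at the edge,
        # an aggregation service in the data center.
        "data_source_url": os.environ.get("DATA_SOURCE_URL", "http://localhost:9000"),
        # Whether to buffer locally for D/DIL conditions.
        "offline_buffering": os.environ.get("OFFLINE_BUFFERING", "true") == "true",
    }

if __name__ == "__main__":
    print(load_config())
```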

Given the proliferation of edge devices and the ability to use containers to run mission-critical workloads close to the warfighter, what happens to all these containers? That’s where Kubernetes comes in: Kubernetes is an open-source orchestration platform for managing containers at scale that can help automate, track, and schedule container deployments to keep systems running uninterrupted. It can also be optimized in a lightweight format to meet the SWaP needs of users in the field.
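
As an illustration of what that orchestration can look like programmatically, the sketch below uses the official Kubernetes Python client to scale a containerized workload; the deployment and namespace names are hypothetical, and in practice much of this is handled declaratively by the platform itself.

```python
# A sketch using the official Kubernetes Python client (pip install kubernetes).
# The deployment and namespace names are hypothetical.
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    """Ask the cluster to run `replicas` copies of a containerized workload."""
    config.load_kube_config()  # use config.load_incluster_config() when running inside a pod
    apps = client.AppsV1Api()
    apps.patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )

if __name__ == "__main__":
    # Scale a hypothetical sensor-fusion service up ahead of a mission.
    scale_deployment("sensor-fusion", namespace="edge-ops", replicas=3)
```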

Automatically configuring networks and data storage

Applications are not the only technology consideration at the edge: Warfighting missions need to dynamically combine the networks and data storage of each domain, and those resources must be able to reconfigure rapidly and adapt to change as missions launch and evolve.

Such is the advantage of software-defined networks and data storage, as a software-defined approach enables continuous alteration of the infrastructure at the software layer without having to change the underlying hardware.

A software-defined approach requires time and technical expertise, but an automation platform can replace manual process steps and supply that expertise. Instead of a technician typing keyboard commands while under duress, specialized software can couple technical expertise with sequenced commands and automatically reconfigure networks and data storage – resulting in operational adaptability.
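
The sketch below illustrates the idea of a codified runbook: the sequenced steps a technician would otherwise type by hand, executed in order by the automation platform. Every function here is a hypothetical placeholder, not a real product API.

```python
# A hypothetical sketch of a codified runbook, executed in order by an
# automation platform. All steps are placeholders.
def verify_link(node: str) -> bool:
    print(f"checking link to {node}")
    return True

def update_routes(node: str) -> bool:
    print(f"pushing new routes for {node}")
    return True

def remount_storage(node: str) -> bool:
    print(f"reattaching storage volumes for {node}")
    return True

RUNBOOK = [verify_link, update_routes, remount_storage]

def reconfigure(node: str) -> None:
    """Run each runbook step against a node; a step returning False aborts the run."""
    for step in RUNBOOK:
        if not step(node):
            raise RuntimeError(f"{step.__name__} failed for {node}")

if __name__ == "__main__":
    reconfigure("forward-node-07")
```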

Such automation enables resources to become “event-driven.” Has a new unit mobilized, thereby changing the understanding of the current and expected environment? The automation platform automatically integrates its network and data resources. What if a cyberattacker flooded a critical node with denial-of-service (DoS) traffic? The automation platform takes down the affected resource and deploys a new one.
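
A simple way to picture event-driven automation is as a mapping from event types to remediation actions, as in the hypothetical sketch below; the event names and actions are illustrative only.

```python
# A hypothetical sketch of event-driven remediation: monitoring events are
# mapped to automated actions. Event types and actions are illustrative only.
def integrate_unit_resources(event: dict) -> None:
    print(f"attaching network and data resources for {event['unit']}")

def replace_flooded_node(event: dict) -> None:
    print(f"isolating {event['node']} and deploying a replacement")

HANDLERS = {
    "unit_mobilized": integrate_unit_resources,
    "dos_detected": replace_flooded_node,
}

def on_event(event: dict) -> None:
    """Dispatch an incoming event to its automated remediation, if one exists."""
    handler = HANDLERS.get(event["type"])
    if handler is not None:
        handler(event)

if __name__ == "__main__":
    on_event({"type": "dos_detected", "node": "relay-03"})
```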

Rapidly translating data for every domain

Rapid deployment of applications and networks is necessary for data mobility at the edge, but that’s not enough: Domains must also be able to quickly and easily share data.

The problem: isolated, non-standards-based application development and data formats have plagued the military service branches for years. This not only creates interoperability challenges; at a technical level, it also means applications and systems have no awareness of one another.

In the past, military teams addressed this issue in a centralized way, translating every mission data type into every other mission data type, a slow and resource-intensive process. Moreover, as data assets are added, the burden on systems only gets worse.

A modern approach is to leverage application programming interfaces (APIs). An API interchange uses a key to share data among various applications or sources without having to know anything about the applications themselves.

Using this approach, wherever data is produced – whether by an application, a sensor, or a satellite – the API provides a way to access that information and to input data. Then, wherever the data is consumed – by an AI algorithm, an analytics solution, or a weapons system – it’s translated from one application to another. This process makes data discoverable between domains and provides secure access to understand and act on data from every other domain.
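
The sketch below shows the consumer side of such an exchange, using the Python requests library; the endpoint URL, header name, and key are hypothetical. The consumer authenticates with its key and receives data in a common format without needing to know which system produced it.

```python
# A minimal sketch of a consumer pulling data through an API interchange.
# The endpoint URL, header name, and key are hypothetical.
import requests

INTERCHANGE_URL = "https://interchange.example.mil/tracks"  # hypothetical endpoint
API_KEY = "replace-with-issued-key"                         # key issued to this consumer

def fetch_tracks() -> list:
    """Pull track data; the consumer never needs to know which system produced it."""
    response = requests.get(
        INTERCHANGE_URL,
        headers={"X-API-Key": API_KEY},  # the key authorizes this consumer
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # data arrives in a common, producer-agnostic format

if __name__ == "__main__":
    for track in fetch_tracks():
        print(track)
```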

Speeding data mobility at military scale

These technologies can revolutionize data consumption and decision-making across the spectrum of mission scenarios, as they’re secure and scalable. Some elements, such as portable containers, have been used by industries – for instance, automotive, telecommunications, and financial services – for nearly a decade. Finally, they can be built in a way that’s compliant with guidance from Common Criteria, the National Institute of Standards and Technology (NIST), and Federal Information Processing Standards (FIPS).

Travis Steele is a chief architect for Red Hat. He has more than two decades of expertise in technology leadership and transformation, with a focus on IT operations, infrastructure and applications, cloud delivery, and cybersecurity.

Red Hat · https://www.redhat.com/en
