Military Embedded Systems

The case for containerization at the tactical edge


June 10, 2019

David Gregory

PacStar

At the tactical edge – let’s define it as the platforms, sites, and personnel operating at lethal risk in a battlespace or crisis environment – one frequent challenge is how to create or host a service or application that can make the difference between life and death for the warfighter. Applications that deliver situational awareness or critical communications are essential to mission success, yet often must perform in environments with limited connectivity and transport options. Containerization can enable users at the tactical edge to dynamically deploy, manage, and secure these critical applications.

It could be argued that the concept of “container-like” application separation has existed for nearly 40 years; however, like many technologies, its benefits were likely not fully appreciated at the outset. In the case of containers, it is really only within the last 10 years or so that portable containers have become practical to implement and support across varying levels of infrastructure.

Containerization is not a novel concept for the U.S. Department of Defense (DoD): Container frameworks enable application developers, data managers, and end users to break down silos and increase the speed and agility with which new solutions can be deployed and existing IT can be modernized, whether for DoD’s broader network modernization efforts or at the tactical edge in battlefield environments.

Government agencies need to modernize legacy application architectures quickly and cost-effectively while remaining within compliance and regulatory requirements. The Docker engine is an open source containerization technology combined with a workflow for building and containerizing applications. Containerization of the kind Docker provides holds the potential for agencies to deploy scalable services, improve security operations, and enhance application reliability – on a wide variety of platforms – to maximize existing capacity, improve flexibility, and manage costs.

Adoption challenges

Low adoption rates of application container technology among government agencies can be traced in part to low awareness of what application containers are, the burden of training, and uncertainty about how to inject container development into traditional application development environments.

It is a common misconception that application containers are a one-for-one replacement for tried-and-true virtual machine (VM) solutions. To summarize – yet not to oversimplify – containers wrap a piece of software in a complete file system that contains everything the software needs to run. The resulting containers ride on top of a shared operating system and can work hand-in-hand with virtual machines.

The Docker engine performs operating-system-level virtualization for creating, deploying, and running application containers. The benefit to the DoD is the potential to rapidly innovate, develop, deploy, and manage application-based solutions while saving money.
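As a simple illustration of that workflow, the sketch below uses the open source Docker SDK for Python (the "docker" package) to launch a containerized application on a host that already runs the Docker engine; the image name, port, and container name are hypothetical placeholders rather than references to any fielded program.

    import docker

    # Connect to the local Docker engine using environment defaults
    # (DOCKER_HOST, TLS settings, and so on).
    client = docker.from_env()

    # Pull and start a hypothetical situational-awareness service. The
    # image carries the app and its dependencies; only the host's shared
    # operating system kernel is reused.
    container = client.containers.run(
        "example/situational-awareness:1.0",  # hypothetical image name
        detach=True,                          # run in the background
        ports={"8080/tcp": 8080},             # publish the app's port on the host
        name="sa-service",
    )

    print(container.status)  # e.g. "created" or "running"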

Training – yet another technology

One of the largest and most significant barriers to adopting containerization in the DoD is that it is yet another technology that requires training.

In general, containers isolate software from its surroundings; however, all applications still depend on the underlying IT infrastructure to function (e.g., routing, switching, processing, storage). Broadly put, basic IT systems continue to increase in complexity and drive training requirements. Hyperconverged infrastructures, 100 Gb networks, and petabyte storage systems are finding a firm spot in today’s intertwined “systems of systems.”

It is often necessary to train soldiers to maintain outdated applications, infrastructure, and security architectures. Additionally, training for computer science, application development, development operations (DevOps), and security operations (SecOps) must be focused on fundamentals due to time and budget constraints, leaving little to no room for training on emerging solutions.

Not a bolt-on technology

Docker is not necessarily something that intuitively bolts onto existing development environments or complies with existing security tactics, techniques, and procedures.

All of an application’s dependencies are accounted for when building and configuring Docker containers. Doing so eliminates many, but not quite all, system environmental dependencies. As a result, many dependencies formerly handled at the sysadmin level are instead managed at the development level. Additionally, as of this writing, there is no Security Technical Implementation Guide (STIG) nor any commonly accepted DoD security best practices for Docker or container technologies in general.
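To illustrate the point about dependencies, the following sketch uses the Docker SDK for Python to build an application image from a project directory whose Dockerfile pins the app’s runtime, libraries, and settings; the directory path and image tag are hypothetical.

    import docker

    client = docker.from_env()

    # Build an image from a project directory whose Dockerfile declares the
    # application's runtime, libraries, and settings. The dependency list
    # lives with the source code (the development level), not on the host
    # (the sysadmin level). Path and tag are hypothetical.
    image, build_logs = client.images.build(
        path="./sa-app",            # hypothetical project directory
        tag="example/sa-app:1.0",   # hypothetical image tag
        rm=True,                    # discard intermediate build containers
    )

    # Echo the build output so dependency problems surface at build time.
    for entry in build_logs:
        if "stream" in entry:
            print(entry["stream"], end="")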

Automation and orchestration

A powerful feature of modern container technologies is multihost cluster orchestration: automated deployment operations, auto-scaling up and down, self-healing functions, auto-placement, load balancing, and more. Granular container automation has a substantial advantage over VM-based app deployments: where VMs may take a few minutes to launch with a hosted application, containers can deploy in seconds. However, with this level of automation comes increased complexity and troubleshooting challenges. The configurable features are vast and can include application-specific settings. As such, more specialized training is needed to effectively develop and deploy containerized applications.
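As a sketch of what that orchestration looks like in practice – assuming an existing Swarm cluster and using the Docker SDK for Python with a hypothetical image and service name – a replicated service can be requested in a few lines, after which placement, restarts, and load balancing are handled automatically:

    import docker

    client = docker.from_env()

    # On a manager node of an existing Swarm cluster, ask the orchestrator
    # for three replicas of a hypothetical app. Placement across hosts,
    # restart on failure, and load balancing of the published port are
    # handled by the cluster, not by an administrator.
    service = client.services.create(
        image="example/sa-app:1.0",   # hypothetical image
        name="sa-app",
        mode=docker.types.ServiceMode("replicated", replicas=3),
        endpoint_spec=docker.types.EndpointSpec(ports={8080: 8080}),
    )

    print(service.name, "created")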

Given the tech literacy gap and lack of familiarity with this solution, there is not currently a no-cost or low-cost path for government adoption of a containerization platform like Docker. However, it remains an option with enormous potential relative to the required level of investment.

Benefits of containers

Despite the challenges outlined above, expanded adoption of Docker could be transformational in improving mission capabilities in the DoD. Working with Docker containers enables developers to bundle up an application with all necessary dependencies – such as system tools, system libraries, and settings – in one complete package, making it portable across multiple network environments. Containers effectively isolate software from the surrounding environment, and if embraced by agencies for application development, Docker offers an ecosystem that can unlock greater containerization efficiencies and benefits.

Reduced SWaP – think tactical

Historically, tactical communications equipment has been constrained by a number of factors that make it bulkier, heavier, and less versatile than agile warfighters demand. For example, it is not uncommon to have separate hardware dedicated to individual functions. Containerized apps realistically require far fewer resources (such as memory and CPU) than traditional virtualized app hosting options, meaning that an optimized container solution could dramatically reduce the physical app footprint. Additionally, well-constructed containers are portable across a myriad of hardware platforms, including lightweight small-form-factor computers and Arm-based IoT devices. For that reason, size, weight, and power (SWaP) can be reduced and/or more apps can run on the same existing hardware: doing more with less.
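The sketch below, again using the Docker SDK for Python with a hypothetical image, shows how a containerized app can be capped to a small slice of memory and CPU so that several apps can share one small-form-factor computer:

    import docker

    client = docker.from_env()

    # Cap a hypothetical containerized app at a fraction of a small-form-
    # factor computer's resources: 256 MB of RAM and half of one CPU core.
    # If the image is also built for Arm, the same container definition can
    # run on Arm-based IoT-class devices.
    container = client.containers.run(
        "example/sa-app:1.0",      # hypothetical image
        detach=True,
        mem_limit="256m",          # hard memory cap
        nano_cpus=500_000_000,     # 0.5 CPU, expressed in units of 1e-9 CPUs
        name="sa-app-low-swap",
    )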

Automation and orchestration

Docker comes equipped with automation and orchestration capability – called Docker Swarm, or just Swarm – that can be enabled to provide highly available apps with self-healing, auto-scaling, auto-replication, auto-distribution, and load-balancing services across disparate network architectures. A well-configured Swarm can orchestrate the auto-placement of application instances to respond to increasing service requests, or a node outage, without human interaction. Additionally, in many circumstances Swarm components can be added or removed dynamically in production without requiring a maintenance window or disrupting services.
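Continuing the earlier hypothetical service, the sketch below shows how a Swarm-managed app might be scaled and updated in place using the Docker SDK for Python; the service name and image tag are assumptions carried over from the previous examples:

    import docker

    client = docker.from_env()

    # Scale the earlier hypothetical service up in response to demand; the
    # Swarm manager spreads the new replicas across available nodes and
    # keeps the count at five even if a node drops out (self-healing).
    service = client.services.get("sa-app")
    service.scale(5)

    # Roll out a new image version without a maintenance window; Swarm
    # replaces replicas incrementally while the others keep serving traffic.
    service.update(image="example/sa-app:1.1")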

High availability for tactical applications

The modern warfighter needs modern battlefield technology that is secure, small, quick, effective, and resilient. Deploying applications needed for life-or-death situational awareness is difficult, as resources are scarce in tactical environments. Processing power is at a premium at the tactical edge, and real limitations exist. It is not always feasible – and certainly not desirable – to simply add hardware and software (increasing SWaP) to achieve high availability. (Figure 1.) An integrated container ecosystem may be able to provide some level of application resiliency even with limited processing and communication transports.

Figure 1 | Real limitations exist when processing data at the tactical edge. Users need battlefield technology that is secure, small, and often mobile.


Users can now take advantage of a growing ecosystem of well-supported third-party software vendors offering configurable, prepackaged containers; software-defined networking (SDN), storage convergence, distributed file systems, and security monitoring are a few key capabilities joining the ranks. These containerized apps, along with greater automation and configuration ability, provide developers more options to build effective applications.

It is feasible – given the SWaP advantages of containers combined with their automation, portability, and orchestration abilities – that a containerization platform like Docker, with a well-configured Swarm, could provide a highly available multi-app hosting ecosystem across dissimilar hardware, offering multi-instance, load-balanced access and resiliency.
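One way such a mixed-hardware ecosystem might be expressed – assuming a Swarm whose nodes have been labeled by architecture, and again using the Docker SDK for Python with hypothetical images, service names, and labels – is with placement constraints:

    import docker

    client = docker.from_env()

    # Pin a hypothetical sensor app to Arm-based edge nodes and let a
    # second app float across available x86 capacity, using node labels an
    # administrator is assumed to have applied beforehand.
    client.services.create(
        image="example/sensor-gateway:1.0",
        name="sensor-gateway",
        constraints=["node.labels.arch == arm64"],
        mode=docker.types.ServiceMode("replicated", replicas=2),
    )

    client.services.create(
        image="example/sa-app:1.0",
        name="sa-app-x86",
        constraints=["node.labels.arch == x86_64"],
        mode=docker.types.ServiceMode("replicated", replicas=3),
    )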

Broader awareness of the versatility of Docker and its use cases will spur broader adoption: Once that happens, the ability to innovate quickly and develop relevant applications inside that environment will increase dramatically. Containerization should also reduce costs, beginning with application development, application portability, system management, and system versatility. Less time should be spent customizing applications for specific devices and IT infrastructure and more time spent implementing features.

Potential use cases

At the tactical edge, containerization technology is optimal because it could enable end users to dynamically deploy, manage, and secure critical applications, all with built-in assurances of availability.

Military users could use containerization to “auto-magically” deploy critical apps across any available processing asset with capacity and connectivity. Data stores could be geographically distributed and take advantage of software-defined storage strategies and convergence. Distributed application services could be pushed down to end-user devices. Entire systems could be rapidly deployed and deconstructed for individual missions, temporary operational support, or even anonymous intelligence collection.

David Gregory is director, strategic initiatives, at PacStar. He previously served as director of research and development at Intelligent Waves. David earned his MS degree in computer science/digital design from George Mason University and his BS in electrical engineering from University of Wyoming College of Engineering and Applied Science.

 
