Military Embedded Systems

AI and the OODA loop: How AI enhances strategic decisions for today’s warfighters


June 21, 2024

Tim Stewart


New technologies, tactics, techniques, and procedures are becoming increasingly vital on the battlefield. The U.S. military is acutely aware of this and is integrating lessons learned from recent military operations in Ukraine, as well as other conflicts, into Combined Joint All Domain Command and Control (CJADC2), a warfighting concept through which the U.S. Department of Defense (DoD) seeks to achieve decision superiority.

Decision superiority is crucial in winning a battle: It strengthens the warfighter’s capability to “sense, make sense of, and act” (SMSA, in DoD parlance) at all levels and phases of war, across all domains, and with allied partners to swiftly deliver critical information. Realizing this warfighting concept to its fullest potential requires a clear path toward attaining it, and artificial intelligence (AI) is that path. AI enables decision superiority by helping to greatly reduce certain risks found in the extremely complex scenarios of warfighting.

Analyzing risk in today’s military operations

Knowing how to identify, analyze, and address critical challenges and risks, as well as develop potential technical solutions, is important in today’s military operations. This is especially true since most breaching operations will likely be conducted using unmanned or optionally manned systems. Being able to recognize, then utilize, these factors as a strategic asset will help to advance the CJADC2 initiative and improve decision superiority across military missions.

In pursuit of these goals, large amounts of data will need to be secured and transmitted across tactical networks to synchronize reconnaissance, security, and logistics, as well as to maneuver vehicles, direct fires, and support other warfighting functions. Overall, CJADC2 aims to unify the massive quantities of data that different military divisions gather across a broad range of distributed sensors, and enable it to be processed into actionable information.

Those with strategic, operational, and tactical interests in the mission must use the most relevant information they receive to execute their missions. Held together by a robust set of communication and data sharing initiatives, CJADC2 aims to get the right information into the right hands, enabling organizations to achieve enhanced effects in their specific areas of responsibility.

How data helps with the risk

To apply and assess technology in the decision-making process, a model is required. The OODA loop – Observe, Orient, Decide, Act – developed by U.S. Air Force Colonel John Boyd, is well known and widely accepted, and this four-step process is followed closely in the warfighter community today.

  1. Observe your environment and gather information.
  2. Orient yourself to that information by analyzing it and understanding its meaning.
  3. Decide on what action to take based on your observations and orientation.
  4. Act on your decision.

The OODA loop emphasizes the importance of speed and agility in decision-making and action-taking. The goal is to complete the loop as quickly and efficiently as possible so that you can adapt to changing circumstances and take advantage of opportunities as they arise. (Figure 1.)
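As a rough illustration, the four steps above can be sketched as a simple software loop. All function names, thresholds, and data below are invented for illustration; no fielded system works this simply:

```python
# Minimal, illustrative OODA loop: each pass observes a batch of raw
# sensor readings, orients by classifying them, decides on a response,
# and acts. All values and labels here are hypothetical.

def observe(feed):
    # Observe: gather the next batch of raw readings from the environment.
    return next(feed)

def orient(readings):
    # Orient: analyze readings, flagging anything above a threshold.
    return ["threat" if r > 0.8 else "clear" for r in readings]

def decide(assessment):
    # Decide: choose an action based on the oriented picture.
    return "engage" if "threat" in assessment else "hold"

def act(decision, log):
    # Act: execute the decision (here, simply record it).
    log.append(decision)

def run_ooda(batches):
    feed = iter(batches)
    log = []
    for _ in range(len(batches)):   # one full loop pass per batch
        readings = observe(feed)
        assessment = orient(readings)
        decision = decide(assessment)
        act(decision, log)
    return log

actions = run_ooda([[0.2, 0.3], [0.9, 0.1], [0.4, 0.5]])
print(actions)  # ['hold', 'engage', 'hold']
```

The point of the sketch is structural: each pass through the loop is one complete sense-to-action cycle, and shortening any step shortens the whole cycle.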

[Figure 1 ǀ Applying AI at the edge (AIAE) to the OODA loop speeds decision-making capabilities for improved decision superiority.]

Note that the OODA loop and CJADC2’s SMSA cycle share similarities in decision-making. Both emphasize rapid information processing, adaptation, and action. OODA’s observation and orientation mirror CJADC2’s “sense,” focusing on intelligence-gathering and understanding. OODA’s decision and action align with CJADC2’s “make sense of” and “act,” where decisions are based on analyzed data and executed with agility. Both prioritize agility and adaptability in decision-making to maintain superiority in contested domains.

AI’s role in the OODA loop

In the observe phase, AI-powered sensors, drones, satellites, and surveillance systems collect vast amounts of data from the battlefield. These data streams provide commanders with real-time situational awareness, offering insights into enemy movements, terrain conditions, and friendly-force positions. AI algorithms process this data, filtering out noise, identifying relevant patterns, and detecting potential threats, enabling commanders to rapidly assess the evolving situation.

During the orient phase, AI processing facilitates the analysis and synthesis of the gathered information. Machine learning algorithms analyze historical data, identify trends, and generate predictive models to anticipate enemy intentions and behavior. By leveraging AI-driven analytics, commanders gain a deeper understanding of the operational environment, enabling them to formulate effective strategies and courses of action to achieve mission objectives.

In the decide phase, AI algorithms provide decision support by evaluating multiple courses of action and assessing their potential outcomes. By simulating different scenarios and conducting probabilistic risk assessments, AI processing assists commanders in making informed decisions under uncertainty. These decision support tools enable commanders to weigh the risks and benefits of various options and select the optimal course of action to achieve tactical or strategic goals.

In the act phase, AI processing facilitates the execution of chosen courses of action, enhancing the speed, precision, and coordination of military operations. Autonomous systems, guided by AI algorithms, perform tasks such as targeting, logistics, and maneuvering with minimal human intervention. By automating routine tasks and optimizing resource allocation, AI-driven systems enable military forces to maintain operational tempo and exploit fleeting opportunities on the battlefield.

Edge processing for more actionable data

One of the most important considerations is that decisions must be made at the edge – that is, close to the sensors rather than remotely at a command center. Moving processing to the edge greatly reduces latency and enhances flexibility in a rapidly changing environment, making the warfighter nimble and ready to respond.

Having AI at the edge (AIAE) enables the deployment of AI algorithms on devices physically close to the data source. Connecting sensors directly to AIAE devices, for example, greatly reduces latency between the observe and orient steps of the OODA loop.

AIAE processing and decision-making markedly reduce latency between the orient and decide steps as well, since there is no need to transmit large amounts of data to an external center for additional decision-making steps and then wait for the decision to be sent back. Issuing the “act” command over an AIAE infrastructure reduces latency for the decide-act steps for the same reason.
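A back-of-envelope budget makes the latency argument concrete. Every number below (link delay, data volume, uplink throughput, inference time) is a hypothetical placeholder, not a measured value from any fielded system:

```python
# Illustrative latency budget: processing at the edge versus shipping
# sensor data to a remote center and waiting for the decision.
# All figures are invented for illustration only.

LINK_LATENCY_S  = 0.050   # one-way tactical-link delay (50 ms)
SENSOR_DATA_MB  = 40.0    # sensor data volume per decision cycle
UPLINK_MB_PER_S = 10.0    # effective uplink throughput
COMPUTE_S       = 0.020   # inference time (similar hardware assumed at both ends)

# Remote path: upload the data, wait for compute, receive the "act"
# command back over the same link.
remote_s = (SENSOR_DATA_MB / UPLINK_MB_PER_S) + LINK_LATENCY_S \
           + COMPUTE_S + LINK_LATENCY_S

# Edge path: the data never leaves the platform; only compute remains.
edge_s = COMPUTE_S

print(f"remote: {remote_s:.3f} s, edge: {edge_s:.3f} s")
```

With these placeholder numbers the remote path is dominated by the data upload, which is exactly the term that edge processing eliminates.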

Enabling AI in military operations

Graphics processing units (GPUs) are designed to handle large amounts of data in parallel and are widely used for AI applications. Because of advances in technology, small-form-factor, higher-performance computers are being brought to market with the combination of general-purpose graphics processing units (GPGPUs) and central processing units (CPUs) that can be used for AIAE applications.

By leveraging the parallel processing power of GPUs, GPGPUs can accelerate a wide range of AI applications. (Figure 2.)

[Figure 2 ǀ A GPU’s parallel processing architecture enables faster computation than a CPU, facilitating the use of AI in military applications.]

What can be used?

The NVIDIA Jetson family of modules combines AI-capable GPGPUs with multicore CPUs in a tightly coupled, high-performance, low-power supercomputer that can support AI processing capabilities and decision-making application software. The Jetson family offers several modules with different form factors, performance levels, and maximum power options. (Figure 3.)

[Figure 3 ǀ The NVIDIA Jetson family includes different modules with various form factors, performance, and maximum power options.]

The NVIDIA Jetson Xavier NX module provides 6 TFLOPS (trillion floating-point operations per second) of performance with a maximum power of 15 watts – performance comparable to a several-hundred-watt workstation with processor and GPU cards.

This kind of computing architecture can process and apply AI algorithms to 20-plus high-definition video inputs at 1080p resolution and 30 frames per second. It has enough bandwidth to run AI applications servicing multiple high-definition cameras in a single system.
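Those figures imply a per-frame compute budget that is easy to work out. This is a rough calculation that treats the rated 6 TFLOPS as fully usable peak throughput, which real workloads never achieve:

```python
# Rough per-frame compute budget for the cited workload.
streams = 20          # concurrent high-definition video inputs
fps = 30              # frames per second per stream
peak_flops = 6e12     # module's rated 6 TFLOPS (peak, not sustained)

frames_per_second = streams * fps                   # aggregate frame rate
budget_per_frame = peak_flops / frames_per_second   # FLOPs available per frame

print(f"{frames_per_second} frames/s aggregate, "
      f"{budget_per_frame / 1e9:.0f} GFLOPs per frame at peak")
# 600 frames/s aggregate, 10 GFLOPs per frame at peak
```

Roughly 10 GFLOPs per frame at peak is in the range of a compact convolutional inference model, which is why this class of module can plausibly service that many camera streams.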

From the size, weight, and power (SWaP) plus performance perspective, a ruggedized NVIDIA Jetson Xavier NX module is suitable for use in AIAE applications because units can be as small as 4 inches by 2.3 inches by 3.9 inches, weigh as little as 1.3 pounds, and consume just 15 watts of power. (Figure 4.)

[Figure 4 ǀ Compact, high-performance AI supercomputers process vast amounts of sensor data at the edge.]

Applications with more demanding performance needs can use larger, higher-power rugged solutions based on higher-performance NVIDIA Jetson modules. These rugged GPGPU-based units can also support industry-standard interfaces, including Ethernet (1 GbE and/or 10 GbE), CAN bus, and serial ports. An Ethernet interface can serve as a communication channel to other “smart” boxes and mission computers in the system; it can also be used for interactions with external equipment through wireless communications converters. If low-latency Ethernet communication is required, time-sensitive networking (TSN) or time-triggered Ethernet (TTE) can be used.

Using an Ethernet network for internal communications enables the user to implement multiple redundancy levels, from physical cables to routers and packets. Implementing the IEEE 1588 time-distribution protocol across the network enables all network elements to be synchronized to a single time source.
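The core of IEEE 1588 synchronization is a simple two-way exchange: the follower clock records four timestamps from a Sync/Delay_Req message pair and solves for its offset from the master, assuming the network path is symmetric. A minimal sketch of that calculation, with timestamps invented for illustration:

```python
# IEEE 1588 (PTP) offset and delay from one Sync + Delay_Req exchange.
# t1: master sends Sync; t2: follower receives it;
# t3: follower sends Delay_Req; t4: master receives it.
# Basic PTP assumes the path delay is the same in both directions.

def ptp_offset_and_delay(t1, t2, t3, t4):
    offset = ((t2 - t1) - (t4 - t3)) / 2   # follower clock minus master clock
    delay  = ((t2 - t1) + (t4 - t3)) / 2   # one-way path delay
    return offset, delay

# Example: follower clock runs 1.5 ms ahead; true one-way delay is 0.5 ms.
t1, t2 = 100.0000, 100.0020   # t2 = t1 + delay + offset
t3, t4 = 100.0100, 100.0090   # t4 = t3 + delay - offset
offset, delay = ptp_offset_and_delay(t1, t2, t3, t4)
print(f"offset {offset * 1e3:.3f} ms, delay {delay * 1e3:.3f} ms")
```

Real PTP implementations repeat this exchange continuously and filter the results, but the arithmetic above is what lets every node on the network steer toward a single time source.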

Besides high-speed sensor processing, GPGPU units can be used to process data coming from lower-speed sensors – analog I/O, discrete I/O, serial ports, etc. Combining these functions into one AIAE unit can help to eliminate additional electronics boxes and associated harnesses in a vehicle, further reducing the SWaP of electronics equipment.

AI is the advantage in decision superiority

For the DoD, achieving the CJADC2 vision begins with demanding that industry be able to connect every current sensor that can support battlespace awareness, as well as make sensor data available to any potential user, at any level of operation. This data-sharing construct can create secured battlespace awareness, wherein actions in one part of the single, integrated, global battlespace can be understood and inform actions and decisions required in other areas.

The integration of AI processing within the OODA loop framework has revolutionized military decision-making, providing commanders with unprecedented capabilities to adapt, react, and prevail in complex and contested environments. By leveraging AI-driven technologies, the U.S. military has enhanced its ability to maintain a competitive edge on the modern battlefield, ensuring superiority in an era defined by rapid technological innovation and strategic competition.

Tim Stewart is Director, Business Development, at Aitech. He has 20 years of experience in high-technology hardware, software, and network products, with 11 years as a relationship executive managing requirements and challenges of companies seeking partnerships and critical corporate development. Tim holds a BS in mechanical engineering and physics from Boston University.

