Military Embedded Systems

Failure is not an option: the trends behind military test systems

September 09, 2020

Emma Helfrich

Technology Editor

Before any military technology can be fielded, it must undergo a barrage of test and measurement (T&M) scenarios to ensure its operability. The reason these evaluation systems are so critical is simple yet profound: The lives of warfighters depend on them. In recent years, varying demands and challenges have pushed manufacturers to innovate. Military T&M systems are going digital, becoming software-defined, and undergoing widespread standardization right alongside so much of the technology used by the U.S. Department of Defense (DoD) today. This inexorable march prompts the question of what takes priority for the DoD: ensuring compatibility with reliable legacy systems or moving forward with the wave of standardization?

Industry officials might argue that the DoD’s desire to keep the military’s trusted legacy systems fielded, without having to proliferate test systems unique to each capability, has fostered a culture of adaptability. What seems to have thrown a wrench in the advancement of T&M systems, however, is a skewed consideration of longevity needs. But that isn’t to say progress has halted completely.

If anything, maybe it’s a good problem for the DoD and technology manufacturers to have. Producing military assets that provide such a high level of longevity while maintaining and increasing current capabilities could not only result in long-term safety for the end user but also put a premium on test systems that can adapt over time to changing capabilities or test requirements.

Because of this reality, companies in the industry are seeing a widespread push toward test systems that can be customized or evolved through software. Companies are being asked for systems able to perform both component-level and system-level tests, all of which require a multitude of I/O, data-storage, and streaming capabilities. With such huge asks, issues can arise in areas such as obsolescence management and technology insertion.

Maintaining backward compatibility

Backward compatibility in T&M systems, industry officials argue, is generally considered a necessity for avoiding often-costly software upgrades. DoD initiatives aimed at keeping legacy assets in the field still have skin in the game even during budget shifts, but at the same time funding for interoperable test systems is being allocated elsewhere.

“With ensuring backward compatibility in T&M systems and keeping legacy systems in the field, I don’t think the DoD has the luxury of choosing one or the other,” says Nick Butler, chief marketing lead for aerospace and defense at NI (National Instruments – Austin, Texas). “And it’s challenging for them. They’re asked to take a budget and make it work across both new platforms and new-product introduction as well as maintaining decades-old assets.”

The DoD is tasked with maintaining that backward compatibility while looking for opportunities to standardize its equipment to make it useful across assets. Doing all of this while performing technology insertion and handling life cycle management for dozens of legacy T&M systems can prove to be a challenge. Let’s just say the road to both backward compatibility and standardization of test systems is paved with good intentions.

“Ensuring backward compatibility leads to maintainability, which in turn provides reliability in the field. So the budget precedence is clear: backward compatibility,” says Dan Dunn, vice president and general manager of Aerospace Defense Government Solutions at Keysight Technologies (Santa Rosa, California). “To move toward a noncompatible piece of equipment within the T&M system leads to expensive software modifications, requiring the whole system to be revalidated. A certain amount of adaptability is required of both the equipment supplier and the user. Gaining true backward compatibility may require a certain degree of customization within the new instrument.”

Certifying that modern military test systems can both evolve alongside the threat environment and reach back to keep legacy systems fielded is the overarching goal for the DoD and manufacturers alike. Innovative methodologies to maintain this backward compatibility in T&M systems are starting at the software level, with standardization also playing a pivotal role.

Standardization symbiosis

A widely adopted idea across military-technology manufacturers is that standardization is the key to building efficiency. Groups and consortia formed around that objective aim to use standardization to “reduce the cost of test through economy of scale in procurement, common training, and improved maintainability,” Dunn says. “Having a broad range of capability intrinsic to the T&M hardware enables a software-definability model where standardization and flexibility can coexist.”

There’s a potentially symbiotic relationship between the standardization of military assets and the standardization of test and measurement capabilities: As assets standardize, so too will test and evaluation requirements.

“From our perspective, we are seeing increasing standardization using a consolidated approach, such as what’s used in eCASS and VIPER/T [DoD test solutions],” says Tim Webb, director of U.S. defense sales at Astronics (East Aurora, New York). “We continue to implement strategies to allow for technology insertion as the next-generation approach. Standardization of capabilities would certainly bolster the efficiency of test equipment, especially for those with the ability to insert new technology and make adjustments for the next generation. These systems can help ensure seamless compatibility with any legacy equipment that the DoD wishes to maintain, especially existing TPSs.” (Figure 1)

[Figure 1 | Astronics’ ATS-6100 wire fault tester combines low-energy, high-voltage, and spread-spectrum time domain reflectometer technologies to locate both hard and soft faults in wiring and insulation.]

The methods through which respective companies choose to reach widespread compatibility of T&M systems, whether consolidation or standardization, all aim to proliferate capabilities across defense and military organizations. Moreover, the drive for test organizations to combat the challenges these new capabilities present has placed a premium on software and on developing more software-defined test systems overall.

Engineering T&M systems to be software-defined

Customers – and, no doubt, the DoD – see a lot of value in software-defined systems that can evolve or be customized over time to meet the requirements of new and old programs alike. Not only does it save money, but it also makes for quicker and more efficient technology refresh (Figure 2).

[Figure 2 | Keysight Technology’s M980xA PXIe vector network analyzer supports signal analysis capabilities with vector signal analysis (VSA) software interoperability.]

“That trend around incorporating advanced capabilities and specifically a trend around more autonomous and cognitive systems puts a real challenge on test,” Butler says. “And it puts a premium on software-connected test systems that can be customized and that can adapt over time to changing capabilities or changing test requirements. Our customers can’t go out and build an entirely new test system every time their capabilities increase or their requirements change.”

T&M systems and mission-critical software often converge: The software that’s running the aircraft subsystem or the software that’s helping drive the missile-navigation system, for example, has to be extensively tested, validated, and run through varying simulations to make sure it works properly in all operational scenarios. But the test system itself needs its own software to execute the test and evaluation in the first place, which can create security risks.

The solution used in the past to safeguard T&M data, Butler says, was to airgap the systems: physically isolating a computer or network from unsecured networks, such as the public internet or an unsecured local area network. However, the recent – and widely adopted – industry push toward digital transformation, which connects and networks systems across an enterprise with the aim of improving access to data, means that manufacturers can no longer airgap these systems. This new, networked reality ostensibly makes it easier for an adversary to access test data and therefore boosts the importance of encryption during both data transfer and storage.
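As a minimal illustration of that encrypt-before-storage principle, the sketch below uses Python and the open-source cryptography package; the file names and measurement payload are hypothetical examples, not any vendor’s actual implementation.

```python
# Minimal sketch: encrypting T&M results before they touch storage or a network.
# Assumes the third-party "cryptography" package (pip install cryptography).
# File names and the measurement payload are hypothetical.
import json
from cryptography.fernet import Fernet

# In practice the key would live in an HSM or key-management service,
# never on the same disk as the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

measurement = {"uut": "radar-lru-07", "test": "tx_power_dbm", "value": 47.2}
plaintext = json.dumps(measurement).encode("utf-8")

# Encrypt before the data ever reaches nonvolatile storage or the network.
token = cipher.encrypt(plaintext)
with open("results.enc", "wb") as f:
    f.write(token)

# Only an authorized holder of the key can later recover the measurement.
with open("results.enc", "rb") as f:
    restored = json.loads(cipher.decrypt(f.read()))
assert restored == measurement
```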

“Measurement data must be secured, and so should measurement setup information, user-specific calibration data, information about the end user, and equipment usage data,” Dunn says. “In a nonsecure use model, this data would be kept on nonvolatile memory (NVM) inside the instrument or transferred to a larger data repository on a server that may be cloud-based. However, for the secure environment, cloud-based data storage is not acceptable, and storage inside an instrument may be problematic, even if it resides in a secure lab. If the instrument must ever leave the lab for repair or calibration, the data must stay behind. So, the solution is that the equipment must have either a removable NVM for hosting sensitive information or an option to prevent a customer from writing anything sensitive to NVM.”

Organizations seem to agree that customers want control over what data is stored in volatile versus nonvolatile memory. Manufacturers are being asked to design test systems that let users choose what goes into nonvolatile storage while retaining the ability to wipe that storage completely before the test system is reused or transferred.
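A minimal sketch of what such a storage-policy layer might look like, assuming Python; the class and method names are hypothetical, and the single-pass overwrite is purely illustrative (real sanitization follows NIST SP 800-88 or program-specific procedures).

```python
# Minimal sketch of a storage policy that lets the operator decide what
# persists; names and the wipe method are hypothetical illustrations.
import os

class TestDataStore:
    def __init__(self, nvm_dir="nvm_store"):
        self.volatile = {}          # lost at power-off, nothing to sanitize
        self.nvm_dir = nvm_dir      # survives power cycles, must be wiped
        os.makedirs(nvm_dir, exist_ok=True)

    def save(self, name: str, data: bytes, persist: bool = False):
        """Operator chooses per record whether it may touch NVM."""
        if persist:
            with open(os.path.join(self.nvm_dir, name), "wb") as f:
                f.write(data)
        else:
            self.volatile[name] = data

    def sanitize_nvm(self):
        """Overwrite, then delete, every persisted record before the
        instrument is transferred, repaired, or reused."""
        for name in os.listdir(self.nvm_dir):
            path = os.path.join(self.nvm_dir, name)
            size = os.path.getsize(path)
            with open(path, "r+b") as f:
                f.write(os.urandom(size))  # single pass, illustrative only
            os.remove(path)

store = TestDataStore()
store.save("cal_offsets.bin", b"\x00\x01", persist=True)
store.save("classified_iq.bin", b"\x0f" * 64, persist=False)  # RAM only
store.sanitize_nvm()  # wipe before the test set leaves the lab
```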

Automating and digitizing test and evaluation

To achieve this level of control over what T&M data is being stored and where, an industrywide shift toward automation and digital engineering is underway. From preproduction to postproduction, test data needs to be a critical component of the emerging digital product thread.

“There is a transformation underway, thanks to the digitization of manufacturing, sometimes termed Industry 4.0,” Dunn says. “This enables customization of products under high-mix, low-volume production with improved efficiency. This type of production and test process is well-suited for military systems from radar to SATCOM. The move is away from highly integrated monolithic test systems to a more distributed approach for these types of systems under test (SUT) and increased use of robotics. Artificial intelligence (AI) and machine learning capabilities will play a large role in dramatically improving uptime and utilization of test assets while reducing overall manufacturing costs. Interaction between machines and T&M equipment enables prediction of failures and triggers rework processes autonomously, as well as the ability to react to unexpected changes in production.”

The proliferation of automation and predictive maintenance in test, monitoring, and manufacturing continues to emphasize the importance of software in T&M systems, because it is the algorithm that automates them (Figure 3). End users continue to seek out data-management, systems-management, and asset-monitoring systems that include predictive-maintenance algorithms, and these digital engineering initiatives are designed to simplify the acquisition process for them.

[Figure 3 | National Instruments’ VeriStand software is designed to configure real-time I/O, create plug-ins, import and simulate models, and automate real-time tests.]

“The U.S. Air Force has a lot of digital engineering initiatives and goals for the short term, and we’re working very closely with them to understand what role we can play in helping them set up digital twins and perform digital simulation, and we’re using model-based engineering to create a complete digital product thread of information about the asset or product they’re creating,” Butler says.

Programming autonomy into a test system is a welcome advance in a field where the push for software definability and digitization is so prevalent. A machine learning algorithm in a T&M system could in fact cut technology insertion costs, maintain backward compatibility, and limit workload for the warfighter.

“We believe autonomy will absolutely play a role in future test and measurement equipment as AI-powered capabilities become embedded in the systems more and more,” Webb says. “The better we get at monitoring and collecting meaningful data and predicting failures, the more reliable an end product will become.”

Featured Companies

U.S. Department of Defense (DoD)

1400 Defense Pentagon
Washington, DC 20301-1400

NI (National Instruments)

11500 N Mopac Expwy
Austin, TX 78759-3504

Keysight Technologies

1400 Fountaingrove Parkway
Santa Rosa, CA 95403

Astronics Test Systems

130 Commerce Way
East Aurora, NY 14052