Leveraging AI for testing military cognitive systems
September 06, 2023
Designers of test and measurement systems for military applications such as radar and electronic warfare (EW) are stepping up to use artificial intelligence (AI) solutions to enable better testing of cognitive functionality. Meanwhile, modern digital architecture adoption is driving military test requirements.
Artificial intelligence (AI) and machine learning (ML) tools are finding their way into nearly every area of defense systems, from manufacturing, radar system development, and avionics to software development and test and measurement systems.
“Not only is AI impacting the capability of test systems themselves, AI is also having an impact on how we test,” says Jeremy Twaits, Solutions Marketing Manager, Research & Prototyping for Aerospace, Defense & Government, at NI (Austin, Texas). “AI is making systems more adaptable, with behaviors changing based on training data sets. With AI, engineers must know the bounds of the system’s performance and use test methods that meet the most critical and likely scenarios the system may encounter when deployed.”
AI tools can also enable cognitive functionality in electronic warfare systems. At Rohde & Schwarz, the company is “helping customers deliver AI/ML-enabled systems by equipping them with tools to deliver high-bandwidth, long-duration RF record and playback systems that are used to train cognitive systems in operationally relevant RF environments,” says Tim Fountain, Global Market Segment Manager, radar and EW [electronic warfare], Aerospace & Defense Market Segment, Rohde & Schwarz (Columbia, Maryland).
“Also, cognitive systems are being used to extract and classify novel emitters in wideband captured data from ELINT [electronics intelligence] receivers,” he continues. “One challenge that our customers repeatedly tell us about is that they have no shortage of data from collect activities, but labeling, classifying, sorting, and geolocating those signals is still a manual task that is often overlooked by analysts due to time and budgetary pressures.”
The amount of data demanded by military users is only increasing, putting more pressure on system designers and on those who test those systems.
“With the advancement of high-speed capture technology, the amount of data we’re able to collect is increasing at an exponential rate,” says Greg Patschke, General Manager of Keysight’s Aerospace/Defense and Government Solutions Group (Santa Clara, California). “These large data sets bring the challenge of analyzing the information and formulating results. Currently we are using unsupervised machine learning tools to accelerate the path to insight. We can use intelligent algorithms to identify signals of interest, classify and categorize information, and recognize patterns and anomalies in the data. Utilizing this technology opens the door to a whole new world of data analysis that previously was not feasible.”
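The unsupervised triage Patschke describes can be sketched with a toy example. The snippet below is not Keysight's algorithm; it is a minimal, self-contained illustration of label-free anomaly flagging in a captured IQ stream, using a robust energy statistic to surface windows that contain a signal of interest. All parameters (window size, threshold, the injected pulse) are invented for the demonstration.

```python
import numpy as np

def flag_anomalous_windows(iq, win=256, z_thresh=4.0):
    """Flag capture windows whose energy deviates strongly from the median.

    A toy stand-in for unsupervised signals-of-interest triage:
    no labels or training, just statistics of the capture itself.
    """
    n = len(iq) // win
    energy = np.array([np.sum(np.abs(iq[i*win:(i+1)*win])**2) for i in range(n)])
    med = np.median(energy)
    mad = np.median(np.abs(energy - med)) + 1e-12   # robust spread estimate
    z = 0.6745 * (energy - med) / mad               # robust z-score
    return np.flatnonzero(z > z_thresh)

# Synthetic capture: complex noise with a strong pulse buried in it
rng = np.random.default_rng(0)
capture = (rng.standard_normal(65536) + 1j*rng.standard_normal(65536)) * 0.1
t = np.arange(1024)
capture[25600:26624] += np.exp(2j*np.pi*0.05*t)   # pulse spans windows 100-103

print(flag_anomalous_windows(capture))
```

In a real workflow the flagged windows would then feed classification and geolocation stages; here the point is only that the interesting windows fall out without any labeled training data.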
Defining scenarios for testing while also enabling adaptability through AI systems will be critical due to the complexity of the systems.
“It is near impossible to test in every possible scenario, but the industry must define critical scenarios and models to test,” Twaits notes. “Due to the dynamic nature and challenge of truly testing and trusting an AI system, a test platform must have adaptability built in to address future test scenarios and requirements. As an example, COTS [commercial off-the-shelf] hardware from NI can link with software tools from MathWorks, such as Deep Learning Toolbox. NI and MathWorks have collaboratively demonstrated how software-defined radios (SDRs) can be utilized for over-the-air testing and assessment of trained neural networks for classifying radar and 5G New Radio signals.”
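The over-the-air assessment loop Twaits describes (generate waveforms, feed them to a trained classifier, score the results) can be sketched in miniature. This is not the NI/MathWorks toolchain; the "classifier" below is a crude duty-cycle rule standing in for a trained neural network, and the two waveform generators are invented stand-ins for radar and comms emissions, kept simple so the test harness itself is the focus.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_pulsed(n=4096, duty=0.1, noise_amp=0.05):
    """Radar-like burst: a tone on for duty*n samples, noise elsewhere."""
    x = (rng.standard_normal(n) + 1j*rng.standard_normal(n)) * noise_amp
    on = int(n * duty)
    x[:on] += np.exp(2j*np.pi*0.1*np.arange(on))
    return x

def make_comms(n=4096, noise_amp=0.05):
    """Comms-like signal: constant-envelope random-phase symbols."""
    x = np.exp(1j*rng.uniform(0, 2*np.pi, n))
    return x + (rng.standard_normal(n) + 1j*rng.standard_normal(n)) * noise_amp

def classify(x, rel=0.5):
    """Crude rule standing in for a trained classifier:
    pulsed radar is mostly 'off', comms waveforms are mostly 'on'."""
    env = np.abs(x)
    return "radar" if np.mean(env > rel*env.max()) < 0.5 else "comms"

# The test harness: stimulate the classifier with known truth, score it
correct = sum(classify(make_pulsed()) == "radar" for _ in range(50))
correct += sum(classify(make_comms()) == "comms" for _ in range(50))
print(f"{correct}/100 correct")
```

The structure, not the classifier, is the point: an SDR-based test rig plays the same role as the generator functions here, driving known scenarios into the trained system and scoring its decisions against ground truth.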
Defining test functionality in software
Contributing to the use of AI in test solutions is the capability to place test and measurement system functionality in software.
“In the test and measurement industry there is a constant need to improve the functionality of measurement software,” Patschke says. “The specialized nature of EW testing often requires a level of software innovation and flexibility that we typically don’t see in other industries. For instance, angle of arrival (AOA) testing in relation to radar/EW requires the seamless pairing of both software and hardware to appropriately apply real time kinematics and accurately calculate AOA results.
“A few years ago, this functionality did not exist in [testing] software, but as customer requirements and demands have changed, companies like Keysight have adapted to meet these needs,” he continues. “Customers require systems that possess the flexibility to meet their needs as new challenges surface. The only way to meet these needs at the rate in which they arise is to continue to upgrade our software as much as we possibly can with new capabilities, so that hardware can be continuously repurposed for multiple uses.”
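As a toy illustration of the kind of computation behind AOA testing, consider the textbook two-element phase interferometer: the angle of arrival follows from the phase difference measured between two antennas of known spacing. This is a generic approach, not Keysight's implementation, and the frequency and spacing below are made-up example values.

```python
import numpy as np

def aoa_from_phase(delta_phi, d, wavelength):
    """Two-element interferometer: angle of arrival (degrees) from the
    phase difference delta_phi (radians) between antennas spaced d apart.
    sin(theta) = wavelength * delta_phi / (2*pi*d)."""
    s = wavelength * delta_phi / (2 * np.pi * d)
    return np.degrees(np.arcsin(np.clip(s, -1.0, 1.0)))

# Example: 10 GHz emitter at 30 degrees, half-wavelength antenna spacing
wl = 0.03            # wavelength in meters at 10 GHz
d = wl / 2           # lambda/2 spacing avoids ambiguity
true_deg = 30.0
dphi = 2*np.pi * d * np.sin(np.radians(true_deg)) / wl   # simulated measurement
print(round(float(aoa_from_phase(dphi, d, wl)), 1))      # → 30.0
```

Real AOA test systems layer calibration, multipath, and platform kinematics on top of this relationship, which is where the tight software/hardware pairing the article describes comes in.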
Demand for standardization and fast turnarounds is also requiring more software functionality.
“The most pressing issue that customers tell R&S is that they need quick, verifiable, and repeatable measurements, often based around a standard,” Fountain says. “Customers often do not have the time or in-house expertise to develop that specific measurement, so may rely on a vendor to have that measurement available as an add-on, or in some cases to use de facto industry toolsets such as MATLAB and/or Simulink to support quick s/w functionality, especially as FPGAs [field-programmable gate arrays] and GPUs [graphics-processing units] become more prevalent in the measurement data flow.” (Figure 1.)
[Figure 1 ǀ Rohde & Schwarz offers the integrated record, analysis, and playback system (IRAPS). IRAPS can be used for both laboratory and range applications that require wide-bandwidth, long-duration RF record and playback, such as radar test and range electronic warfare (EW) effectiveness evaluation.]
Defining test systems in software is “exemplified by the trend across the aerospace industry often called model-based system engineering,” says Haydn Nelson, Business Development Manager for Radar/EW, NI. “The push for standardization of system-level models and requirements makes software integral to the definition of automated test systems.
“For radar and EW, this is challenging due to the multimission nature of radar and the secretive nature of EW,” Nelson continues. “Defining, developing, evaluating, and deploying new methods and techniques is a complex process. As threats evolve, users want new systems faster, and test and evaluation processes cannot stand in the way. Software-defined test systems are critical to matching the required pace while preserving the sensitivity of system capabilities and performance.”
Demand for more laboratory testing is also driving software-defined test systems: “A specific request we see is to be able to test more in the lab in realistic ways, without the challenge of fixed and locked test systems,” Nelson says. “The more one can test before open-air-range testing, the higher the confidence that a new method or technique will be trusted by the end users. Sharing data and proving capabilities is as important as developing the capability itself.”
Complex adversarial threats across multiple domains are driving the performance requirements of radar and EW systems, thus putting more pressure on test-system designers to deliver accurate and efficient solutions.
“In general, the trend is towards ever-higher measurement accuracy and lower phase noise,” Fountain says. “Accuracy and phase noise directly relate to the ability to characterize the performance of the radar. On the EW side, we are seeing a move towards higher-fidelity simulation of highly complex electromagnetic scenarios, driven by congested and contested operational environments.”
The digital architecture requirements and modernization efforts of radar and EW systems also demand multifunctionality in test systems.
“At a high level, the requirements for test and evaluation are driven by the adoption of modern digital architectures that require functional, parametric, and system-level test in a single system, as well as ways to segment digital and RF systems for independent test,” says NI’s Twaits. “Additionally, many legacy radar and EW systems are undergoing modernization initiatives for which legacy test platforms are too inflexible to meet the test requirements for new system capabilities. Modernization doesn’t bring infinite budgets for test. New systems and upgrades are constantly balancing the constraints imposed by budget and time-delivery pressures, and being adaptable to changing requirements can, in itself, be a requirement.”
Bandwidth demands also push the limits of testing systems. “Technologically, in the domain of electromagnetic spectrum operations (EMSO), fielded systems are stretching towards wider bandwidths, higher frequencies, greater frequency agility, and more resilience against threats. Accordingly, [test and measurement] equipment must be able to generate and analyze waveforms with appropriate specifications, tune quickly, and create realistic scenarios that stress the unit under test in near-real operating conditions.”
Test systems also reduce long-term life cycle costs by finding flaws early in the design process before a system is deployed.
“A key aspect of delivering on time and on budget is having test strategies that can identify bugs early in the design process,” Twaits says. “Open-air range testing is costly, and not feasible or practical for testing early designs. In radar testing, for example, customers are looking for hardware-in-the-loop systems that allow realistic targets to be injected into the radar system under test. This allows them to test the system early and often, eliminating issues sooner and assessing the radar against a broad range of scenarios.”
NI offers the radar target generation (RTG) software, which enables customers to operate PXI RF vector signal transceivers (VSTs) as a closed-loop real-time radar target generator. It provides engineers with a single module that can act as a standard radar parametric measurement device and as an RTG that is highly capable and open to end-user adaptation. With a fully open list mode, users can define as many as 10 million test targets to sequence at hardware speed, to stimulate radars in ways that would be impossible to accomplish on an open-air range.
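The physics a radar target generator must reproduce reduces, per target, to a round-trip delay set by range and a Doppler shift set by radial velocity. The sketch below illustrates those two relationships only; it is not NI's RTG implementation, and the scenario values (15 km, 250 m/s, X-band) are invented for the example.

```python
C = 299_792_458.0  # speed of light, m/s

def target_params(range_m, radial_vel_mps, f_carrier_hz):
    """Round-trip delay and Doppler shift a target generator must apply
    to the intercepted pulse before re-transmitting it to the radar.

    delay   = 2R/c        (two-way propagation)
    doppler = 2*v*f/c     (two-way Doppler for radial velocity v)
    """
    delay_s = 2.0 * range_m / C
    doppler_hz = 2.0 * radial_vel_mps * f_carrier_hz / C
    return delay_s, doppler_hz

# Hypothetical target: 15 km out, closing at 250 m/s, 10 GHz radar
delay, doppler = target_params(15_000.0, 250.0, 10e9)
print(f"{delay*1e6:.2f} us delay, {doppler/1e3:.2f} kHz Doppler")
```

A hardware-in-the-loop generator applies these per-pulse, in real time, for each entry in the target list, which is why sequencing millions of targets at hardware speed matters.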
EW systems are about detecting and countering complex adversarial threats, and the test systems are about enabling the warfighter to leverage these systems not only efficiently but also safely.
“At its core, EW testing is all about keeping troops safe by ensuring both personnel and equipment are prepared to handle a vast array of electromagnetic threats,” Patschke notes. As the EW testing environment becomes increasingly advanced, customers need to generate the most realistic simulations possible. This can only be achieved by generating high-fidelity, dynamic scenarios that emulate real-world conditions. In the past, this required a large amount of equipment that often lacked any versatility in use. Customers now expect their equipment to not only demonstrate a higher level of capability, such as wider bandwidth and more output ports, but also to offer more flexibility in a more compact size. Keysight has addressed these expectations with a scalable, open-architecture EW test and evaluation portfolio including the newest M9484C vector signal generator. (Figure 2.)
[Figure 2 ǀ The M9484C vector signal generator from Keysight is a four-port source that can also produce pulse-on-pulse outputs. This single signal generator can replace four older signal sources.]
Fountain notes as a final comment on trends that “there is a desire to move from open-air range test to closed labs, mainly due to the complexity matrix of open-air tests, the costs and the possibility that RF emissions from tests may be intercepted by unwelcome listeners.”
Open architectures/MOSA initiatives
Fountain says he doesn’t see much activity for these initiatives at the test and measurement level. “There are some niche applications for measurement systems at the operational level, where the benefits and added costs of modular architectures such as MOSA [modular open systems approach] and SOSA [Sensor Open Systems Architecture] would be applicable, but in most cases the test and measurement equipment is in a lab, where a controlled environment is desired to provide a high degree of measurement accuracy.”
“In many ways, standard architectures like SOSA are taking a very similar philosophy in embedded design as NI took to test and measurement design with the modular PXI platform: to make systems that are modular, flexible, and interoperable,” Nelson says. “These three goals of modular open architectures are key to the success of future military embedded systems, allowing systems to be designed today and cost effectively upgraded tomorrow. The NI approach to test and measurement mirrors this goal. Having a modular, scalable, flexible, and upgradable embedded system means that test systems must also be modular, scalable, flexible, and upgradable to match changing requirements, capability, and interfaces. We believe that modular test systems that mirror the modular direction of open architecture initiatives will help deliver on the promise of this new embedded system philosophy.”
“Customers who invest in new products want to ensure that their legacy equipment and systems can function in tandem with upgraded platforms,” Keysight’s Patschke says. “This is not only a cost-saving measure but also one that reduces waste by extending the life of older products, while keeping the entire system up to date. Open architecture platforms make sustainability a priority without sacrificing the ability to upgrade down the line. Keysight has placed a premium on open architecture implementation when designing next generation systems.”
AI and software-defined test systems are paving the way for more capability today and for tomorrow’s radar and EW test systems in areas such as software-defined radar, spectrum sharing, digital twins, and more.
“One way the future of system testing for DoD [U.S. Department of Defense] customers could potentially evolve is through advancements in digital-twin technology,” Patschke says. “These systems leverage model-based systems engineering (MBSE) methodology to generate digitally realistic testing scenarios that often account for outside variables that prior virtual testing methods could not. The ‘digital-twin’ concept could in theory convert most, if not all, physical system engineering activities into virtual ones. Digital twins have the potential to add extensive value in situations where staging physical tests isn’t practical, and real-world effects are difficult to reproduce. As customers look for more reliable and cost-effective means of testing, the digital-twin option may become more attractive.”
Fountain says there are four key areas driving test and measurement technology over the next few years:
- Spectrum sharing: Spectrum bands are being redeployed to commercial applications such as CBRS [Citizens Broadband Radio Service] wireless networks, which leads to a requirement for more complete and precise coexistence testing.
- Software-defined radar: The move from analog pulsed radar to fully digitally modulated radar, where every pulse can be modulated, is already enabling communications between the radar and cooperative assets. And it’s not just radar and comms: EW, both EP [electronic protection] and EA [electronic attack], and [military communications] are being integrated into a single platform.
- Quantum sensing and quantum radar are still at the very early stage, but if these technologies can be made to work “in the field” they will change the very fabric of how conflicts are conducted.
- Moving from traditional pulse descriptor word (PDW)-based environment generation to a higher-fidelity in-phase/quadrature (IQ)-based system, which is driving a demand for higher-bandwidth RF generation capability.
Demand for flexibility and multifunctionality in radar and EW systems is also becoming a characteristic of test and measurement requirements.
“We’ve seen many requirements that are demanding test systems to be like a Swiss Army knife: customers want test equipment to do everything in a single system,” Nelson says.
“We often get requests to configure systems that can do parametric test while performing system-level tests such as radar target generation, combined with the ability to conduct RF record and playback. These combined requirements make it hard to do this in a cost-effective manner while maintaining an acceptable size, weight, and power. This is only possible with modular systems that balance closed, specific functionality with the ability to extend capabilities with open software. The trend we are seeing is that modern test systems must be as multifunctional as the systems they are testing.”