Complex threats require complex & tested solutions
September 13, 2021
Increased U.S. Department of Defense (DoD) funding is targeting radar and electronic warfare (EW) systems that can counter complex threats such as electronic counter-countermeasures and keep the U.S. ahead of its adversaries. This demand is also driving innovation in the military test and measurement community, pushing for more sophistication and efficiency in the test and evaluation process. Meanwhile, cognitive EW applications are inspiring new test designs.
The steady advance of electronic counter-countermeasures (ECCM) technologies places immense pressure on electronic countermeasures (ECM) to keep pace – or even a few steps ahead – in order to remain effective at protecting military assets.
Such complexity is fueling increased U.S. Department of Defense (DoD) funding for electronic warfare (EW) and radar systems to detect and counter these threats while deploying them to the battlefield and the warfighter more quickly. The latter demand places pressure on test and measurement system designers to look for innovative methods to enable efficient testing earlier in the design process.
Growth in the DoD’s budget for EW “appears to be contributing to a buoyant market for test and measurement,” says Jeremy Twaits, solutions marketer, Aerospace, Defense, Government Business Unit, for National Instruments (Austin, Texas).
“A Frost & Sullivan analysis earlier this year forecasts the DoD electronic warfare market will reach $3.6 billion by 2025, up from $3.17 billion in 2021. Research, development, test, and evaluation is the largest area of spending, which also signals significant opportunity for the test and measurement market,” he continues. “There’s increased pressure on primes to rapidly develop systems that don’t require multiple iterations in acceptance testing. This has pushed many to do more parallel development, so more testing is performed earlier within the design process. As new government contracts move away from cost-plus with longer delivery times, we’re seeing more cost pressure and more openness toward new flexible COTS [commercial off-the-shelf] solutions.”
ECM validation is “a high-priority item in the DoD budget,” says Darren McCarthy, technical marketing manager for Rohde & Schwarz America (Columbia, Maryland). “Bringing more capability out of the chamber or field test and making this available through COTS on the bench helps to reduce cost of test and enables rapid technology insertion.”
Simple reflections generated by a typical ECM with digital radio-frequency memory (DRFM) must now account for the realistic multipath reflective nature of the target being protected – such as radar cross-section returns from an airplane's cockpit, wings, and tail arriving at different time offsets.
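The idea of multiple time-offset returns can be sketched as summing delayed, scaled copies of a captured pulse. The following Python sketch is purely illustrative – the function name and parameters are my own, not any vendor's DRFM implementation:

```python
import numpy as np

def drfm_multipath_return(pulse, delays_s, gains, fs):
    """Sum delayed, scaled copies of a captured pulse to emulate returns
    from distinct scattering centers (e.g. cockpit, wings, tail) at
    different time offsets. Delays are rounded to the nearest sample
    for simplicity; a real DRFM would use fractional-delay filtering."""
    max_delay = int(round(max(delays_s) * fs))
    out = np.zeros(len(pulse) + max_delay, dtype=complex)
    for delay_s, gain in zip(delays_s, gains):
        n = int(round(delay_s * fs))
        out[n:n + len(pulse)] += gain * pulse
    return out

# Example: three scattering centers at 0, 50, and 120 ns, 100 MHz sample rate
fs = 100e6
pulse = np.exp(1j * np.pi * np.linspace(0, 1, 200) ** 2)  # simple chirp
echo = drfm_multipath_return(pulse, delays_s=[0.0, 50e-9, 120e-9],
                             gains=[1.0, 0.6, 0.3], fs=fs)
```

Each scattering center contributes one delayed copy, so the composite echo is longer than the original pulse by the largest delay.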
“Rotary blade modulation (RBM) and jet engine modulation (JEM) are effects that an ECCM might have the ability to detect, and if the ECM does not properly simulate the specific techniques that the ECCM can distinguish, the ECM will become ineffective,” McCarthy says. (Figure 1.)
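JEM can be roughly approximated as a periodic modulation imposed on the skin return, producing sidebands at multiples of the blade-passing frequency. A simplified amplitude-only model follows; real JEM depends on blade geometry and aspect angle, and all names here are illustrative:

```python
import numpy as np

def apply_jem(skin_return, fs, chop_hz, depth=0.3, harmonics=3):
    """Impose a jet-engine-modulation-like effect on a radar skin return:
    rotating compressor blades chop the return periodically, producing
    sidebands at multiples of the blade-passing frequency (chop_hz).
    Modeled as a sum-of-harmonics amplitude modulation with decreasing
    harmonic weight; this is a first-order sketch, not a blade model."""
    t = np.arange(len(skin_return)) / fs
    mod = 1.0 + sum(depth / (k + 1) * np.cos(2 * np.pi * (k + 1) * chop_hz * t)
                    for k in range(harmonics))
    return skin_return * mod
```

An ECCM looking for JEM sidebands around the skin line would see energy at ±chop_hz, ±2·chop_hz, and so on; an ECM that omits them presents a suspiciously clean return.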
[Figure 1 | The SMW200A Vector Signal Generator from Rohde & Schwarz can be equipped with a maximum of two internal baseband modules and four fading simulator modules. This concept yields two full-featured vector signal generators in a single unit.]
Full environmental simulators aren’t cheap – they tend to cost more than $10 million. “The cost of test time and access to the restricted areas these might operate within are all considerations driving the test and evaluation (T&E) of ECM validation,” McCarthy adds.
One emerging trend – driven by limited access to flight systems or fully functional radar environment simulators – is customers requesting more capability and sophistication in ECM test and evaluation to reduce test cost and speed up assessment.
“The test and evaluation of ECM is a continuous and rapidly evolving life cycle when new capabilities are discovered or suspected within the ECCMs,” McCarthy says. “With the increasing capabilities of COTS test and measurement equipment, users can now rely on COTS equipment in the integral parts of ECM validation.”
To work within the ECM validation life cycle, “they want COTS to have an open application programming interface (API) that can tie into their national database for threats – pulse descriptor words (PDW) and IQDW,” McCarthy adds. “They need access to unencrypted data and mass storage of large datasets. Finally, they need to operate with hardware-in-the-loop for both open-loop and closed-loop scenarios.”
National Instruments customers working on early-stage assessment of novel radar and EW capability are increasingly requesting better integration with modeling and simulation tools. “This not only supports digital transformation initiatives, with intellectual property (IP) being shared throughout the development cycle, but also accelerates the process of transitioning a new idea from the whiteboard to a proof of concept,” says Twaits.
These power users also want a greater degree of out-of-the-box functionality for software-defined radio (SDR) devices for faster testbed prototyping. “Our customers want to reduce the time spent working on elements of their testbed infrastructure, such as data movement interfaces and synchronization, so they can instead spend their time honing their waveforms or algorithms,” Twaits notes.
Defining more functionality within software
As radar and EW systems become increasingly software-defined, test and measurement solutions become more complex.
National Instruments gets requests for solutions that enable real-time simulation of radar targets with real-world environment conditions such as clutter, interference, and jamming: “Increasingly, customers need software-defined scalable test systems that can be upgraded as threats and operating scenarios evolve,” Twaits says. “Being able to maintain ownership and control of their test approach is especially important where the waveforms used or threats simulated are based on confidential IP.”
“Providing basic scenarios such as antenna scan patterns, Doppler, multi-emitter, and moving emitter simulations are some of the basic building blocks now offered in COTS test equipment,” McCarthy says. “This allows benchtop validation of the electronic warfare systems to ‘test as they fly.’ Additional functionality, such as simulating GPS constellations in the same platform, can also provide test validation of position, navigation, and timing systems associated with the EW devices.”
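A circular antenna scan pattern, for instance, amounts to a periodic amplitude envelope on the pulse train a receiver sees. A minimal sketch assuming a Gaussian beam shape – a common first-order model; the function and parameters are hypothetical, not drawn from any product:

```python
import numpy as np

def circular_scan_amplitude(pulse_times_s, scan_period_s, beamwidth_frac=0.05):
    """Amplitude envelope a stationary receiver sees from a circularly
    scanning emitter: a Gaussian beam shape repeating once per scan
    period. beamwidth_frac is the 1-sigma beam dwell expressed as a
    fraction of the scan period; the beam center crosses the receiver
    at phase 0.5 of each rotation."""
    phase = (np.asarray(pulse_times_s, dtype=float) % scan_period_s) / scan_period_s
    return np.exp(-0.5 * ((phase - 0.5) / beamwidth_frac) ** 2)

# Pulses arriving at beam center (t = 0.5 s) vs. far off-boresight (t = 0.0 s)
amp = circular_scan_amplitude([0.5, 0.0], scan_period_s=1.0)
```

Doppler, multi-emitter, and moving-emitter scenarios layer further frequency shifts and per-emitter envelopes on top of the same principle.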
A low-latency, closed-loop system configuration of the COTS, such as a radar echo generator configuration, “can validate the functional radar performance for things like moving-target indicator (MTI) sensitivity and minimum-detectable signal (MDS),” McCarthy adds.
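MDS itself follows directly from the thermal noise floor – a relationship worth making concrete. This is the standard kTB calculation, not a vendor-specific procedure; the bandwidth, noise figure, and required SNR below are example values:

```python
import math

def minimum_detectable_signal_dbm(bandwidth_hz, noise_figure_db, snr_required_db):
    """Minimum-detectable signal (MDS): the thermal noise floor kTB
    (-174 dBm/Hz at 290 K) integrated over the receiver bandwidth,
    plus receiver noise figure, plus the SNR needed for detection."""
    ktb_dbm = -174 + 10 * math.log10(bandwidth_hz)
    return ktb_dbm + noise_figure_db + snr_required_db

# Example: 1 MHz bandwidth, 5 dB noise figure, 12 dB detection SNR
mds = minimum_detectable_signal_dbm(1e6, 5, 12)
```

A closed-loop echo generator verifies that the radar actually detects targets injected at levels just above this figure.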
The industry is constantly evaluating how to take advantage of improvements or advanced processing capability, and Twaits thinks more of the system should be defined within a flexible software infrastructure. “Traditional electronic warfare test systems can’t model a threat in a true closed-loop way, so software models will need to change significantly, which will impact how we do system test and the hardware that can be used,” he says.
Low-latency, closed-loop requirements are driving more software capability that will reside within test hardware (FPGA), and “while traditionally this has been an expensive test hurdle, we continue to invest in new methods to move ‘test digital signal processing’ (DSP) from general-purpose processors (GPPs) to more capable test processing hardware,” Twaits continues. “We also expect emerging digital engineering trends to produce significant improvements to the entire test process. Obviously, the majority of the market isn’t at that point yet, but we believe this will drastically affect how software, specifically test, is executed.” (Figure 2.)
[Figure 2 | PXI-based, modular RF instruments from National Instruments provide scalability as new EW testing requirements emerge.]
Radar and EW systems that can be upgraded as new threats emerge – whether through software updates or machine-learned behavior – are more complex to test, Twaits points out. “And as assessing machine-learning/artificial intelligence (ML/AI) systems’ software ‘testing’ significantly increases, it to a large degree becomes part of algorithm development,” he notes.
Open architecture trends
Demand for more functionality in software and more data sharing typically trends toward more open architecture designs. Open architectures like PXI Express provide several benefits to engineers testing radar and EW systems – including flexibility and scalability, built-in capability, and IP ownership.
As far as flexibility and scalability go, “if adoption of new defense assets is slow, it eases the burden of both developing counter systems and test capability,” Twaits says. “Traditional test systems could be open-loop because adversarial systems had much more static behavior, but as new threats with artificial intelligence emerge at a much faster pace it becomes vital for the test system to adapt quickly with appropriate test capability. In a large majority of cases, it’s cumbersome or even impossible to have COTS vendors implement the necessary test capability. Because of this, open architectures are required to develop the appropriate test in a timely manner.”
In terms of built-in capability, “open architectures typically provide infrastructure benefits that save the engineer from spending time and effort building it themselves,” he explains. “For example, PXI Express uses PCI Express on the backplane for high-throughput data movement, which is critical for shuttling wide bandwidth I/Q data from RF I/O to processing nodes. It also provides triggering and synchronization capability across multiple modules and chassis to allow phase coherence across a radar array.”
Moreover, defense contractors “often need to maintain control of their test systems, not ceding ownership to T&M [test and measurement] vendors, especially where classified waveforms or scenarios are involved,” Twaits says. “Open software-defined architectures allow the organization to build on COTS hardware, while keeping control of confidential IP.”
Open architectures on the embedded side – such as SOSA [Sensor Open Systems Architecture] and FACE [Future Airborne Capability Environment] – “simplify system upgrades by defining common hardware and messaging interfaces,” he notes. “This fixed interface simplifies the integration of new processing elements and allows solutions to include a mix of vendor elements focusing on best of breed, rather than best of what the system can accommodate. In the world of test and measurement, this is a blessing and a curse. Having a fixed interface simplifies the testing infrastructure required to characterize a system, but that ability to rapidly upgrade can present challenges because required test scenarios may need to evolve with new hardware elements. To meet this challenge of a common infrastructure with ever-evolving capabilities, a flexible software-driven test infrastructure is needed.”
One of the biggest limitations of open architectures “is the compromise on the ability to get leading-edge performance,” McCarthy says. “Today, some open architectures have standardized on 10 Gb IQ streaming data rates, while advanced COTS can already provide 40 Gb IQ streaming data rates and 1 GHz of bandwidth. The standards driving high-speed interfaces are not driven by the test and measurement community or the radar/electronic warfare test community, but rather the cloud computing and data center markets.”
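The bandwidth figures McCarthy cites line up with simple arithmetic: streaming 1 GHz of bandwidth as complex I/Q samples at a typical 1.25× oversampling ratio with 16-bit components requires 40 Gb/s. The oversampling ratio and sample width below are common choices for illustration, not values quoted in the article:

```python
def iq_stream_rate_gbps(bandwidth_hz, bits_per_component=16, oversample=1.25):
    """Sustained data rate needed to stream complex I/Q samples:
    sample rate (bandwidth x oversampling ratio) x 2 components
    (I and Q) x bits per component, expressed in Gb/s."""
    sample_rate = bandwidth_hz * oversample
    return sample_rate * 2 * bits_per_component / 1e9

rate = iq_stream_rate_gbps(1e9)  # 1 GHz of bandwidth -> 40.0 Gb/s
```

A 10 Gb/s interface, by the same arithmetic, caps out at roughly 250 MHz of instantaneous bandwidth under these assumptions.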