Military Embedded Systems

Trial by fire: Training for the ISR flight

Story

October 10, 2012

Justin Snider

General Atomics Aeronautical Systems, Inc.

Part math and part art form, sensor planning should not be confused with the standard flight or mission plan.

BASE X – 0200hrs ZULU – 2008: “Your mission plan looks horrible,” barked Carlos. “Why on Earth would you place the third flight line coming in from the east when your pilot will exit the second flight line on the west side? It’s all about getting the aircraft on the line in an efficient manner that makes life easy for the pilot without sacrificing your depression angle! If you do, your imagery is going to be toast.”

Having been in Iraq only a few days, I was struggling to find my rhythm in sensor planning for a Synthetic Aperture Radar (SAR) aboard a Predator-class Unmanned Aircraft System (UAS), flying counter-Improvised Explosive Device (IED) missions in support of a U.S. Army task force. My many years of Intelligence, Surveillance, and Reconnaissance (ISR) experience were doing nothing for me on this cold evening at BASE X inside a small Ground Control Station (GCS) with a two-person, young Army UAS crew (pilot and sensor operator) and my reluctant instructor, Carlos, who reminded me, “It’s an art form, man! Get out of the seat and let me show you some magic.”

Few understand that the cornerstone of a good imagery collect, especially for Coherent Change Detection (CCD), is precision sensor mission planning. Part math and part art form, sensor planning should not be confused with the standard flight or mission plan. Sensor planning is focused on ensuring that the imaging sensor and aircraft are at the proper point in the sky and looking in the proper direction, all at the right time, to give the best chance of a good imagery collect. On paper it sounds easy, but throw in poor weather conditions, closed kill-boxes, a poorly constructed target deck from an overworked collection manager, and constantly changing altitude stacks, and an easy day just went out the window. Thankfully, a new sensor control and exploitation software system takes care of figuring out aperture timing and other radar-specific complexities. And, when combined with a hands-on multisensor ISR simulator called the Tactical Airborne Reconnaissance Simulator (TARS), simulation of UAS flight paths and waypoints based on sensor collection parameters is reduced to a (much easier) science.
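Carlos’s point about not sacrificing the depression angle is the “math” half of that art form: for a given collection altitude, how far the flight line sits from the target fixes the angle at which the radar looks down at the scene. A minimal sketch of that check, in Python with purely illustrative numbers (the altitude, standoff range, and 20-degree floor are assumptions for this example, not Lynx radar limits), might look like this:

import math

def depression_angle_deg(altitude_agl_m: float, ground_range_m: float) -> float:
    """Angle below the horizontal from the aircraft down to the target."""
    return math.degrees(math.atan2(altitude_agl_m, ground_range_m))

def max_standoff_m(altitude_agl_m: float, min_depression_deg: float) -> float:
    """Farthest ground range at which the minimum depression angle still holds."""
    return altitude_agl_m / math.tan(math.radians(min_depression_deg))

alt = 6_000.0        # assumed collection altitude AGL, meters
standoff = 18_000.0  # assumed ground range from flight line to target, meters
print(f"Depression angle: {depression_angle_deg(alt, standoff):.1f} deg")
print(f"Max standoff for a 20-deg floor: {max_standoff_m(alt, 20.0):,.0f} m")

Place a flight line even a few kilometers too far out and the angle drops below the floor, which is exactly the “your imagery is going to be toast” failure Carlos was warning about.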

We need something to train with

How is one trained in the art of sensor mission planning? My training was done either on a live flight aboard our company-owned Twin Otter aircraft in San Diego at the tail end of a production radar burn-in event, or at my desk running a version of Claw, which is the GA-ASI-developed sensor control and exploitation software mentioned earlier. The problem at the time was that even if I built a mission on the ground, there was no way to run the plan with a live sensor to see if it would work and actually provide valid, usable imagery. Time, availability, and cost were, of course, prohibitive factors in allowing our entire sensor operator cadre to spend instruction time either airborne or in a GCS linked up to a UAS.

Thankfully, while I was deployed to Iraq running our system in combat, a team of software engineers back in San Diego was busy working with SDS and SAIC to develop TARS, an inexpensive, hands-on multisensor ISR simulator. TARS was born out of the need to train our deployed operator crews on a realistic simulator that could accurately emulate the conditions and flight parameters seen in a GCS during a typical ISR mission flown in theater by the Army or Air Force (Figure 1).

 

Figure 1: A sensor operator trains on the TARS system.


The beauty of TARS is that a student or seasoned operator can practice with a highly realistic simulation without having to step on an airplane or find a GCS hooked up to a UAS (Figure 2). TARS provides three basic software components loaded onto a COTS workstation: 1) the Remotely Operated Vehicle Adaptable Training/Tracking Systems (ROVATTS), provided by SDS and controlled by a Thrustmaster joystick, which supplies the background UAS flight track simulation and simulated Electro-Optical/Infrared (EO/IR) Full Motion Video (FMV); 2) GA-ASI’s Claw system, which provides sensor mission planning, command, control, visualization, and exploitation for the FMV and Lynx Multi-mode Radar payloads; and 3) RADSIM, furnished by SAIC. Utilizing a proprietary process, RADSIM turns overhead imagery into high-fidelity synthetic SAR imagery products that include terrain features, texture, and radar shadow, all consistent with real-world SAR imagery. Various SAR resolution levels are available to the student, enabling the student to practice using low-resolution SAR for a wide-area search followed by high-resolution imagery to refine specific targets of interest.
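One way to picture how those pieces relate at run time: each simulation tick, the flight-track simulation supplies aircraft state, the radar simulation turns that state plus any pending collection request into a synthetic SAR product, and the control and exploitation software puts the result in front of the student. The Python sketch below is purely illustrative; every class and function name in it is invented for this example and is not the ROVATTS, Claw, or RADSIM API:

from dataclasses import dataclass

@dataclass
class AircraftState:
    """Simplified aircraft state of the kind a flight-track simulation emits each tick."""
    east_m: float
    north_m: float
    alt_agl_m: float

@dataclass
class SarRequest:
    """One tasked image: where to look and at what resolution."""
    target_en: tuple       # (east, north) of the target in local meters
    resolution_m: float

def synthetic_sar_product(state: AircraftState, request: SarRequest) -> str:
    """Stand-in for the radar simulation: returns a text 'product' instead of imagery."""
    return (f"Synthetic SAR, {request.resolution_m:.1f} m resolution, "
            f"target {request.target_en}, collected from "
            f"({state.east_m:.0f} E, {state.north_m:.0f} N, {state.alt_agl_m:.0f} m AGL)")

def run_tick(state, pending):
    """One pass of the loop: fire the simulated aperture for each pending request
    and hand the products to the exploitation display."""
    return [synthetic_sar_product(state, r) for r in pending]

print(run_tick(AircraftState(-12_000.0, 2_000.0, 7_500.0),
               [SarRequest((4_000.0, 2_000.0), 0.3)]))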

 

Figure 2: Simulated SAR and EO imagery


Bringing it all together – A typical training scenario

A typical training scenario entails the student receiving a mock target deck that details a specific ISR request. These targets are applied to the map in the sensor control and exploitation software to ensure the student can make sense of the request. The student double-checks that the target points match up to roads, specific buildings, areas, and points of interest, just as is done in the real world. Once satisfied with the tasked targets, the student builds the sensor mission plan for the SAR collect in the sensor control and exploitation software portion of TARS. The student creates imaging points and paths and then provides specific numerical inputs for the SAR sensor geometry, to include desired depression and squint angles, collection altitude, and SAR image type and resolution. The sensor control and exploitation software will then take the sensor collection parameters and autogenerate the flight path and waypoints for the aircraft to fly. The instructor now places the student-generated flight waypoints into RADSIM, and the simulation is started.
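As a rough illustration of what that autogeneration step has to solve, the sketch below places a straight flight line for a single target from the same kinds of inputs the student enters: depression angle, squint angle, and collection altitude. It assumes a flat earth and local east/north coordinates, and the function is a hypothetical stand-in, not the actual Claw planning logic:

import math

def flight_line_for_target(target_en, look_azimuth_deg, depression_deg,
                           squint_deg, altitude_agl_m, line_length_m):
    """Return (start, end) flight-line waypoints in local east/north meters."""
    # Ground standoff range that yields the requested depression angle.
    ground_range = altitude_agl_m / math.tan(math.radians(depression_deg))
    az = math.radians(look_azimuth_deg)
    # Center of the flight line: step back from the target along the look direction.
    center_e = target_en[0] - ground_range * math.sin(az)
    center_n = target_en[1] - ground_range * math.cos(az)
    # Heading chosen so the angle between the track and the line of sight
    # equals the requested squint (90 degrees is a broadside collection).
    heading = math.radians(look_azimuth_deg - squint_deg)
    half = line_length_m / 2.0
    start = (center_e - half * math.sin(heading), center_n - half * math.cos(heading))
    end = (center_e + half * math.sin(heading), center_n + half * math.cos(heading))
    return start, end

# Image a target with a 25-degree depression angle, broadside, from 7,500 m AGL.
print(flight_line_for_target(target_en=(4_000.0, 2_000.0), look_azimuth_deg=90.0,
                             depression_deg=25.0, squint_deg=90.0,
                             altitude_agl_m=7_500.0, line_length_m=10_000.0))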

The simulated aircraft moves along the flight path generated by the student’s inputs, and the SAR payload is automatically triggered at specific points along the path to image the ground targets. The student sees indications in the sensor control and exploitation software that the SAR aperture is firing, as well as image processing and formation signals. The artificial SAR images populate on the map and in the exploitation window in the representative time it would take during an actual mission. The student can practice first-phase analysis of the imagery and identify any items of interest. During aircraft repositioning, the student might want to utilize Ground Moving Target Indicator (GMTI) – a sensor capability found on numerous manned and unmanned aircraft throughout the DoD – in an attempt to identify moving ground targets. TARS includes a host of moving vehicle targets, including trucks and tanks, which the student attempts to locate by setting the simulated radar to “GMTI mode” and searching large areas of ground in a sweeping arc pattern, looking for the telltale “dots” on the screen that identify a moving target. If a moving target is found using GMTI, the student can cross-cue the FMV sensor to the identified target for further target identification and tracking.
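The cross-cue step itself is pointing geometry: take the mover’s ground position reported by the GMTI hit and the aircraft’s current position, and compute the azimuth and depression angle at which to slew the FMV ball. The flat-earth sketch below is illustrative only; the coordinate conventions and names are assumptions for this example, not the actual TARS or Claw interfaces:

import math

def cross_cue(aircraft_east_m, aircraft_north_m, aircraft_alt_agl_m,
              mover_east_m, mover_north_m):
    """Return (azimuth_deg, depression_deg, slant_range_m) from aircraft to mover."""
    de = mover_east_m - aircraft_east_m
    dn = mover_north_m - aircraft_north_m
    ground_range = math.hypot(de, dn)
    azimuth = math.degrees(math.atan2(de, dn)) % 360.0   # 0 degrees = north
    depression = math.degrees(math.atan2(aircraft_alt_agl_m, ground_range))
    slant_range = math.hypot(ground_range, aircraft_alt_agl_m)
    return azimuth, depression, slant_range

# A mover detected 9 km northeast of an aircraft flying at 6,100 m AGL.
print(cross_cue(0.0, 0.0, 6_100.0, 6_400.0, 6_400.0))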

The ISR simulation beat goes on

In a nondescript office building in the San Diego area sits GA-ASI’s Test Operations Center, a working mockup of a typical operations center that can be found overseas, complete with numerous large LCD monitors displaying maps and various examples of sensor imagery. Here, the training process described is repeated daily. A student sensor operator is handed a slip of paper with 15 ground targets. He has 30 minutes to interpret the ISR requirement and translate it into a workable solution that will enable a UAS to collect from the critical intelligence points properly. TARS allows the student to run this drill to perfection before heading overseas to do the same thing, deployed in theater with soldiers on the ground who are counting on timely, relevant intelligence data.

Vince Lombardi said it best: “Practice does not make perfect. Only perfect practice makes perfect.” With ISR operations, second chances are few and far between. TARS ensures that those who are charged with actually employing a sensor will be ready for the real thing.

Justin Snider is a Program Manager at General Atomics Aeronautical Systems, Inc.’s Reconnaissance Systems Group (RSG). He directs contracts providing manpower and technical support to various customers; he also manages the RSG Tactics and Training Department. Justin serves as a USAF Reserve Intelligence Officer and has deployed on multiple tours to Iraq and Afghanistan.

General Atomics Aeronautical Systems, Inc. (GA-ASI) www.ga-asi.com

 
