Military Embedded Systems

Case study: LiDAR system provides helicopter pilots a clear line of sight in brownouts


June 24, 2008


Today's combat zones are inundated with threats - enemy fire, Improvised Explosive Devices (IEDs), and mines are a few of the imminent dangers each war fighter faces. One threat that has become a major focus in recent years is vision obstruction in a helicopter's landing zone, caused by brownouts and whiteouts. However, a new modified LiDAR system is providing 3D images to help increase pilots' situational awareness.

As the war on terrorism continues, helicopters play a crucial role in both combat and civilian missions, in everything from medical evacuations to crew transportation. Due to the threat of IEDs, helicopters are now becoming the preferred method of travel to and from mission coordinates. But in arid environments like Afghanistan, landing visibility is constantly obstructed by brownout conditions.

"Brownouts" occur when a helicopter takes off or lands on sand- or dust-covered sites, while snow-covered sites produce "whiteouts." The spin of the helicopter's rotors causes clouds of dust (or snow) particles to form in the air, obscuring the pilot's view and, consequently, situational awareness. When pilots do not have the visual cues they require to safely land the helicopter, the results can be fatal.

In recent years, the U.S. Army has recorded more than 40 accidents caused by brownout conditions at various training facilities within the U.S. Including in-theatre operations, the count rises to 230 cases of aircraft damage and/or injury since 1991. Between 2001 and 2007, 80 percent of these accidents happened during landing, while only 20 percent occurred on takeoff. Brownouts are costing the U.S. an estimated $100 million per year.[1]

Investigating solutions to this problem has become a high priority for the military, but until recently no definitive solution had been developed. Now a new modified LiDAR vision system is emerging: the Obscurant Penetrating Autosynchronous LiDAR, known simply as OPAL. Its early prototypes have been tested in a variety of environments and have proven robust and powerful.

OPAL's ability to penetrate brownouts and whiteouts has proven more effective than that of conventional LiDAR. When used in conjunction with an infrared camera and a terrain database in a system such as the Augmented Visionic System (AVS), it forms a powerful synthetic vision system for helicopter pilots - enabling a line of sight through brownout and whiteout conditions and ensuring safe takeoffs and landings.

OPAL versus conventional LiDAR

Several existing systems aim to improve helicopter pilot visibility. Although these sensors have some merit, they are inefficient in brownout and whiteout conditions. Infrared cameras, for example, are effective in poor-visibility conditions such as fog, but are limited in dust clouds holding particles of a size comparable to the camera's operational wavelength. Millimeter-Wave (MMW) radar, flash LADAR, and Range-Gated Cameras are likewise ineffective in brownout and whiteout conditions. Because of its longer wavelength, MMW radar penetrates deeply into a brownout or whiteout but has poor spatial resolution, failing to clearly define the target image. The flash LADAR and the Range-Gated Camera work together to create a full Field-Of-View (FOV) in one laser shot pulse. Although this provides high resolution, it cannot penetrate deep inside aerosol clouds because the light source in these devices must be spread across the FOV for each pulse.

Similarly, while various combinations of Infrared (IR) cameras with synthetic terrain databases deal effectively with many environmental conditions and pollutants, they are not always efficient in brownouts and whiteouts. Akin to driving in thick fog with high beams, when dust clouds contain particles of a size comparable to the camera's operational wavelength, the vision system cannot penetrate the aerosol cloud to determine whether any objects lie within it.

Many active sensors, such as LiDARs, can penetrate further into brownouts and whiteouts than passive sensors. Emitting their own energy, LiDARs use laser shot pulses to gather data. Using Time-Of-Flight (TOF) - that is, measuring the time a pulse takes to travel to and from the objects/targets in its path - the sensor assembles its data into a 3D image or model of a flightpath or landing site.
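The TOF principle reduces to a one-line calculation: range is half the round-trip time multiplied by the speed of light. The following minimal sketch (not OPAL's implementation, just the underlying arithmetic) illustrates it:

```python
# Time-of-flight (TOF) ranging: range = (speed of light * round-trip time) / 2
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_s: float) -> float:
    """Convert a round-trip pulse time in seconds to a one-way range in meters."""
    return C * round_trip_s / 2.0

# A return pulse arriving about 667 ns after emission corresponds to roughly 100 m.
print(tof_range(667e-9))
```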

Conventional LiDARs are triggered by the rising edge of the first return pulse. When the target's return is buried in the aerosol echo rather than arriving as a separate pulse, such a LiDAR can only report the range of the nearest aerosol in a brownout or whiteout. The OPAL was, therefore, developed specifically to address this problem.
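The contrast can be sketched with a toy return waveform. OPAL's actual pulse-discrimination scheme is proprietary and not described in the article; the illustration below only shows why rising-edge (first-return) triggering reports the dust cloud, while selecting a later return can recover a hard target behind it. The threshold value and peak list are assumptions for illustration:

```python
# Illustrative contrast between first-return triggering (conventional LiDAR)
# and last-return selection, which favors a hard target behind an aerosol.
# Peaks are (range_m, amplitude) pairs sorted by range.

def first_return(peaks, threshold=0.2):
    """Range of the first peak above threshold - a rising-edge trigger."""
    for rng, amp in peaks:
        if amp >= threshold:
            return rng
    return None

def last_return(peaks, threshold=0.2):
    """Range of the last peak above threshold - the farthest detectable echo."""
    hits = [rng for rng, amp in peaks if amp >= threshold]
    return hits[-1] if hits else None

# Simulated return: a diffuse aerosol echo at 15 m, a hard target at 100 m.
peaks = [(15.0, 0.6), (100.0, 0.4)]
print(first_return(peaks))  # the dust cloud
print(last_return(peaks))   # the target behind it
```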

How the system works

OPAL offers a higher signal-to-noise ratio than conventional LiDAR, resulting in a higher probability of detection and/or a greater range of depth capability. Its bistatic optical design provides robust results, and a proprietary design enables OPAL to scan full FOV and acquire 3D data quickly - something traditional active sensors using bistatic designs have been incapable of. Using OPAL with an IR camera and a terrain database in a system like CAE's AVS, pilots gain a powerful synthetic vision system.

The terrain database must be pre-populated using data collected by the AVS system on an initial scan, data purchased from a cartography organization, or a combination of both. With the ability to map a highly detailed, accurate representation of the helicopter's landing zone prior to and during descent into a brownout or whiteout, the system presents a stable heads-up/heads-down view of the world around the aircraft for optimal situational awareness during hover, landing, and takeoff.

This is critical as the terrain database creates an accurate geo-specific representation of the world, and the information gathered by OPAL and the IR camera is compared to the information contained within the database. Quite simply, the scanned areas are overlaid onto the synthetic world and changes are quickly and easily detected.
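The overlay-and-compare step can be approximated as a height comparison on a grid: cells where the scanned height deviates from the stored terrain height beyond a tolerance are flagged as changes. The grid layout, tolerance, and data shapes below are assumptions for illustration, not the AVS implementation:

```python
# Hypothetical change detection against a terrain database: flag grid cells
# where the scanned height differs from the stored height by more than a
# tolerance (e.g. a new obstacle in the landing zone).

def detect_changes(terrain, scan, tolerance=0.5):
    """Return grid cells whose scanned height deviates from the database."""
    changed = []
    for cell, stored_h in terrain.items():
        scanned_h = scan.get(cell)
        if scanned_h is not None and abs(scanned_h - stored_h) > tolerance:
            changed.append(cell)
    return changed

terrain = {(0, 0): 1.0, (0, 1): 1.1, (1, 0): 0.9}   # stored heights, m
scan    = {(0, 0): 1.0, (0, 1): 3.0, (1, 0): 0.95}  # scanned heights, m
print(detect_changes(terrain, scan))  # the obstacle at cell (0, 1)
```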

This type of synthetic vision system provides the pilot with complete perception of the physical/geographic environment. When combined with a helmet mounted vision system in a heads-down display, the pilot gains an almost infinite, instantaneous field of regard, independent of the current geographic conditions.

Testing for helicopter systems

Initial tests using the Aerosol Research Corridor at Defence Research and Development Canada (DRDC) in Valcartier, Quebec, involved comparing OPAL's performance to that of a variety of passive sensors including the human eye (visible camera) and an IR camera.

The aerosol chamber, a long, narrow building, hosted visible and IR targets. The visible target was a board with black and white stripes for the visible camera to focus on; the IR target was a frame with heated bars. The targets were placed at the back of the chamber, and the doors were shut to disperse the aerosol within the chamber. The visible camera, IR camera, OPAL, and transmissometer were located about 100 meters away from the chamber. When the doors opened, the sensors began gathering data within 0.5 seconds and continued until the aerosol cloud became too thin to yield further measurements.

A detection factor - the defined parameter that is the ratio of aerosol density at the moment of target detection by the OPAL or IR camera to the aerosol density at the moment of target detection by the visible camera - was used to compare OPAL's performance to the passive sensors.
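The detection factor is a simple ratio; the sketch below just restates the article's definition as arithmetic, with illustrative density values (the article does not give raw densities):

```python
# Detection factor as defined in the article: the ratio of aerosol density
# at the moment the sensor under test (OPAL or IR camera) detects the target
# to the density at the moment the visible camera detects it.
# A factor > 1 means the sensor sees through denser aerosol than the eye.

def detection_factor(sensor_density: float, visible_density: float) -> float:
    return sensor_density / visible_density

# Illustrative values: if OPAL detects the target while the cloud density is
# 6.6 units but the visible camera needs it to thin to 1.0, the factor is 6.6.
print(detection_factor(6.6, 1.0))
```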

The OPAL penetrated farther than the eye and the IR camera under sand, dust, and fog aerosol conditions. It saw through 50-micrometer sand dust 4 times denser than what the eye can penetrate, through 6-micrometer dust 6.6 times denser, and through fog more than 7.6 times denser. This is attributed to OPAL's timing discrimination, which reduces the scattering effect. Overall, the results demonstrated that the OPAL detects obstacles in brownout conditions more effectively than the other options tested.

To further evaluate the system prototype under regular and whiteout conditions, a test flight was carried out in February 2007 at Crash Lake, just north of Ottawa, Canada. The AVS system was mounted under the National Research Council of Canada's Bell 412 helicopter, which then flew to a landing site consisting of an open field. The center of the field featured an area with bushes and rocks, while the field was bordered by trees on one side and a flat frozen lake on the other. The system scanned the landing site under different helicopter maneuvers: a push-broom scan during fly-by or approach, and a raster scan mode while hovering.

To create a whiteout condition, the helicopter hovered close to the ground, generating snow clouds with its rotor motion. In this test, the OPAL, developed by Neptec Design Group, and the AVS provided outstanding results. The ground and trees behind the snow clouds were clearly visible to pilots, as shown in Figure 1. The bottom part of Figure 1 shows the side view of the 3D OPAL image under whiteout conditions.

Figure 1




The results of the push-broom scan clearly illustrate the 3D landscape of the fly-by path and the trees and obstacles on the landing site (Figure 2). This demonstrated the OPAL's ability to mechanically sustain vibration during flight while integrated as the active sensor of the AVS system. It also showed that the OPAL data and the helicopter navigation data can be successfully fused to generate geo-referenced 3D images.
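Geo-referencing amounts to transforming each sensor-frame point by the aircraft's navigation solution. A full implementation would use the complete attitude (roll, pitch, yaw) plus lever-arm offsets between the sensor and the navigation unit; the 2D, yaw-only sketch below is an assumption-laden illustration of the principle, not the AVS fusion pipeline:

```python
# Hedged sketch of geo-referencing: rotate sensor-frame (x, y) points by the
# aircraft heading and translate by its position to place them in a world frame.
import math

def georeference(points, pos, yaw_rad):
    """Transform sensor-frame points into the world frame (2D, yaw only)."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [(pos[0] + c * x - s * y, pos[1] + s * x + c * y) for x, y in points]

# A point 10 m ahead of the sensor, aircraft at (100, 200) heading 90 degrees:
print(georeference([(10.0, 0.0)], (100.0, 200.0), math.pi / 2))
```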

Figure 2




Gaining a clearer perspective

Developed explicitly to combat the detrimental effects of brownout and whiteout conditions, the OPAL/AVS system serves a very specific application. OPAL's unique capabilities give pilots a clear view of what is in their landing zone. That visibility and situational awareness are critical during landing and takeoff, preventing accidents that can sometimes prove deadly.

Maureen Campbell is the technical marketing specialist at Neptec Design Group. With more than 10 years of experience in the technology industry, she works with the research and development team at Neptec. Maureen has been with the company for three years. She can be reached at [email protected].


