Warfighters harnessing today's mobile graphics technology: The impossible dream or nearly reality?
September 23, 2009
The exploding ubiquity of personal computing devices such as the iPhone or the Sony PSP gives a glimpse into the computing power consumers can now hold in their hands. These devices provide mapping, communication, video processing, and network data retrieval and allow connection and collaboration in ways never before imagined. So why has the military COTS community been so slow to harness these mobile technologies for the warfighter?
No technology evolution in the past 10 years has shaped the way we interact with computers and information more than mobile multimedia platforms. Consider the average well-equipped consumer in the United States. In their car is a navigation map showing route guidance, their surroundings, local gas stations, and, if they are lucky, live traffic information. In their hand is an even more remarkable device: a mobile multimedia device such as an iPhone or Sony PSP. In addition to offering the same mapping capabilities, this device might play 3D games with more graphical complexity than the average military map system, communicate both text and voice, play videos, and access the Internet.
This mobile graphics revolution has resulted from various factors, and consumers (and to some degree, businesses) have benefited enormously. However, warfighters, especially in the U.S. military, have not seen this radical transformation in their computing equipment and capabilities. In many ways, the warfighter’s computing architecture and methods are several years behind the times. The warfighter is still apt to be carrying the equivalent of a shrunken desktop computer just to view maps, access videos, or operate an unmanned platform. It is therefore worth examining the hardware and software technologies behind the mobile multimedia revolution, why our warfighters are not benefiting as they should, and how the military COTS industry can improve.
Mobile graphics – The hardware
The key to the performance of mobile multimedia devices is matching the processor type to the task. Just as important is providing a well-defined software environment and a strategy that leverages it. These processor types include:
- General purpose processors, optimized for sequential program execution. In the mobile processing world, these are typically low-power ARM RISC processing devices. These devices might be employed in scalable configurations.
- Dedicated graphics processors, optimized to provide rendering and media processing for standards-based media formats. These include 3D graphics or video data conforming to standards such as MPEG4 or the Khronos Group’s (www.khronos.org) OpenGL ES for embedded systems. Mobile graphics accelerators can be employed as general purpose pixel processors, through the use of pixel and vertex shaders and languages such as OpenCL. Graphics processors are useful for such tasks as UI display, digital mapping, and data fusion. Some common mobile graphics accelerators used in mobile technology include Imagination Technologies’ (www.imgtec.com) PowerVR MBX and SGX series, NVIDIA’s Tegra and mobile GoForce/GeForce GPUs, AMD’s Imageon, and many others.
- Pixel-programmable processors such as FPGAs, massively multicore processing chips, and DSPs. These are optimized to process large amounts of data in parallel and are ideal for special-purpose tracking and image-processing algorithms, communications, and the rapid data I/O required by systems processing visual media.
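What makes these workloads such a good fit for GPUs and multicore DSPs is their data-parallel structure: a per-pixel kernel that depends only on its own input can be computed for every pixel independently, so the work spreads naturally across many small cores. A purely illustrative Python sketch of that model (the kernel, fixed-point weights, and tiny image are examples, not any particular system's code):

```python
# Illustrative sketch of the per-pixel, data-parallel model that GPU
# fragment shaders and multicore DSPs exploit. Each output pixel
# depends only on its own input pixel, so every pixel could be
# computed independently (and, on real hardware, in parallel).

def luminance(rgb):
    """Fixed-point BT.601-style luma from an (r, g, b) tuple of 0-255 values."""
    r, g, b = rgb
    return (299 * r + 587 * g + 114 * b) // 1000

def shade(image, kernel):
    """Apply a per-pixel kernel to a 2D image (a list of rows of pixels)."""
    return [[kernel(px) for px in row] for row in image]

# A tiny 2x2 "image" of RGB pixels
image = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]
gray = shade(image, luminance)  # grayscale version of the image
```

On an actual mobile GPU the same kernel would be expressed as a fragment shader (or an OpenCL kernel) and executed across thousands of pixels per clock; the sequential loop here only stands in for that hardware parallelism.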
In mobile devices, these components are combined into System-on-Chip (SoC) or System-on-Module (SoM) architectures that further reduce power and footprint. Hardware design often includes IP blocks for assembly within these systems. Several SoC architectures are available that provide very good mobile processor and graphics combinations for building mobile devices, and these technologies can also be used in military computing systems such as handheld UAV controllers, handheld mobile battlefield communication and shared situation computers, and onboard avionics or vetronics display systems. Examples of common SoC architectures that offer new capabilities in interesting packages are Freescale’s MPC5121e, which combines a PowerPC core with a PowerVR GPU, and NVIDIA’s ION platform, which pairs an Intel Atom processor with a low-power GeForce GPU.
Mobile software development
When Apple released its iPhone SDK to the software development public in 2008, nothing less than a new software subindustry was born. Micro companies formed to build games and applications for the device. Even technology with potential relevance to the warfighter, such as mobile geospatial applications and handheld simulators, has been developed and deployed to consumer and automotive marketplaces. In exchange for more universal acceptance of its devices (and a piece of the action), Apple unleashed the capabilities of its mobile media platform, offering a well-supported application framework, developer resources, and support. Apple's model deserves careful scrutiny by the military software development marketplace (Figure 1).
Figure 1: Laminar Research’s X-Plane iPod Edition (pictured) highlights mobile graphics technology with potential warfighter relevance.
Mapping capabilities are another area where mobile software development techniques offer advantages for potential military use. Mobile platforms stress vector-based graphics and are therefore able to achieve high compression ratios and to utilize data sources such as street-level data from TeleAtlas. Also, mobile navigation platforms have adopted certain standards for symbols and data interchange between applications, enabling geotagging and related forms of entertainment (Figure 2).
Figure 2: Vector-based graphics, such as those utilized in Quantum3D’s Mobile IFE with 1 Meter Imagery on NVIDIA ION, could prove useful in warfighter venues.
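One reason vector map data compresses so well is that geometry is highly local: successive vertices of a road or boundary polyline lie close together, so storing small deltas between neighbors takes far fewer bits than storing absolute coordinates. A hedged Python sketch of that core idea (illustrative only; real vector formats add variable-length integer packing and other refinements on top):

```python
# Illustrative delta encoding for a map polyline. Successive vertices
# are near each other, so the deltas are small integers that pack far
# more compactly than full absolute coordinates.

def delta_encode(points):
    """Store the first point absolutely and each later point as a delta."""
    if not points:
        return []
    out = [points[0]]
    for prev, cur in zip(points, points[1:]):
        out.append((cur[0] - prev[0], cur[1] - prev[1]))
    return out

def delta_decode(encoded):
    """Recover the original points from a delta-encoded list."""
    if not encoded:
        return []
    pts = [encoded[0]]
    for dx, dy in encoded[1:]:
        x, y = pts[-1]
        pts.append((x + dx, y + dy))
    return pts

# Street-segment vertices in integer micro-degrees (made-up example data)
road = [(40123456, -111654321), (40123460, -111654300), (40123470, -111654280)]
encoded = delta_encode(road)
assert delta_decode(encoded) == road  # encoding is lossless
```

After the first vertex, every entry is a tiny delta such as (4, 21), which is exactly the property that variable-length encodings exploit to shrink street-level datasets.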
The military software development landscape
However, the military software industry has been slow to adopt the methods of the mobile multimedia industry. Software development for military platforms is oriented primarily toward desktop computing platforms or their embedded variants. As an example, at a recent industry meeting, the government distributed software requirements for a handheld UAV control platform that had to support its chosen mapping toolkit. The requirement called for processor and graphics performance benchmarks that could be met only with the equivalent of a dual-core desktop processor, requirements derived from the digital mapping engine the government had chosen for the effort.
Many military software programs suffer from this kind of software-driven inflexibility. In a world where hardware changes constantly, the software, oddly enough, has become much more resistant to change. This is understandable in some ways, since software development accounts for such a large percentage of the cost of major weapons systems. Unfortunately, however, software techniques that might be used to leverage mobile multimedia technology for the warfighter are not commonly employed in the development of military systems. But this condition need not remain unchanged.
Closing the gap – Addressing the development challenge
To close this gap, the military, its contractors, and the COTS embedded community that serves it can begin to do business differently in several areas:
- Software development methods can be changed. Software toolkits can be offered that leverage mobile technology. Why not a mobile multimedia military SDK, similar to the iPhone SDK, designed from the ground up to take advantage of mobile graphics technology? Such a toolkit could put a set of reusable software in developers’ hands to enable the rapid prototyping and development of applications designed to run on a new class of mobile multimedia-aware devices.
- Field devices should be based on mobile multimedia technology. Why not SoC-based avionics displays in military vehicles? The technology is certainly capable of doing the job, but too often a risk-averse military development community ignores it when specifying hardware solutions. Similarly, why not use more mobile multimedia platforms for ruggedized handheld display systems and wearable computers? Too often, these systems are just shrunken desktops, which as a rule do a poor job of matching processor watts to pixel throughput for digital-media applications, resulting in both poor performance and excessive power requirements.
- Develop mobile multimedia technology the same way the military uses its Science and Technology (S&T) development mechanisms for other new technologies. Consider, for instance, how the military has accelerated UAV development through novel mechanisms: SBIRs, small companies, the DARPA Grand Challenge, and wartime deployment of new technologies. The same strategy could be employed to bring mobile capabilities to warfighters.
- The military community should change the way it thinks about processors. Too often, processors are specified based on raw processor performance alone, with no consideration of overall system performance. A 10 W SoC running a PowerPC or ARM core and a mobile GPU core might be an excellent choice for an embedded display system. But unless the spec allows bidding such a solution, it will not be considered, and what might have been a 5 W solution will end up consuming 30 W to drive the same display.
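The power argument in the last point can be made concrete with simple arithmetic. Taking the 5 W and 30 W figures above and an assumed battery capacity (the 150 Wh figure is a hypothetical value for illustration, not a fielded specification), the difference in runtime is stark:

```python
# Back-of-envelope comparison of display-system runtime on a fixed
# battery. The 5 W and 30 W loads are the article's illustrative
# figures; the 150 Wh battery capacity is an assumed value.

def runtime_hours(battery_wh, load_w):
    """Hours of operation for a given constant load on a given battery."""
    return battery_wh / load_w

BATTERY_WH = 150.0                               # assumed battery capacity
soc_hours = runtime_hours(BATTERY_WH, 5.0)       # mobile SoC solution
desktop_hours = runtime_hours(BATTERY_WH, 30.0)  # shrunken-desktop solution

# Under these assumptions the SoC solution runs 30 hours to the
# desktop-derived solution's 5: six times longer on the same battery.
```

For a dismounted warfighter, that factor of six translates directly into fewer batteries carried or longer missions between resupply, which is exactly why overall system power, not raw processor performance, belongs in the spec.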
In the end, military adoption of new technology such as low-power graphics is about changing the way military COTS vendors do business. But the changes are not radical. Having the power of mobile pixel processing onboard will enable the soldier of the future to sense more, interpret more, and share more information than ever before. The ability to process information on the fly, reason from it, and collaborate while using it will be key to fighting the new battles we are sure to face. The military COTS industry must harness low-power graphics technology to get there, but little new technology development is required: The proof that these warfighter-ready technologies work is already in our hands – and in our vehicles.
Mark Snyder is a director of embedded applications with Quantum3D, Inc. Prior to joining Quantum3D, he was an engineer at Honeywell International. He also spent nine years as an Air Force officer, where he was involved in 3D virtual simulation and visualization research at the Air Force Research Lab and engineered C4ISR systems at Air Force Space Command. He is inventor on several patents in the avionics flight deck display arena and holds a BS in Computer Science from Arizona State University and an MS in Computer Science from the Air Force Institute of Technology. He can be contacted at [email protected].
Quantum3D 602-336-2443 www.quantum3d.com