Communications framework eases legacy stovepipe system integration
April 29, 2011
Warfighters need data to be fused into useful information that provides integrated situational awareness. Accordingly, a communications framework can ease legacy system integration woes.
The DoD has overseen a rapid expansion in the use of next-generation technologies during the past decade, driven by two active wars and the rapid evolution of electronics and computing capabilities. However, the inability of these technologies to integrate and communicate with existing legacy environments is eroding the competitive advantage they provide. Data acquisition systems such as persistent Intelligence, Surveillance, and Reconnaissance (ISR) platforms and unmanned ground, aerial, surface, and underwater vehicles have highlighted the problem. What is needed is a multi-layer communications framework that adapts to new technologies and simplifies legacy system integration.
“One standard? Yes, if it’s mine!”
Complex ISR, unmanned vehicle, and other rapidly fielded capabilities now play a major role in every military scenario. The United States built its technological advantage largely on the diversity and competitiveness of its supplier base. But now one of its greatest strengths has become its greatest liability. Systems provided by a broad base of suppliers use very different technologies. The costs and risks of integrating vastly different technologies are now escalating as innovation and military needs rapidly evolve.
The traditional approach to integration mandates a single standard to ensure that a given set of systems can interoperate. This approach, however, assumes that such collections of systems can themselves be isolated. That is no longer true: interaction between different services, and indeed between different countries in coalition warfare, is now routine. Standards also change, and often; once a system is deployed, it typically cannot support or use future versions of the standard. Another problem, often underestimated, is that standards tend to represent a lowest common denominator and lag innovation. Vendors cannot easily adopt a single standard that fails to support the specialty functions and closely held IP that differentiate each unmanned vehicle, ISR, or rapidly fielded capability.
An ideal scenario would allow all hardware and software, whether legacy or new, to leverage whichever standards they each require while operating within a Systems of Systems (SoS) framework economically, effectively, and sustainably. This would require independently produced systems to cooperate without significant custom modification. A mechanism is needed to integrate unrelated hardware and software components into one effective SoS, sharing information between highly disparate assets. Rather than enforce a single standard, it would embrace multiple, domain specific, incompatible standards.
Products have been available for a while that enable independently developed technologies to share information through a standards-based messaging framework. Each of these products has its own requirements governing which set of standards should be used, along with methods for conversion to those standards. In this approach, legacy and new systems exchange information in a neutral format that both can consume. But what if standards conversions could instead be point to point and managed systematically, rather than converting every format into a neutral one?
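As a hypothetical sketch of the distinction, a point-to-point approach registers direct converters between specific format pairs rather than routing everything through one neutral representation. All names in the following are illustrative and do not refer to any real product API:

```python
# Hypothetical sketch: point-to-point format conversion instead of a
# neutral hub format. All names are illustrative, not a real product API.

converters = {}  # (src_format, dst_format) -> conversion function

def register(src, dst, fn):
    """Register a direct converter from one message format to another."""
    converters[(src, dst)] = fn

def convert(msg, src, dst):
    """Convert a message point to point; no neutral intermediate format."""
    if src == dst:
        return msg
    return converters[(src, dst)](msg)

# Two incompatible "standards" for reporting altitude: a legacy system
# reports feet, a newer system expects meters.
register("legacy-feet", "new-meters",
         lambda m: {"alt_m": m["alt_ft"] * 0.3048})
register("new-meters", "legacy-feet",
         lambda m: {"alt_ft": m["alt_m"] / 0.3048})

report = convert({"alt_ft": 1000.0}, "legacy-feet", "new-meters")
print(report)  # altitude converted to ~304.8 meters
```

Each new format pair only needs a converter where an actual information flow exists, which is what makes systematic management of the conversions practical.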
A highly adaptable approach utilizes dynamically loadable modules to implement support for each layer of technology in the operating environment – supporting each of the specific hardware, operating systems, programming languages, and network protocols to deal with specific conversion requirements. Semantic data definition is also separated in this environment to support multiple formats.
Decomposition of these four layers of the operating environment permits abstraction of implementation specifics for the system designer. It also simplifies the work of the application developer as software code can be automatically generated by developer tools independent from the data definition.
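One way to picture this decomposition is a framework core that selects transport plug-ins by name while the semantic data definition lives in a separate, declarative description from which encoding code could be generated. This is only an illustrative sketch under assumed names, not the actual API of any framework described here:

```python
# Illustrative sketch of layer decomposition via loadable modules.
# The transport plug-ins and the data definition are separate concerns;
# none of these names come from a real framework.
import struct

class Transport:
    """Abstract network-protocol layer; each plug-in implements send()."""
    def send(self, payload: bytes) -> str:
        raise NotImplementedError

class UdpTransport(Transport):
    def send(self, payload):
        return f"udp:{len(payload)} bytes"

class SerialTransport(Transport):
    def send(self, payload):
        return f"serial:{len(payload)} bytes"

# Registry stands in for true dynamic module loading (dlopen, importlib).
TRANSPORTS = {"udp": UdpTransport, "serial": SerialTransport}

# Semantic data definition, kept separate from any transport concern.
# Developer tools could generate the encode/decode code from this.
TRACK_DEF = [("track_id", "u16"), ("lat_e7", "i32"), ("lon_e7", "i32")]

def encode(definition, values):
    """Generated-style encoder: fixed-width big-endian fields."""
    fmt = ">" + "".join({"u16": "H", "i32": "i"}[t] for _, t in definition)
    return struct.pack(fmt, *(values[name] for name, _ in definition))

track = {"track_id": 7, "lat_e7": 491234567, "lon_e7": -1231234567}
payload = encode(TRACK_DEF, track)
link = TRANSPORTS["udp"]()
print(link.send(payload))  # udp:10 bytes
```

Because the encoder is derived from the data definition and the transport is chosen from the registry, swapping either layer requires no change to application code, which is the practical payoff of the abstraction.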
Info exchange or info integration?
A communications framework with supporting development and management tools can simplify the way information is exchanged between components of a distributed architecture. This is achieved through a combination of two key innovations. First, the information distribution layer uses a structured, user-definable, open-definition information model of each component's interface, which links the system together and provides a conversion mechanism between disparate system components. Second, a distribution abstraction approach enables the distribution layer to operate across heterogeneous environments via a plug-in assembly that is entirely separate from the data type definitions in the information model. This approach is utilized by Spark's Distrix, which uses an object-oriented model for distributed programming, with data distribution performed through an underlying publish/subscribe mechanism. However, Distrix does not force developers, particularly those writing procedural code for embedded systems, to use object-oriented concepts. In stark contrast to traditional integration and migration approaches, the user defines each system's standards in isolation; semantic conflicts are handled separately, and the transport is automatically translated for use at each endpoint. This approach has been proven by the Office of the Secretary of Defense to significantly reduce time to deployment.
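The publish/subscribe pattern underlying such a distribution layer can be sketched in a few lines. This is a generic illustration of the pattern, not Distrix code:

```python
# Generic topic-based publish/subscribe sketch; illustrative only.
from collections import defaultdict

class Bus:
    """Minimal pub/sub bus: publishers and subscribers never reference
    each other directly, only the shared topic name."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, data):
        for cb in self._subs[topic]:
            cb(data)

bus = Bus()
received = []

# A legacy consumer and a new consumer subscribe to the same topic;
# each callback could apply its own endpoint-specific conversion.
bus.subscribe("tracks", lambda d: received.append(("legacy", d)))
bus.subscribe("tracks", lambda d: received.append(("new", d)))

bus.publish("tracks", {"track_id": 7})
print(received)
# [('legacy', {'track_id': 7}), ('new', {'track_id': 7})]
```

The decoupling shown here is what lets each endpoint define its standards in isolation: the publisher never knows which formats its subscribers ultimately require.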
Chad Trytten is founder/CEO at Spark Integration Technologies. Contact him at [email protected]