Military Embedded Systems

MOSA systems: The benefits of deploying a datacentric architecture

Story

March 21, 2023

Andre Odermatt

Real Time Innovations (RTI)

U.S. Air Force photo/Staff Sgt. Joshua King

A Modular Open Systems Approach (MOSA) – the method recommended by the U.S. Department of Defense (DoD) for the implementation of open systems – uses a decoupled system architecture that enables components to be incrementally added, removed, or replaced throughout the life cycle of a platform, providing opportunities for enhanced competition and innovation.

The U.S. Department of Defense (DoD) recognizes the importance of data and treats it as a strategic resource. A datacentric architecture is uniquely suited to that goal because it not only decouples systems from one another, but also focuses them on the data itself, supporting both MOSA and the DoD data strategy. A datacentric architecture makes data the common element that systems act on and consume, while also producing data that other systems can use, as when uncrewed aircraft system (UAS) payloads generate ever-increasing amounts of intelligence, surveillance, and reconnaissance (ISR) data.

A datacentric architecture can be achieved by using open standards, which are at the core of MOSA. Current DoD standards, including the Future Airborne Capability Environment (FACE) Technical Standard, already embrace a datacentric architecture by emphasizing the importance of data. For example, the FACE approach encourages the use of a data model and requires FACE Units of Conformance (UoCs) to have a well-defined data model that can provide the structure to communicate through the Transport Services Segment (TSS) of the FACE architecture.

A datacentric architecture pairs naturally with a publish-subscribe architecture based on the open Data Distribution Service (DDS) standard, which likewise emphasizes the importance of data. Individual systems declare their intent to subscribe to data or advertise the data they have to offer. The publish-subscribe infrastructure then matches publishers with subscribers and makes sure the data gets to the right place at the right time. New participants can join at runtime without code modifications, enabling rapid updates and technology refreshes on the existing data architecture.

Datacentricity and the system

Most software developers and system architects take an application-focused view when defining systems. They start by defining the applications needed in a system (e.g., a GUI, processing, and the like). The next step is defining what the interfaces should be and what each application needs. The interfaces keep evolving as new functionality is added, which in most cases requires additional information. However, what really matters is the data, not the application; the application is simply a way to display the data of interest. The same data is just as important when it is displayed in a different application. For example, revenue numbers can be viewed in Excel, Word, Notepad, Vi, etc. The way an application displays the data may make it easier for the user to understand, but the key information lives in the data, not in the application. (Figure 1.)

[Figure 1 | An application-centric approach to a system is shown.]

In contrast, datacentricity puts data at the center of the architecture; in other words, data is the primary asset of the system. (Figure 2.)

[Figure 2 | A datacentric approach to architecture is shown.]

Recognizing the importance of data, the FACE Technical Standard requires a data model, which provides a standard method for defining the data format and meaning for information-sharing between software components (e.g., FACE Units of Portability (UoPs)). The FACE data architecture is a key component of the FACE approach and is essential to designing systems and developing interoperable UoPs.

The DoD is rapidly transitioning to a datacentric enterprise. Datacentricity enables decision dominance at the speed of war and supports DoD efforts such as Joint All-Domain Command and Control (JADC2), which links sensors, command and control (C2), and effector systems across different operational commands, from the tactical edge to the cloud. This aligns with the DoD Data Strategy, announced in September 2020, which directs DoD leaders to evolve all DoD assets into datacentric assets and to treat data as a weapon system.

In a datacentric system, the design starts with the data model: What data do I need, and how will it be represented? Different components access and communicate through that shared data. Access rules can be put in place so that only authorized and authenticated users can reach the data. Because the data schema is defined and all applications act on the same data, the integration effort for new components is minimal and in most cases no data transformation is required. A familiar example of a datacentric architecture is a database: a common schema is created and accessed through a defined protocol. Another advantage of this approach is that there are no dependencies between applications, as applications never communicate with each other directly; all communication happens by accessing (reading or writing) the data.
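
As a simple illustration of what "starting with the data model" can look like, the sketch below defines a hypothetical topic type for a UAS sensor reading. In a DDS-based system this schema would normally be written in OMG IDL (or drawn from a FACE or UMAA data model) and run through a vendor code generator; the hand-written C++ struct here is only a stand-in to show the idea, and the type name and fields are invented for this example.

```cpp
// Illustrative stand-in for an IDL-defined topic type (names are hypothetical).
// In practice the schema lives in OMG IDL and a code generator produces the
// language bindings, typically with accessor methods rather than public fields.
#include <cstdint>
#include <string>

struct SensorReading {
    std::string sensor_id;   // which payload produced the reading (a key field in IDL)
    int64_t     timestamp;   // time of the measurement, units agreed in the data model
    double      value;       // the measured quantity itself
};
```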

For a real-time system, it is not practical to exchange data by writing to and reading from a central database. In addition to the latency, the database server would be a single point of failure. To facilitate communication in a real-time system, the DDS standard, managed by the Object Management Group (OMG), was developed. DDS not only supports a datacentric architecture; it also uses a simple publish-subscribe paradigm. As established earlier, a datacentric approach focuses on data and has a data model. DDS requires a data model and is aware of the data being exchanged. In addition, the publish-subscribe paradigm, combined with built-in discovery, allows new applications to publish or subscribe to any of the data defined in the data model.
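
A minimal publisher built on the OMG DDS API for modern C++ (the ISO C++ PSM, implemented by products such as RTI Connext) might look like the sketch below. It assumes a SensorReading type generated from the IDL data model, with the accessor-style setters such generators typically produce; the topic name and field values are illustrative. Note that the application never names a destination: discovery matches it with any subscriber of the same topic.

```cpp
#include <dds/dds.hpp>          // OMG DDS ISO C++ PSM header (vendor-provided)
#include "SensorReading.hpp"    // hypothetical type generated from the IDL data model

int main() {
    // Join DDS domain 0; built-in discovery finds matching readers automatically.
    dds::domain::DomainParticipant participant(0);

    // The topic name plus the data type form the shared interface between components.
    dds::topic::Topic<SensorReading> topic(participant, "SensorReadings");

    dds::pub::Publisher publisher(participant);
    dds::pub::DataWriter<SensorReading> writer(publisher, topic);

    // Publish one sample; any number of subscribers may receive it.
    SensorReading sample;
    sample.sensor_id("uas-payload-1");
    sample.value(42.0);
    writer.write(sample);
    return 0;
}
```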

The benefit of a publish-subscribe system is that the application does not need to know where to send information or where to request it from. To illustrate this, consider a classic messaging system such as email: If I must email a status report every week, I need to know whom to send it to and list all the recipients. If a new person becomes interested in the weekly status email, that person has to let me know so I can add the address to future status-report emails.

Publish-subscribe works more like social media, where I post my status and whoever follows me receives the update. If a new person is interested in my status, that person can follow (subscribe to) it and automatically get the updates; I do not need to know how many followers I have. All I do is post (publish) my status update. DDS has another advantage: Because it is datacentric, I can apply filters on any of the data elements. Instead of being notified of every new status report, I get updates only when my filter criteria are met, that is, only the status reports that contain relevant information. The filter criteria can be changed at any point if the data of interest changes.
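
A subscriber can express that filtering directly when it creates its reader. The sketch below, again using the modern C++ DDS API and the hypothetical SensorReading type, creates a content-filtered topic so that only samples whose value field exceeds a threshold are delivered to the application; the filter expression can later be changed without touching the publisher.

```cpp
#include <dds/dds.hpp>
#include <iostream>
#include "SensorReading.hpp"    // hypothetical IDL-generated type

int main() {
    dds::domain::DomainParticipant participant(0);
    dds::topic::Topic<SensorReading> topic(participant, "SensorReadings");

    // Only samples matching this SQL-like filter expression reach the reader.
    dds::topic::ContentFilteredTopic<SensorReading> filtered(
        topic, "HighSensorReadings", dds::topic::Filter("value > 100"));

    dds::sub::Subscriber subscriber(participant);
    dds::sub::DataReader<SensorReading> reader(subscriber, filtered);

    // Simple polling for illustration; a WaitSet or listener is typical in practice.
    for (const auto& sample : reader.take()) {
        if (sample.info().valid()) {
            std::cout << sample.data().sensor_id() << ": "
                      << sample.data().value() << std::endl;
        }
    }
    return 0;
}
```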

The Data Distribution Service (DDS) standard

Let’s take a closer look at DDS. The DDS standard is a family of specifications that defines the application API as well as the Real-Time Publish-Subscribe (RTPS) wire protocol, enabling a transport-agnostic, open-standards-based capability to rapidly connect applications across a wide range of programming languages, operating systems, and computing platforms. DDS lets the user separate the transport from the application logic. The DDS standard currently provides the following:

  • High-performance, scalable, secure, and datacentric publish-subscribe abstraction.
  • Completely decentralized architecture with dynamic discovery service that automatically establishes communication between matching peers.
  • Rich quality of service (QoS) characteristics for control over every aspect of data distribution, such as data availability, resource usage, reliability, and timing.
  • Interoperable data sharing, platform-independent extensible data modeling, encoding, and representation.
  • Recent extensions for remote procedure calls (RPC), security, resource-constrained devices, web integration, and OPC UA integration.

DDS provides data-sharing controlled by QoS policies. Applications communicate by publishing and subscribing to DDS topics, identified by topic name. Subscriptions can specify time and content filters to receive only a subset of the data being published on a topic. Different DDS domains are completely independent of each other and share no data communication; in the case of UDP, this is achieved by using separate ports for each domain. Figure 3 is an example of DDS communication using a single domain.
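
The sketch below shows, under the same assumptions as the earlier examples, how a reader might select both a domain and a handful of QoS policies: it joins domain 1 (so it never sees traffic from domain 0) and asks for reliable delivery plus a small history cache that late-joining readers can catch up from. The specific policy values are illustrative, not recommendations.

```cpp
#include <dds/dds.hpp>
#include "SensorReading.hpp"    // hypothetical IDL-generated type

int main() {
    // Participants in different domains never exchange data; this one joins domain 1.
    dds::domain::DomainParticipant participant(1);
    dds::topic::Topic<SensorReading> topic(participant, "SensorReadings");
    dds::sub::Subscriber subscriber(participant);

    // QoS policies control delivery: reliable transport, samples retained for
    // late-joining readers, and a history of the last five samples per instance.
    dds::sub::qos::DataReaderQos qos = subscriber.default_datareader_qos();
    qos << dds::core::policy::Reliability::Reliable()
        << dds::core::policy::Durability::TransientLocal()
        << dds::core::policy::History::KeepLast(5);

    dds::sub::DataReader<SensorReading> reader(subscriber, topic, qos);
    // ... wait for and process samples as they arrive ...
    return 0;
}
```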

[Figure 3 | A diagram shows the datacentric Publish-Subscribe approach.]

DDS participants can be on the same machine or across a network. In either case, the application uses the same DDS API for communications. Because there is no need to know or configure IP addresses or consider the differences in machine architectures, the process of adding additional communication participants on any operating system or hardware platform becomes an almost trivial task.

How DDS and datacentricity support MOSA

MOSA is a technical and business strategy for designing affordable and adaptable systems. The five principles of MOSA are:

  1. Establish an enabling environment
  2. Employ modular design
  3. Designate key interfaces
  4. Use open standards
  5. Certify conformance

The first principle, establishing an enabling environment, is really a business practice that supports the program manager in deploying MOSA throughout the acquisition life cycle.

MOSA requires a system architecture that allows major system components, at the appropriate level, to be incrementally added, removed, or replaced throughout the life cycle of a major system platform, affording opportunities for enhanced competition and innovation. A datacentric approach addresses principles two and three: it inherently enables a modular design because all components access a common set of data, which also serves as the interface description. Creating a data model, or adopting a standard one, is the foundation of a successful datacentric MOSA system. This can be a separate domain-specific data model or part of a specification (e.g., UMAA [Unmanned Maritime Autonomy Architecture] or ROS [Robot Operating System]).

Using open standards is another key principle. With open standards, application developers are not locked into a specific technology or vendor; the market decides which technologies are the most viable. Open standards establish protocols and building blocks that help make applications more functional and interoperable, which streamlines product development. Many open standards have gone through multiple revisions and have improved from lessons learned over time, so using them is far more cost- and time-effective than new development. DDS is an example of an open standard that fits well with many MOSA applications, and it has an active and vibrant community continuously working to extend its applicability. The DDS family of specifications currently includes the following:

  • Core specifications (DDS, RTPS, DDS-XTypes™, DDS-Security)
  • Language mappings
  • APIs: C, C++, Java, and others
  • Gateways (XRCE, Web, OPC UA)

[Figure 4 | A timeline of individual data distribution services (DDS) specifications is shown.]

DDS is also used in many other standards including FACE, UMAA, ROS, and GVA. Figure 5 shows some of the standards where DDS is used.

[Figure 5 | Shown: Some of the standards where DDS is used.]

DDS use cases

DDS and its RTPS wire protocol are used in mission-critical applications, connecting embedded systems as a reliable communication layer between sensors, controllers, and actuators. DDS runs many of the most complex systems in the world: It powers the largest energy plants in North America, connects perception to control in autonomous vehicles, coordinates combat management in defense systems, drives a new generation of medical robotics, controls hyperloop and flying cars, provides 24/7 medical intelligence for hospital patients and emergency victims, and more.

At the TSOA-ID 2022 conference, RTI demonstrated interoperability between different native DDS applications (e.g., Unity and Epic Games Unreal Engine) and standards that use DDS (e.g., FACE, ROS). The demonstration consisted of two drone flight simulators, one built on the Unity gaming engine and the other on the Epic Games Unreal Engine. The simulators publish location, video, and lidar data using the Sensor Open Systems Architecture (SOSA) Interaction Binding through a SOSA-aligned plugin. The lidar data is displayed using RViz, a ROS 2 application, while location is fed into different cockpit display systems (CDSs). Figure 6 shows the architecture of the demonstration. The integration of all these components was made possible by using a common data model; new components can easily be added, and variations of the demo have been shown at other trade shows.

[Figure 6 | The architecture of the TSOA-ID native DDS interoperability demo is shown.]

MOSA: No more vendor lock-in

MOSA recognizes that to reduce cost and time to market, and to develop better, more flexible solutions, the industry must embrace open architectures. The era of black-box, vendor-locked technologies has ended, and the industry must embrace open standards in all types of software, ranging from commercial off-the-shelf to open source. MOSA is not just an innovative idea; it is a legal requirement (Title 10 U.S. Code 2446a.(b), Sec 805 of DoD Acquisition Regulations) intended to achieve the development of affordable, adaptable systems.

It has been a few years since the DoD Data Strategy was released, and the DoD continues to make strides toward its goal of transforming into a datacentric agency. To support that transition, new systems also need to embrace a datacentric strategy. Open standards such as DDS will be an essential part of supporting the DoD’s transition to a datacentric world. A datacentric open standard such as DDS provides the foundation for designing, developing, and deploying next-generation, distributed, mission-critical systems, and it supports the MOSA goals of lower life cycle costs in deployment and maintenance, faster time to market, and improved reliability in crewed and uncrewed systems.

Andre Odermatt is a principal application engineer at Real-Time Innovations (RTI). Before joining the Products & Markets group at RTI, Andre was a field application engineer at RTI for five years, working closely with customers on distributed systems; in that role he supported multiple programs, including the Ship Self Defense System (SSDS), the General Atomics Ground Control Station, and Boeing AWACS. Andre has over 30 years of experience with embedded systems, distributed applications, and communication protocols. He is an active participant in the FACE Consortium Transport Services Segment (TSS) technical working group and chairs the FACE Integration Workshop Standing Committee (IWS).

RTI         https://www.rti.com/en/

Featured Companies

Real Time Innovations (RTI)

232 E. Java Drive
Sunnyvale, CA 94089