Net-centric security and CWE
September 04, 2012
The Common Weakness Enumeration (CWE) lists the common-mode failures that have led to security breaches in numerous software systems. It can be used to help improve the robustness of critical networks and infrastructure to help thwart cyber attacks.
Net-centric warfare uses a system of networks to share information within a combat theatre. This advanced communications network enhances situational awareness with the aim of improving mission effectiveness.
The network that underpins such a system can expose a significant attack surface to the enemy, raising serious security concerns. After looking at the measures that need to be adopted to ensure secure systems, the following examines the Common Weakness Enumeration (CWE) and demonstrates how it can be used to enhance security in battle communications.
Internal networks are often targeted as a means of gaining access to confidential information. In 2008 classified and unclassified systems within a U.S. Military Central Command network were found to have been compromised. Investigations showed that a military laptop was infected by a portable USB drive. This infection then spread through network connections to secure areas and is believed to have been used to transfer significant amounts of data to a third party. Infected machines are still being found four years after the initial attack.
External attacks are also attempted. For example, network vulnerabilities within one or more contractors working on the Joint Strike Fighter were exploited to gain access to sensitive project data. The attack appears to have started in 2007 but was not detected until 2009. It involved the installation of sophisticated spyware within the development environment, which was used to transfer terabytes of data to a third party. The exact nature of the compromise is unknown, as the data was heavily encrypted before being sent.
It is not always easy to understand why a system has been attacked, and it is possible that some attacks are accidental. The systems used to control the Predator and Reaper drone fleet were recently found to have been infected with a virus containing a key-logger payload. The key-logger recorded the actions of drone pilots while on active service but did not affect system functionality. It appears that no data was lost, though this might simply be because of the lack of exploitable external network connection. The network infection is proving hard to eradicate and has been found to have spread to classified and unclassified systems. It is thought that the virus was unintentionally introduced by a portable USB drive used to transfer map and other data into the control system.
Loss of sensitive information is not the only possible outcome of an attack. A virus detected within a military air traffic control system has the potential to allow a third party to render radar data untrustworthy, leading to confusion or asset loss.
Many of these security concerns arise because of device interconnection within a system-of-systems. It is hard for an attacker to exploit systems that are operated in isolation. However, if they are networked, even if intermittently, the network allows many other systems to be attacked. If they are all based on the same technology, then common security vulnerabilities can be exploited to allow rapid dissemination of malware.
The vulnerabilities exploited are generally related to coding or requirements errors. For example, a buffer overrun triggered by invalid network data might be used to trick a system into running arbitrary code injected by an attacker. According to research by the National Institute of Standards and Technology (NIST), 64 percent of software vulnerabilities stem from programming errors.
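As a minimal sketch of the weakness just described (buffer copy without checking the size of input, catalogued as CWE-120), consider a hypothetical message handler that copies a network-supplied payload into a fixed-size buffer. The function name and message size here are illustrative, not from any real system:

```c
#include <stddef.h>
#include <string.h>

#define MSG_MAX 64  /* illustrative fixed buffer size */

/* Hypothetical handler for an incoming network payload. Copying without
 * checking 'len' (a bare memcpy driven by an attacker-controlled length
 * field) is the classic buffer overrun: adjacent memory gets overwritten,
 * potentially redirecting execution to injected code. Validating the
 * length against the destination size first closes the hole. */
int handle_message(char *dst, size_t dst_size, const char *payload, size_t len)
{
    if (payload == NULL || len >= dst_size) {
        return -1;          /* reject missing or over-long input */
    }
    memcpy(dst, payload, len);
    dst[len] = '\0';        /* keep the buffer a valid C string */
    return 0;
}
```

Rejecting the over-long message, rather than silently truncating it, also makes the attack attempt visible to calling code that can log or report it.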
CWE is a strategic software assurance initiative run by the MITRE Corporation under a U.S. federal grant, cosponsored by the National Cyber Security Division of the U.S. Department of Homeland Security. It lists the programming errors that have led to security failures within systems with the aim of improving the software assurance and review processes used to ensure connected devices are secure. Enumeration of the vulnerabilities in this way allows coding standards to be defined to target them so that they can be eliminated during development.
The CWE database
The CWE database contains information on security weaknesses that have been proven to lead to exploitable vulnerabilities. These weaknesses could be at the infrastructure level (for example, a poorly configured network and/or security appliance), the policy and procedure level (for example, sharing usernames and/or passwords), or the coding level (for example, failing to validate data). The CWE database holds information on actual, not theoretical, exploits, and so captures only those coding weaknesses that have been exploited in the field.
Benefits of CWE compatibility
CWE should be used within the development environment to ensure that known vulnerabilities are not introduced into the software. Many of the issues that have been identified are amenable to automatic detection by static and/or dynamic checking tools. To obtain maximum benefit, such tools should be used as early as possible in the development process, as trying to add security in at the last minute is very unlikely to succeed. The adoption of other tool-enforced security standards, such as the CERT C Secure Coding Standard, complements this objective and enhances the security characteristics even further.
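To illustrate the kind of rule such a standard enforces, the sketch below follows the style of CERT C rule INT30-C (ensure that unsigned integer operations do not wrap). The function is a hypothetical example, not taken from the standard itself:

```c
#include <limits.h>

/* A naive 'a + b' on unsigned operands silently wraps around; when the
 * result feeds a length or offset calculation, that wrap can become an
 * exploitable under-allocation. Checking the operands before adding makes
 * the failure explicit instead of silent, which is the pattern a
 * CERT-C-style static checker enforces. */
int checked_add(unsigned int a, unsigned int b, unsigned int *sum)
{
    if (b > UINT_MAX - a) {
        return -1;      /* addition would wrap: report an error */
    }
    *sum = a + b;
    return 0;
}
```

A static analysis tool can flag every unchecked arithmetic expression of this kind across a code base, which is far more reliable than expecting developers to spot each instance by hand.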
Ensuring system security
Many security vulnerabilities can be traced to coding errors or architectural flaws and are generally hard and/or expensive to fix once a system has been deployed. Unfortunately, many developers are only interested in the development and testing of core application functionality. Security is rarely tested with the same rigor.
Security needs to be treated as one of the most important attributes of a system. Security requirements must be included up front in the system design and implemented during normal development if the final system is to be secure. CWE can be used to help identify appropriate high-level security requirements.
Figure 1 illustrates the attributes associated with system quality. By focusing on these measures at all phases of the software development life cycle, developers can help eliminate known weaknesses.
Figure 1: System quality is determined by many attributes, including those relating to security.
To prevent the introduction of security vulnerabilities, a development team needs to have a common understanding of the security goals and approaches to be taken during development. This should include an assessment of the security risks and the establishment of the secure coding practices that are to be used. Once again, CWE can help during coding as it highlights the constructs that have led to security compromises in other systems, reminding developers where they need to take extra care during implementation.
The risk assessment determines the quantitative and qualitative security risk for the various system components in relation to a concrete situation and recognized threat. This information is used to reduce security vulnerabilities in the areas that will have a high impact if their security is breached. The assessment results in the development of a set of security control and mitigation strategies that will form the core of the system security requirements.
These security requirements become part of the same development process used for all other requirements. Detailed at the outset, the security requirements are then traced through the design, coding, and testing stages to ensure fulfillment of the initial requirements. These linkages form documentation that demonstrates how the final system meets the security objectives laid down at the beginning.
CWE: Not a coding standard
CWE is a “do not get caught by” list, not an actual coding standard. However, coding standards can be used alongside it to ensure that the CWE issues are not present in a project. Compliance with these standards helps ensure that project security goals are achieved, especially as many security issues result directly from the coding errors that they target. Additionally, compliance with a recognized standard helps to demonstrate that contractual security obligations have been met.
Compliance with the chosen coding standard (or standards) should be a formal process (ideally tool-assisted, but manual is also possible), as it is virtually impossible for a programming team to follow all the rules and guidelines throughout the entire code base.
Adherence to the standards is a useful metric to apply when determining code quality.
Static and dynamic testing should be considered essential practices. CWE-compatible static analysis tools systematically enforce the chosen standard across all code. Dynamic analysis verifies that the code does not contain runtime errors, including those that could be exploited to compromise security.
If a claim is to be made that a system addresses the weaknesses catalogued by CWE, then evidence must be provided to support that claim. Traceability from requirements to the design, verification plan, and resulting test artifacts (which makes it possible to show which test results prove that a particular security requirement has been met) can be used to support such a claim.
Figure 2 illustrates how code can be traced back to requirements and the related test cases. Such a graphical representation makes it easy for developers to immediately spot unnecessary functionality (code with no requirement), unimplemented requirements, and failed or missing test cases.
Figure 2: LDRA TBmanager enables users to view traceability to source code for individual requirements and test cases.
Adoption of a security standard that targets the CWE vulnerabilities allows security quality attributes to be specified for a project. Incorporation of security attributes into the system requirements means that they can then be measured and verified before a system is integrated into a network, significantly reducing the potential for in-the-field exploitation of latent security vulnerabilities by the enemy.
The use of a qualified and well-integrated Application Life-cycle Management (ALM) tool to automate testing, collation of process artifacts, and requirements traceability dramatically reduces the resources needed to produce the documentation required by certification bodies. It minimizes the workload for developers and allows managers to efficiently track progress.
It is clear that system developers need to rethink their assumptions if net-centric warfare systems are to be secured against information leaks and remote manipulation. Leveraging the knowledge contained within CWE and choosing to develop and test software with the aid of CWE-aware tools represent significant steps forward. Companies that incorporate CWE and embark on a process of continual improvement help ensure that only dependable, trustworthy, extensible, and secure systems are delivered to those who put their lives on the line to protect our countries.
The CWE list and further information on CWE are available on the MITRE website at http://cwe.mitre.org.
Chris Tapp is a Field Applications Engineer at LDRA with more than 20 years’ experience in embedded software development. He graduated from the University of Durham in 1987 and has spent most of his career working within the automotive, industrial control, and information technology industries, mainly as a self-employed consultant. He is chairman of the MISRA C++ working group and an active member of the MISRA C working group. He joined LDRA in 2007 and specializes in programming standards. Chris may be reached at [email protected].
LDRA 650-583-8880 www.ldra.com