Military Embedded Systems

Software warranties -- A new era?


September 23, 2009

When anyone buys a television, a car, or even a ballpoint pen, the purchaser expects it to work properly. This assumption is backed by the legal machinery of warranties and consumer protection legislation. So software carries the same protection, right? Wrong! Historically, commercial software products have been sold with absolutely no claims that they are fit for purpose or free of vulnerabilities. But that might be changing, thanks to several factors.

For a high-ticket item like a television, an explicit warranty covers repair and replacement for a limited period. So, if a new TV is turned on and the picture has no reds, the consumer has legal recourse. And even when there is no written warranty, there are typically consumer protection laws in place that give the purchaser some rights.

However, by tradition, software is sold without any real guarantees. If there is any guarantee at all, it is generally limited to assuring that the media are uncorrupted and that the program can be read successfully. Moreover, many disclaimers say, in effect, that users can't expect the software to work reliably.

And typical software most certainly does live down to users’ low expectations. The word “virus” suggests some kind of biological agent that users can fight against, but can’t fully expect to insulate themselves from. Thus, the idea that it is an unacceptable technical flaw for an operating system to be vulnerable to such attacks is a foreign concept to both manufacturers and users of such systems.

Software reliability: Is it possible?

So why do we accept this state of affairs? One viewpoint is that software is so complex that it is impossible to guarantee reliability; thus, it is “normal” and “reasonable” for big systems to be full of bugs (see Figure 1). In a legal case in which I testified as an expert witness years ago (Selden v. Honeywell, 580 F.Supp. 474), Honeywell was being sued on the grounds that it had allegedly committed fraud by distributing an operating system so full of bugs that it could not be called an operating system. In this particular case, the software in question was in beta test and was explicitly not covered by a warranty. But the allegation of fraud was legally novel in a software context.


Figure 1: One viewpoint is that software is so complex that it is impossible to guarantee reliability; thus, it is “normal” and “reasonable” for big systems to be full of bugs.





I testified on Honeywell’s behalf, along the lines of: “Judge, we know that there were lots of bugs, but that’s normal. It’s ‘industry standard practice.’” The judge was not convinced and eventually ruled against Honeywell, but the case did not rise high in the court hierarchy so it had limited industry effect. “Industry standard practice” is indeed a defense, but judges are free to decide that a practice is unacceptable, and perhaps we need more judges like the one in the Honeywell case who just refused to understand our “reasonable” position.

The idea that all big software always has serious bugs is a common one, expressed, for example, by faculty members in computer science as well as by other industry experts. I recently attended a talk by a leading law professor in the field of product liability, who argued that the industry needs a different legal standard for software when it comes to product safety and warranties, since clearly software can’t be held to the same standards as other goods in the marketplace. After all, everyone knows it is technically impossible to produce reliable software.

But is this really true? Are we really incapable of producing software that works reliably? The answer is very clearly no. We do, in fact, know how to write software that is remarkably reliable. Take the case of modern aircraft, controlled by sophisticated, large-scale software (see Figure 2). If there is a serious bug in such software, planes could crash and people could die. Yet, thanks in part to the requirements imposed by the commercial avionics certification standard DO-178B, there has never been a death on a commercial airliner that could be attributed to a software error.


Figure 2: Thanks in part to the requirements imposed by the commercial avionics certification standard DO-178B, there has never been a death on a commercial airliner that could be attributed to a software error.




I am not saying that software certified against DO-178B is perfect. In fact, there have been some close calls due to errors (for example, the incident with a Malaysia Airlines jet in 2005 caused by a bug in an upgrade of the air data inertial reference unit software). Still, nothing fatal, and software is definitely doing better than hardware when it comes to making planes reliable. Furthermore, it is clear that such software is covered by warranties. If a plane crashes, Boeing or Airbus is not going to get away with saying, “We’re very sorry, but we are not liable. We have determined the crash was due to buggy software, and if you consult page xxx of the contract, you will see that we disclaim all software bugs.”

Its successful record on the commercial side is one of the reasons that DO-178B is being applied to military systems. As an example, critical systems on the USAF’s C-130 AMP aircraft have been certified to Level A, the most demanding level of DO-178B. And clearly other kinds of military systems, for example weapons control, must function correctly or the consequences could be catastrophic.

Software reliability: Is it practical?

Apparently, the technology for producing reliable software exists. Therefore, the question is: Why don’t we use it more extensively?

The answers heard from many software vendors reflect the inertia of a long-established industry. What sells a typical commercial software product are new features, low price, and better performance, while reliability has lower priority. So vendors offer a variety of reasons for treating software differently from (and thus not needing the same warranty protection as) durable goods: The added expense would mean fewer and less frequent products/releases and thus less choice for users. Warranting reliability could also compromise intellectual property. And correctness depends on the environment in which the software is used and on third-party components that the vendor cannot control.

As Yogi Berra might put it, some of these arguments are déjà vu all over again. Years ago, the automobile industry fought against requirements for seat belts, and later air bags, for similar reasons – including manufacturing expense and fear of liability if the equipment malfunctioned – until these safety features were legally mandated. Then somehow the industry solved these issues.

Vendors are correct that it is more expensive to warrant software reliability, but only if one tries to achieve reliability a posteriori by fixing bugs until an acceptable tolerance level is achieved. If instead one designs the product by considering reliability (and safety and security) up front, then uses appropriate development tools and programming languages, the proposition is considerably more cost effective. When judged over its full life cycle, software can cost less to develop (and maintain and extend) because of, and not despite, its reliability.
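The “up front” approach described above is often realized through design by contract: preconditions and postconditions are written into the code itself and checked mechanically (in languages such as SPARK/Ada they can even be proven correct statically, before the software ever runs). As a rough, language-neutral sketch of the idea, here is what such a contract might look like in Python; the function, its name, and its bounds are purely hypothetical, not drawn from any real avionics code base.

```python
def set_throttle(level: float) -> float:
    """Command engine throttle as a fraction of maximum thrust.

    Hypothetical example illustrating design by contract: invalid
    inputs are rejected at the interface instead of propagating
    silently and surfacing later as a mysterious in-service bug.
    """
    # Precondition: the caller must supply a value in [0.0, 1.0].
    if not (0.0 <= level <= 1.0):
        raise ValueError(f"throttle level {level} outside [0.0, 1.0]")

    commanded = round(level, 3)  # quantize to the actuator's resolution

    # Postcondition: the commanded value must still respect the bounds.
    assert 0.0 <= commanded <= 1.0
    return commanded
```

The point is not the specific checks but where the cost is paid: stating and enforcing the contract at design time is far cheaper than diagnosing the downstream failure it prevents.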

And as for the argument that software is essentially different from durable goods and thus should not be subject to similar warranty protection, it is ironic that software vendors are making the opposite point when they apply for software patents.

The claims that reliability need not come at the expense of development cost and that warranty protection for software is practical and appropriate are not just theoretical. Praxis High Integrity Systems, a UK company specializing in safety-critical and high-security software, demonstrated in the NSA-sponsored Tokeneer project that reliable and ultra-secure systems can be developed in a cost-effective manner. And Praxis stands behind its software by offering a warranty to fix any bugs for free. Admittedly, this is far from “industry standard practice.” Rod Chapman of Praxis told me that at one meeting, people actually laughed when he suggested that this be done more widely.

Their laughter, however, might be short-lived. Invisible market forces rarely succeed in bringing consumer-oriented protection to an industry, hence initiatives such as the European Commission’s proposal to extend product warranty coverage to encompass software.

Time for a change

Many years ago, I was frustrated that the judge did not understand “industry standard practice.” Now looking back, I think he had the right idea. Yes, of course if we insist on software working, it will mean that new features don’t get implemented quite so quickly, but think about it: Wouldn’t you prefer a product that always worked 100 percent reliably, even if it had a few less-fancy features? It’s time to end the exemption that software makers seem to feel is reasonable. The question is not whether we can afford to update our attitudes and policies on software warranties, but rather: How can we afford not to?

Dr. Robert Dewar is cofounder, president, and CEO of AdaCore; he has also had a distinguished career as a professor of Computer Science at the Courant Institute of New York University. He has been involved with the Ada programming language since its inception in the early 1980s. As codirector of both the Ada-Ed and GNAT projects, Robert led the NYU team that developed the first validated Ada compiler, and he is a principal architect of AdaCore’s GNAT Ada technology. Robert has also served as an expert witness in several federal trials, and has testified in cases involving copyrights, patents, and software contracts. He can be reached at [email protected].

AdaCore 212-620-7300

