Safety or security? An embedded wake-up call from Toyota's misfortunes
April 26, 2010
First, a Camry allegedly "surged" without any driver input. Then consumer complaints emerged concerning the Corolla, Prius, and even some Lexus models. All of this makes one thing clear: Toyota's drive-by-wire vehicles - and indeed all automobiles - should be regarded as safety-critical systems, just as any Boeing 777 is. That fact deserves particular attention from embedded designers.
It started with the Camry that allegedly “surged” with no apparent accelerator input from the driver. Before long, the media reported customer complaints against many Toyota products, from Corollas to the vanguard Prius and even some Lexus models. As we went to press, the only certainties about Toyota’s drive-by-wire vehicles were that the Feds are poking around, that Toyota has lost face (and revenue), and that digital electronics in vehicles need to be thought of the same way as the avionics controlling a Boeing 777 at 35,000 feet. Safety-critical systems – which certainly ought to include automobiles – demand special attention from embedded developers.
I’ve got my own special story to tell about how software in a vehicle can ruin one’s day – it damn near killed me. I owned a 2003 Chevy Silverado pickup with all the digital toys. The memory feature on my electric seat frequently went berserk and motored me all the way forward, often without warning, jamming me against the steering wheel at 60 mph. It turns out a software fix for the body control module was needed, except GM’s engineers were busy doing other things for a full 12 months. In the meantime, I yanked the wire (not the fuse, which would’ve disconnected the mirrors and other essential electrics). It’s estimated that several hundred microcontrollers or microprocessors are found in every modern vehicle. When code goes haywire, does the vehicle end up in a ditch – or worse?
On the other hand, with so much software controlling life’s everyday conveniences like cars, cell phones, medical monitors, and the world’s banking systems atop the Internet, have we yet come to grips with embedded security? Worrying about the latest DDoS attack on a server is commonplace – just ask the political leadership of Estonia how their banking system was shut down by foreign hackers. But you should also be terrified at what a bad actor could do with access to your life via an iPhone – or your automobile.
According to Robert Dewar, Emeritus Professor of Computer Science at New York University and president of AdaCore, there’s a big difference between safety and security: a secure embedded system must be shown to meet the security constraints of an established profile, while a safety-critical system must always work. For example, the NSA’s Tokeneer Project established the requirements for a secure biometric door entry system and used COTS software with formal methods to establish and prove compliance with the security profile (details, including all the code written in SPARK Ada: www.adacore.com/tokeneer).
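To make “assurance against a security profile” concrete, here is a minimal Python sketch of the idea (all names here are hypothetical illustrations, not taken from the actual Tokeneer code, which is written in SPARK Ada): an entry-control decision that can be checked against every constraint the profile states.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SecurityProfile:
    """Constraints the system must demonstrably satisfy (illustrative only)."""
    required_clearance: int   # minimum clearance level needed to open the door
    require_biometric: bool   # whether a fingerprint match is mandatory

def may_unlock(profile: SecurityProfile,
               user_clearance: int,
               fingerprint_ok: bool) -> bool:
    # The security property: the door may unlock only if the biometric check
    # passed (when the profile demands one) AND the user's clearance meets
    # the profile's minimum. Any other combination must refuse entry.
    if profile.require_biometric and not fingerprint_ok:
        return False
    return user_clearance >= profile.required_clearance
```

In SPARK Ada, a property like this would be written as a contract on the subprogram and discharged by a proof tool, rather than argued from tests alone.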
In the defense industry, myriad standards exist for developing secure software and systems, including separation kernels in operating systems, Multiple Independent Levels of Security (MILS), and time- and space-partitioned environments per ARINC-653, to name just a few. But are these ever used in commercial applications? Doubtful.
Similarly, for safety-critical systems such as avionics, RTCA’s DO-178B (software) and DO-254 (hardware) – the standards the FAA relies on for certification – establish criticality levels and rigorous verification objectives intended to make software and system behavior thoroughly predictable. It also follows that every safety-critical system must be secure, because unauthorized access could compromise safety.
My hope is that secure or safety-critical applications – including smartphones and automobiles – would consider some of our industry’s COTS products. I’m sure Wind River or Green Hills will call to tell me how VxWorks-178 or INTEGRITY-178 is used in a non-military application, but we rarely hear about these installations. It’s also likely that the heavy requirements management, formal methods, and NIAP certifications from our world might be “overkill” in civilian apps.
Still, there’s hope. Open Kernel Labs, a company I first became familiar with last year, sells OS kernels for high-volume, low-cost platforms including cell phones. Their OKL4 kernel has doubled in volume, from 250 million handsets in 2009 to an estimated 500 million for 2010. But it’s not just any old kernel – OKL4 creates “secure cells” in which handset applications run. If one app dies or goes rogue, it can’t spill over and affect the other cells. This is essentially the original Green Hills concept of a “padded cell,” the same idea that underlies ARINC-653 and other partitioned military environments. Interestingly, OK Labs is unfamiliar with the DoD concepts we all know so well – but the company is clearly following the same path.
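The isolation idea itself can be sketched without any vendor API. Assuming ordinary OS processes as a stand-in for OKL4’s cells (a real partitioning kernel enforces much stronger, hardware-backed separation than this), the point is simply that a fault in one partition cannot corrupt its neighbor:

```python
import multiprocessing as mp

def rogue_app() -> None:
    # A misbehaving application: it crashes inside its own "cell."
    raise RuntimeError("rogue application fault")

def dialer_app(q) -> None:
    # A well-behaved application running in a separate "cell."
    q.put("dialer completed normally")

def run_partition_demo():
    q = mp.Queue()
    rogue = mp.Process(target=rogue_app)
    dialer = mp.Process(target=dialer_app, args=(q,))
    rogue.start()
    dialer.start()
    rogue.join()
    dialer.join()
    # The rogue process died with a nonzero exit code, yet the dialer's
    # result arrives intact: the failure never crossed the boundary.
    return rogue.exitcode, dialer.exitcode, q.get()

if __name__ == "__main__":
    print(run_partition_demo())
```

Each process has its own address space, so the crash in `rogue_app` is contained by the operating system – the same spatial-partitioning principle, writ small, that ARINC-653 environments enforce with scheduling and memory guarantees.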
As we went to press, OK Labs was scheduled to announce a secure VoIP application running in a secure “hypercell” on a single-core cell phone processor, virtualizing the applications while keeping them separate. This follows the company’s Nirvana phone concept – another partitioned application that extends cloud computing to the handset. I am hopeful that as OKL4’s security gains traction, the commercial world will wake up to the risks, take a glance at our market, and find some inspiration.
But on the safety-critical side, I’m unaware of any U.S.-based civilian efforts mirroring DO-178B. I suspect if anything’s happening, it probably will emerge out of Europe’s mass transit industry. (If you know of civilian safety-critical activities, drop me a line at [email protected] and I’ll be sure to print it.)
Safety and security are both hugely complicated subjects, and ones we routinely cover in Military Embedded Systems. Yet ample standards, resources, and tools exist, ranging from MILS, ARINC-653, DO-178B, and DO-254 all the way to static analysis and code verification suites. Any of these could improve tomorrow’s generic, nondefense embedded systems. With Intel’s goal of “15 billion mobile Internet devices” by 2015 – including automobiles – how much longer can we ignore the issues of security and safety?