Two trends make embedded systems more likely targets of exploitation. First, more embedded systems are using PC-based operating systems, cross-platform run-time environments, and common software components. The result is less diversity. Second, embedded systems are more likely to be network-enabled, which opens up remote attack vectors. Although there are thousands of times more embedded systems in the world than personal computers, most of the public knowledge of viruses, worms, and other exploits is focused on the latter. Embedded systems, however, are becoming a more attractive target to hackers because now these systems are pervasive, increasingly homogenized, and remotely exploitable.
The high cost of exploitation
Network security for an IT department is an exercise in risk analysis and mitigation. Software will always have defects that lead to vulnerabilities. Some of these defects will be exploited, resulting in substantial financial loss. For example, it cost companies an average of $475,000 to expel the Blaster worm.2 Larger organizations reported losses over $4 million. Clearly, the cost of exploitation can be high.
The lack of effective network security is no longer a blameless matter. The Sarbanes-Oxley Act of 2002 and the Health Insurance Portability and Accountability Act of 1996 (HIPAA) have made corporate officers personally accountable for the security of their customers' private information. Exploitation of a vulnerability can leak personal information, and that liability is a principal reason why a new corporate position, the chief information security officer (CISO), has become prevalent. The CISO is looking to keep the CEO out of jail.
At the 32nd Computer Security Institute Conference, Qualys CTO Gerhard Eschelbeck presented "The Laws of Vulnerabilities."3 Qualys gathered a statistically significant amount of field data on software vulnerabilities and how they've been exploited. Qualys analyzed this data set from 32 million live network scans and distilled it into six "laws" that can help organizations understand how vulnerable they are and how to best mitigate exploitation.
One law was particularly interesting: 90% of vulnerability exposure is caused by 10% of critical vulnerabilities. Most exploitations target a small number of vulnerabilities. What makes a vulnerability attractive for exploitation? Typically it is widely deployed and remotely exploitable. What are the traits of software components with frequently exploited vulnerabilities? They tend to have a large code volume and are sometimes complex to configure correctly. They often also have a long history of vulnerabilities and exploitations. Software complexity is sometimes described as proportional to the square of the code volume.4 The more complex the software, the more likely it is to have bugs with vulnerabilities that can be exploited remotely. Beware of software components with a large volume of code, especially if you use only a fraction of the capabilities they provide.
An embedded systems developer is engineering a specific product: one species of tree in the forest. The more distinct that product is from the others on the customer's network, the more immune it will be to catastrophic exploitation. This safeguard might spare a product company a public relations disaster.
In general, it's far more probable that an attacker will exploit the software components common to many different devices rather than the application code specific to a single product. More devices can be exploited that way. Security is important for product-specific code. However, the choice of the underlying software components has a much larger impact on the customer's risk of exploitation.
The benefits of using operating systems such as GNU/Linux, run-time environments such as Sun's Java Runtime Environment (JRE), and common software components such as OpenSSL are enormous. But their very ubiquity and complexity have made them highly attractive targets. In many cases reasonable alternatives meet the same functional requirements. A BSD variant can, in many cases, replace a GNU/Linux system with minimal effect on the application code. Skelmir's CEE-J virtual machine is an alternative to Sun's JRE. Cryptlib is an excellent security toolkit that can be used instead of OpenSSL.
It's not always easy to use a less-ubiquitous alternative. The effects on a project's schedule can range from negligible to substantial. Embedded systems developers need to carefully consider the tradeoffs. You may need to address a combination of integration, compatibility, maintenance, and porting issues. The learning curve to master an unfamiliar software component can be daunting. The upside is that an alternative may be more efficient, of higher quality, and less complex thanks to its smaller code volume. Certainly, those are traits that embedded systems developers desire for their product's firmware.
Of course, diversity is a moving target. Although it's unlikely, cryptlib may someday be as ubiquitous as OpenSSL and attacked as frequently. Before selecting a software component, run a search for its history of vulnerabilities and exploits. Keep informed about new vulnerabilities by subscribing to security mailing lists or even a vulnerability notification service. You may be surprised by what you learn.
There are other benefits to software-component diversity besides reducing the risk of exploitation. With fewer vulnerabilities, engineers will upgrade software components less frequently and quality assurance will have fewer regression tests to execute. On the customer side, fewer software patches must be rolled out to affected products. Sales can use the resulting lower cost of ownership as a selling point against competitors.
Embedded systems developers can mitigate the exploitation risk dramatically by considering "security through diversity" when selecting software components for their products. Diversity is particularly critical for embedded systems since their software components are generally more opaque to an IT professional than those of an ordinary server. Further, the methods for upgrading most embedded systems are nonstandard and can be time consuming for the IT staff. These factors may delay the deployment of critical patches and increase the risk that unknown vulnerabilities exist on the network.
Security through diversity is not, by itself, a solution to software vulnerability problems. When used as part of a comprehensive security strategy during product development, however, it can be a cost-effective way to mitigate potential exploitation.
Jim Higgins is vice president of Alektrona Corporation, a Rhode Island-based software, firmware, and hardware design and consulting company focused on embedded platforms that leverage the Internet and wireless network connectivity. Prior to Alektrona, Jim was architect for the device management team at American Power Conversion. He has been designing and implementing Internet-enabled embedded systems for 10 years and has been programming for 23 years. You can reach Jim at firstname.lastname@example.org.
1. Australian Department of Environment and Heritage, biodiversity web page: www.deh.gov.au/biodiversity
2. From TruSecure / ICSA Labs, August 29, 2003 as reported on Security Stats.com: www.securitystats.com/sspend.html
3. Qualys, "Research & Development/The Laws of Vulnerabilities": www.qualys.com/research/rnd/vulnlaws/
4. Geer, Daniel, Rebecca Bace, Peter Gutmann, Perry Metzger, Charles P. Pfleeger, John S. Quarterman, and Bruce Schneier. "CyberInsecurity: The Cost of Monopoly: How the Dominance of Microsoft's Products Poses a Risk to Security," independent report published by the Computer & Communications Industry Association. Posted on Cryptome.org, September 27, 2003: http://cryptome.org/cyberinsecurity.htm