March 26, 2003
URL:http://www.drdobbs.com/its-the-cinnabons/its-the-cinnabons-stupid/184415939

It’s the Cinnabons, Stupid!

SD opening keynote links secure software and perilous pastry.

By Alexandra Weber Morales
March 26, 2003

A few days ago, on his flight from the Shenandoah Valley to the Software Development West Conference and Expo in Santa Clara, Calif., security expert Gary McGraw went through the usual security precautions: X-rays of his shoes and luggage, a bomb-check of his laptop and a pat-down. Once on the plane, however, a late passenger rushed in just before the doors closed, proffering a bulging bag of Cinnabons to the flight crew. Gratefully, the entire crew—including the pilots—ate the buns. “I thought, man, that’s the perfect attack: Take out the entire crew with poison Cinnabons,” laughed McGraw in his March 24 keynote speech. “Security is tricky. Bad guys do surprising things all the time. That’s why you have to know what you’re assuming and what happens when what you’re assuming goes away.”

He may be paranoid, but “paranoia is where security guys live,” McGraw admitted. A fundamental mismatch exists between developer goals and security goals, he explained. “Functionality is the number one driver for software developers. Usability is number two. Security is a hassle for them. They also want efficiency, time to market and simplicity.”

On the other hand, McGraw added, security goals include prevention, traceability and auditing, monitoring, privacy and confidentiality, multilevel security, anonymity, authentication and integrity.

The “trinity of trouble,” according to McGraw, CTO of security analysis firm Cigital Inc., is:

  • Connectivity: The Internet is everywhere, and everything is on it.
  • Extensibility: Systems evolve in unexpected ways and are changed on the fly.
  • Complexity: Networked, distributed mobile code is hard.

“Remember the Y2K bug? None of those guys who wrote that Cobol code expected it to run for 30 years!” “That’s job security,” heckled an audience member. “You’re right,” McGraw responded. “If you want a job tomorrow, code yourself up some bugs today.”
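The Y2K joke rests on a concrete coding shortcut: storing years as two digits to save space. A minimal sketch (the function name is invented for illustration) of how the arithmetic breaks at the century boundary:

```c
/* The Y2K bug in miniature: years stored as two digits, as in the
 * old Cobol records McGraw mentions. 1999 is stored as 99 and 2000
 * as 00, so a simple difference across the century boundary is wrong. */
int years_between(int yy_start, int yy_end) {
    return yy_end - yy_start;   /* 99 -> 00 yields -99, not 1 */
}
```

Code written in the 1970s never anticipated inputs from beyond 1999, which is exactly the "what happens when what you're assuming goes away" failure McGraw describes.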

Leaving no technology unscathed, McGraw pierced the belief in a single safe platform—though some are better than others. “C is assembly language on steroids. Read Kernighan and Ritchie on input, Chapter 7, page 164. Don’t do it that way—the sure way to get a buffer overflow—and that’s the bible! When you’re choosing a language, use a typesafe language that controls memory in a way that was invented in 1959.” Java isn’t perfect, though: “Java’s class-loading architecture is flawed and it’s about to change—but please use Java. It beats the heck out of C.”
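McGraw's jab at K&R-style input refers to unbounded reads such as `gets()`, which have no way of knowing the destination buffer's size and will happily write past its end. A sketch of the contrast, assuming a bounds-checked replacement (the helper name is mine, not McGraw's; `gets()` was later removed from the language in C11 for exactly this reason):

```c
#include <stdio.h>
#include <string.h>

/* UNSAFE, K&R-era pattern:
 *
 *   char buf[64];
 *   gets(buf);    // the caller, i.e. the attacker, decides how
 *                 // many bytes get written -- a buffer overflow
 *
 * Safer: fgets() takes the buffer size and never writes past it. */
int read_line(char *buf, size_t size, FILE *in) {
    if (fgets(buf, (int)size, in) == NULL)
        return -1;                      /* EOF or read error */
    buf[strcspn(buf, "\n")] = '\0';     /* strip trailing newline */
    return 0;
}
```

Input longer than the buffer is simply truncated rather than overwriting adjacent memory.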

When it comes to operating systems, “No single one is good. And using an open-source OS is not the key to security, because no OS was designed for security,” he said. What about Web services? “SOAP is an antisecurity device,” he admonished the audience.

But “avoiding stupid languages,” in McGraw’s parlance, won’t eliminate the problems. “Software security is currently nobody’s job,” he said. In addition to technological and sociological flaws, many bugs in implementation and architecture are avoidable. Buffer overflows, failing open rather than safe, dangerous environment variables, unsafe system calls and untrusted input are among the common coding errors. Architectural mishaps include the misuse of cryptography, catastrophic security failures and type safety confusion.
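One item on that list, failing open rather than safe, is easy to sketch: when a security check errors out unexpectedly, the insecure version grants access by default. A hedged illustration, with result codes and function names invented for this example:

```c
#include <stdbool.h>

/* Hypothetical outcomes of some credential check. */
enum auth_result { AUTH_OK, AUTH_DENIED, AUTH_ERROR };

/* Failing OPEN: an unexpected error (database down, timeout) is
 * treated the same as success -- the avoidable bug on McGraw's list. */
bool allow_fail_open(enum auth_result r) {
    return r != AUTH_DENIED;        /* AUTH_ERROR slips through */
}

/* Failing SAFE (closed): only an explicit success grants access;
 * anything unexpected denies by default. */
bool allow_fail_safe(enum auth_result r) {
    return r == AUTH_OK;
}
```

The two functions agree on the normal cases and differ only when something goes wrong, which is precisely when an attacker will probe.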

If software development is security’s stepchild, architectural analysis is its second cousin thrice removed. To bring architecture back into the nuclear family, McGraw advises drafting a one-page design model, using hypothesis testing to categorize risks, brainstorming attack patterns, ranking risks and tying them to the business context. Finally, propose fixes and repeat.

To think like an attacker, developers should test security functionality, cover nonfunctional requirements, do risk-based testing and, possibly, engage in informed red-teaming.

Ultimately, much in security depends on the concept of trust. “Keep trust to yourself,” advised McGraw. “Error messages sometimes give away way too much information. Error reporting is not a debugging tool; use logging for that.” Clever attackers often combine error reporting and untrusted input to explore possible ways to enter a system. Misinformation and honey pots can help in these situations, though. “Have your Linux box lie and say it’s a BSD box, then sit back and watch all the BSD hacks. This is good until you forget what your box actually is.”
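McGraw's advice to keep error detail out of user-facing messages can be sketched as: return one generic message to the caller and send the specifics to a log. The function and message strings below are invented for illustration, not taken from the talk:

```c
#include <stdio.h>

/* Hypothetical login check. The caller -- and thus the attacker --
 * sees the same generic string whether the username was unknown or
 * the password wrong, so error output can't be used to enumerate
 * accounts. The distinction goes to the log, where it still helps
 * operators debug. */
const char *login_message(int user_exists, int password_ok, FILE *log) {
    if (user_exists && password_ok)
        return "Welcome.";
    if (!user_exists)
        fprintf(log, "login failed: unknown user\n");
    else
        fprintf(log, "login failed: bad password\n");
    return "Invalid username or password.";  /* same text either way */
}
```

Splitting the audiences this way keeps error reporting from doubling as the attacker's reconnaissance tool while logging preserves the debugging detail.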

The most important point? Unlike his hapless, hungry flight crew, “Assume nothing.” And “Don’t trust security people. If they’re giving a talk on security, it’s all lies,” McGraw laughed. The field is quickly becoming not only smart, but lucrative, as users finally begin to demand flawless security from their feature-packed software.
