
How Much Proof Do You Need?

February 2002: Agile Modeling

Adopting and then tailoring a software process to meet your team's needs is an important and difficult decision, one that is critical to your project's success. You face myriad choices, ranging from prescriptive methods such as the Rational Unified Process (RUP) and the OPEN process to agile methods such as Extreme Programming (XP), SCRUM and the Dynamic Systems Development Method (DSDM). A wealth of material has been written about these methodologies, but how can you tell whether a methodology will work for you? In particular, how do you determine whether an agile development methodology will work in your environment?

If you spend any time online reading process-oriented resources, such as the XP mailing list hosted by Yahoo! groups or the comp.object newsgroup, you've probably noticed the common refrain of the detractors of agile methodologies: "Where's the proof?" On the surface, this seems to be a reasonable question, because many developers want assurances that these new approaches do, in fact, work. However, I feel that these people are barking up the wrong tree for several reasons:

The proof-hunters should remember that, first and foremost, agile software development methodologies are new:

1. SCRUM and DSDM, defined in the mid-1990s, are among the oldest agile methods.
2. XP, arguably the most popular of the agile processes, was first described in the late 1990s as a collection of process/organizational patterns and was published as a book in 2000.
3. The Agile Alliance was loosely formed in the spring of 2001.
4. Agile Modeling (AM) was first defined in the fall of 2000 and will be published as a book in late February 2002.

Because agile methods are so new, there simply hasn't been sufficient time to prove that they work in a wide variety of situations. Yes, each methodology has excellent anecdotal evidence of its efficacy, but statistical proof provided by a detailed study doesn't yet exist, and probably won't for several more years. Some research results—for example, the effectiveness of pair programming—have been examined in detail, and the elements of iterative and incremental approaches are reasonably well known. However, we're going to have to wait a while for metrics concerning agile software development methods.

From Laggards to Innovators
Despite the attraction of "documented" efficiency, it's not realistic to require "proof" that a method works before taking the plunge. In his book, Crossing the Chasm (Harper Business, 1999), noted management consultant Geoffrey Moore describes five types of technology adopters: "innovators," who pursue new concepts aggressively; "early adopters," who pursue new concepts very early in the lifecycle; the "early majority," who wait and see before buying into a new concept; the "late majority," who are concerned about their ability to handle a new concept should they adopt it; and "laggards," who simply don't want anything to do with new approaches. People who fit the innovator or early adopter profile are comfortable reading a Web page or book describing agile techniques; they'll think about the new concepts and then tailor them to their environment. This is the lifecycle stage where most agile techniques currently reside, and these types of organizations are the ones most readily adopting the new techniques. The early majority is waiting for sufficient anecdotal evidence before making the jump to agile software development. When such evidence is presented at conferences such as XP 2002 and XP/Agile Universe, I suspect that more of these groups will take the leap over the next year or so. Unfortunately, the question "Where is the proof?" is typically asked by organizations that fit the late majority or even laggard profiles. Because agile techniques have clearly not yet reached that stage of adoption, I believe that this question simply isn't a fair one. However, though they're never on the "bleeding edge" of innovation, many organizations have great success with this "watch and wait" approach.

Setting the Bar
I'm afraid that we've been spoiled: Previous work in the software metrics field may have set the "proof bar" too high. In his book, Software Assessments, Benchmarks, and Best Practices (Addison-Wesley, 2000), Capers Jones, chairman of Software Productivity Research, presents a plethora of information pertaining to software development techniques and technologies, gathered over decades. In Table 5.4 of that book, Jones documents the effectiveness of various development techniques. With a 350% adjustment factor, reuse of high-quality deliverables is rated the "most effective" technique, whereas quality estimating tools provide a 19% adjustment factor, and use of formal inspections garners a 15% adjustment factor. Table 5.5 goes on to list negative adjustment factors. For example, crowded office space earns a -27% adjustment factor, no client participation a -13% adjustment factor, and geographic separation a -24% adjustment factor.

As these statistics indicate, current best and worst practices have been examined in detail, but the results of newly proposed techniques such as refactoring and co-location with customers have not yet been adequately analyzed. To compound this problem, the majority of agile methodologists are practitioners who are actively working on software projects and who presumably have little time to invest in theoretical study. It may be a long time until we see agile studies comparable to Jones' analyses of traditional techniques.

What One Needs to Know
It may not be clear what actually remains to be proven. In Chapter 3 of his upcoming book, Agile Software Development Ecosystems (Addison-Wesley, 2002), Jim Highsmith observes: "Agile approaches excel in volatile environments in which conformance to plans made months in advance is a poor measure of success. If agility is important, then one of the characteristics we should be measuring is that agility. Traditional measures of success emphasize conformance to predictions (plans). Agility emphasizes responsiveness to change. So there is a conflict because managers and executives say that they want flexibility, but then they still measure success based on conformance to plans. Wider adoption of agile approaches will be deterred if we try to 'prove' that agile approaches are better using only traditional measures of success."

Perhaps the proof-hunters' motivations aren't appropriate. Are they really interested in finding an effective process, or are they merely looking for a reason to disparage an approach that they aren't comfortable with? Are they realistic enough to recognize that no software process is perfect—that there is no silver bullet to be found? Are they really interested in proof that something works, or simply seeking an assurance of perceived safety?

Significant anecdotal evidence currently indicates that agile software development techniques work, but there's very little statistically valid proof, and this situation won't change anytime soon. If anecdotal evidence isn't sufficient, then agile software development processes aren't for you ... yet!

I'd like to thank David M. Rubin, Tim Bond, Dale Emery, Remy Fannader, Adam Geras, Michael Krige, Rob Lineberger, Ed Manley and Dave Thomas for their thoughtful comments regarding this issue posted on the Agile Modeling mailing list.

Recently on the Agile Modeling Mailing List:

Package diagrams. A discussion ensued when someone pointed out that package diagrams were not included in my list of modeling artifacts for agile modelers. Object modelers commonly create package diagrams—UML diagrams consisting only of packages—to serve as high-level overviews of their systems. The two diagram types most commonly organized this way are UML class diagrams and UML use case diagrams, although UML deployment diagrams can be organized this way as well. A package can appear on any UML diagram to gather similar elements together.

Class Type Architecture. A discussion about the five-layer class-type architecture began after I posted a white paper on the subject. The five layers are User Interface, Controller/Process, Business/Domain, Persistence and System. The discussion focused on the importance of separation of concerns within your design, robustness and performance. A side discussion pertaining to the modeling implications of each layer motivated me to add a new table to the white paper listing suggested types of modeling artifacts for each layer. As you would expect, there were differences between the layers, suggesting that people need different skill sets to work on each layer.
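The layering described above can be sketched in a few classes. This is a minimal illustration, not code from the white paper: the customer-registration scenario and all class and method names below are hypothetical, chosen only to show how each layer depends solely on the layer(s) beneath it.

```python
class SystemLayer:
    """System layer: infrastructure concerns such as logging."""
    def log(self, message: str) -> None:
        print(f"LOG: {message}")


class PersistenceLayer:
    """Persistence layer: hides the storage mechanism from the domain.
    An in-memory dict stands in for a real database here."""
    def __init__(self) -> None:
        self._store: dict[int, str] = {}

    def save(self, key: int, value: str) -> None:
        self._store[key] = value

    def load(self, key: int) -> str:
        return self._store[key]


class BusinessLayer:
    """Business/domain layer: enforces business rules, delegates
    storage and infrastructure to the layers below it."""
    def __init__(self, persistence: PersistenceLayer,
                 system: SystemLayer) -> None:
        self._persistence = persistence
        self._system = system

    def register_customer(self, customer_id: int, name: str) -> None:
        if not name:
            raise ValueError("customer name is required")
        self._persistence.save(customer_id, name)
        self._system.log(f"registered customer {customer_id}")


class ControllerLayer:
    """Controller/process layer: coordinates domain objects to
    carry out a use case."""
    def __init__(self, business: BusinessLayer) -> None:
        self._business = business

    def handle_registration(self, customer_id: int, name: str) -> str:
        self._business.register_customer(customer_id, name)
        return f"Welcome, {name}!"


class UserInterfaceLayer:
    """UI layer: knows only about the controller, never about
    persistence or system details."""
    def __init__(self, controller: ControllerLayer) -> None:
        self._controller = controller

    def submit_form(self, customer_id: int, name: str) -> str:
        return self._controller.handle_registration(customer_id, name)


# Wire the layers together, top-down:
ui = UserInterfaceLayer(
    ControllerLayer(BusinessLayer(PersistenceLayer(), SystemLayer())))
print(ui.submit_form(1, "Ada"))
```

Because each layer receives the layer beneath it through its constructor, any one of them can be swapped out (say, a real database behind PersistenceLayer) without touching the layers above—which is the separation of concerns the discussion centered on.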

The Next Level of Abstraction. A heated discussion regarding the modeling concept of abstraction began over the New Year holiday when someone questioned what the next level of abstraction would be from high-level programming languages such as Java and Visual Basic in current use. Previous contributors had posited that the Unified Modeling Language (UML) and Model Driven Architecture (MDA) promoted by the Object Management Group (OMG) didn't raise the level of abstraction from programming languages as much as did C over Assembly Language in the past. The use of UML for requirements, analysis and design was discussed, as was the concept of abstraction for all three.


Agile Alliance Home Page.

Agile Modeling Mailing List.

Artifacts for Agile Modeling: The UML and Beyond.
This page provides an overview of a collection of modeling artifacts, indicating how to apply them effectively for agile modeling.

Crossing the Chasm 2/e by Geoffrey A. Moore, eBook edition.
This book is also available in printed form.

DSDM Consortium.
The DSDM Consortium is a nonprofit organization dedicated to defining, promoting and continuously evolving its de facto worldwide standard for developing business solutions within tight timeframes. DSDM provides a nonproprietary approach for ensuring that solutions incrementally meet the needs of the business.

Kirk Knoernschild's Principles and Patterns Page.
At this page, you'll find several white papers describing basic object design techniques, including those pertaining to package design.

The OPEN Web Site.

Software Productivity Research.
