The Non-Existent Software Crisis: Debunking the Chaos Report


What I find very interesting is how the Agile and Lean philosophies toward time, money, value, and quality are given greater importance than traditional ways of looking at those success factors. For example, when it comes to time/schedule, 58% (16% + 42%) want to deliver on time according to schedule, whereas 81% (39% + 42%) believe in delivering when the solution is ready to be shipped. With respect to budget/ROI, 36% of respondents want to deliver on or under budget, whereas 83% want to provide the best ROI. For stakeholder value, 14% believe in building the system to specification, compared with the 96% who prefer to meet the actual needs of stakeholders. When it comes to product quality, 44% want to deliver on time and on budget, whereas 90% want to deliver high-quality, easy-to-maintain systems.

The Software Crisis that Wasn't

For years, I have actively questioned the Chaos Report findings in print, in seminars, and in conference presentations. You don't need to do a detailed study as we've done here; you need only take your brain out of neutral and think for yourself. If the Chaos Report's claim of a one-third success rate at software development were true, then roughly half of all organizations would have to have a success rate below one third, and roughly half a rate above it. How many organizations have you seen with a success rate below one third? Any? I get around; at one point in my career I was visiting 20+ organizations a year, and I've never run into one with such an abysmal track record.
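To make that back-of-the-envelope argument concrete, here is a minimal simulation sketch. It is purely illustrative, not from the study, and it assumes per-organization success rates are spread roughly symmetrically around the claimed industry-wide average of one third:

# Purely illustrative: if the industry-wide average success rate really
# were one third, and per-organization rates were spread roughly
# symmetrically around that average, about half of all organizations
# would have to sit below the one-third mark.
import random

random.seed(42)

AVERAGE = 1 / 3      # the Chaos Report's claimed success rate
SPREAD = 0.15        # assumed spread between organizations
NUM_ORGS = 10_000    # number of simulated organizations

# Draw one success rate per organization, clamped to the range [0, 1].
rates = [min(max(random.gauss(AVERAGE, SPREAD), 0.0), 1.0)
         for _ in range(NUM_ORGS)]

below = sum(1 for r in rates if r < AVERAGE)
print(f"Organizations below a one-third success rate: {below / NUM_ORGS:.0%}")
# Prints roughly 50%; yet organizations with such abysmal track records
# are rarely, if ever, observed in practice.

Under that assumption, roughly half of the organizations you visit should have a below-one-third track record, and that simply does not match what anyone sees in the field.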

Why are the results of this study noticeably more positive than what is published in the Chaos Report? It's in the way that success is defined. The Chaos Report has never measured the success rate of software development projects. Ever. What it measures is whether project teams are reasonably on time, on budget, and building something to specification. To be fair, the Standish Group has always been very clear about this, but it has unfortunately also marketed these figures as if they were a valid standard measure of the success of software development teams. As our study found, only a very small minority of teams seem to use these as their success criteria.

Instead of inflicting an artificial definition of success on software development teams, this study asked people to rate their projects in terms of the success criteria that actually apply to them. Just as important, we also explored how people define success in the first place and, not surprisingly, found that every combination of the four categories of criteria we asked about (2⁴ = 16 in this case) was valued by someone: sixteen ways to define success, not just one. Had we also looked into team morale, something I intend to do in the next study, I suspect we'd find all 2⁵ = 32 combinations. For example, this study found that only 25% of respondents valued both being on time and within budget on software development projects, and only 8% valued being on time, on budget, and to specification. By judging teams with criteria that are applicable in only one-in-twelve situations, is it any wonder that the Chaos Report's results are so far off? The claims of the existence of a "software crisis" were built on an ill-formed definition of success. Get the definition right and, poof, no more software crisis.
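For readers who want to check the arithmetic, here is a small illustrative snippet (the category names are my paraphrases of the survey's categories, not identifiers from the study) that enumerates the combinations:

# Illustrative only: with two competing philosophies (traditional vs.
# Agile/Lean) for each of the four criteria categories, there are
# 2**4 = 16 possible combinations, i.e., sixteen ways to define success.
from itertools import product

categories = ["time/schedule", "budget/ROI", "stakeholder value", "quality"]
philosophies = ["traditional", "agile/lean"]

combos = list(product(philosophies, repeat=len(categories)))
print(len(combos))  # 2**4 = 16

# A fifth category, such as team morale, would double the count.
print(2 ** (len(categories) + 1))  # 2**5 = 32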

To understand why this misinformation is still being spread, I'll ask several rhetorical questions: Why would vendors of expensive tools want you to believe that there is a software crisis? Why would expensive software consultants plying their services want you to believe that there is a software crisis? Why would research organizations selling expensive studies want you to believe that there is a software crisis? Why would public speakers pitching their strategies want you to believe that there is a software crisis?

We may never have definitive answers to these questions, but we should at least consider them.

The good news is that the Standish Group seems to know that its success rate claims are problematic. It appears to be rehabilitating itself by publishing value-based measures alongside its "on time, on budget, to spec" measures. Perhaps giving people a choice will help them to take their traditional blinders off; time will tell. Regardless, whenever someone quotes the Chaos Report's claim that only a third of software development teams are successful, please challenge them. At some point, we really do need to stand up and "call BS."

Conclusion

The results from the 2013 IT Project Success Rates Survey, including the questions as asked, the data as answered, and my analysis, are available free of charge. At Comparing Software Development Paradigms, I've shared some infographics summarizing the survey results that you're welcome to use.

You can read about our previous success rate studies at:

Finally, I should note that the Standish Group does in fact have some interesting and valuable insights to share regarding the effectiveness of various IT strategies.

February 2014 State of the IT Union Survey

I invite you to fill out the February 2014 State of the IT Union Survey. The goal of this ongoing survey series is to find out what IT professionals are actually doing in practice. The survey should take you less than five minutes to complete, and your privacy will be completely protected.

The results of this survey will be summarized in a forthcoming article. This is an open survey, so the source data (without identifying information to protect your privacy), a summary slide deck, and the original source questions will be posted so that others may analyze the data for their own purposes. Data from previous surveys has been used by university students and professors for research, and I hope the same will be true of the data from this survey. The results from several other surveys are already posted, so please feel free to take advantage of this resource.

Surveys such as this are not only a great way for you to compare your team with the rest of the industry, but also an opportunity to give back to the community by sharing your experience.


Scott Ambler is a long-time contributor to Dr. Dobb's.

