The Non-Existent Software Crisis: Debunking the Chaos Report


The software development success rates published in the Standish Group's Chaos Report are among the most commonly cited research findings in the IT industry. Since 1995, the Standish Group has reported rather abysmal statistics — from a rate of roughly one-in-six projects succeeding in 1995 to roughly one-in-three projects today. Ever since the Chaos Report was published, various Chicken Littles have run around warning about how this "software crisis" will lead to imminent disaster. However, this supposed "software crisis" is complete and utter hogwash; it always has been, and I suspect it always will be. Sadly, that doesn't stop people who should know better, or at least should be able to think for themselves, from continuing to quote this nonsense.

Since 2006, I have organized an almost-annual survey for Dr. Dobb's that explores the actual success rates of software development projects. The most recent was conducted in November and December of 2013 and had 173 respondents. The original questions as asked, the source data, and my analysis can be downloaded for free at 2013 IT Project Success Rates Survey results. The survey was announced on the Dr. Dobb's site, the Ambysoft announcements list, my Twitter feed, and several LinkedIn discussion forums. The results of this study are much more positive than what the Standish Group claims. They still leave significant room for improvement, but they certainly don't point to a crisis.

The success rates by development paradigm are summarized in Table 1. As you can see, all paradigms (even an Ad hoc approach) fare much better than a one-in-three success rate. In this study, we asked people to judge success based on criteria that were actually applicable to the team at the time. In this case, a project is considered:

  • Successful if a solution has been delivered and it met its success criteria within a range acceptable to the organization;
  • Challenged if a solution was delivered, but the team did not fully meet all of the project's success criteria within acceptable ranges (for example, the quality was fine, the project was pretty much on time, but ROI was too low);
  • Failed if the team did not deliver a solution.

Paradigm       Successful   Challenged   Failed
Lean               72%          21%         7%
Agile              64%          30%         6%
Iterative          65%          28%         7%
Ad hoc             50%          35%        15%
Traditional        49%          32%        18%

Table 1: Software development success rates by paradigm.

Each paradigm was well-defined. Respondents were first asked if their organization had any project teams following a given paradigm, and then what percentage of projects were successful, challenged, or failed. A weighted average for each level of success was calculated. A selection of 91-100% was considered to be 95%, 81-90% as 85%, and so on. A selection of 0 was considered as 0, and answers of "Don't Know" were not counted. I then had to normalize the values because the weighted averages didn't always add up to 100%. For example, the weighted averages may have been 60%, 30%, and 20% for a total of 110%. To normalize the values, I divided each by the total, reporting 55% (60/110), 27% (30/110), and 18% (20/110).
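
To make that arithmetic concrete, here is a minimal sketch in Python of the bucket-to-value mapping and the normalization step. It is my own illustration rather than the survey's actual analysis script, and the intermediate bucket labels are assumptions that simply follow the "91-100% counts as 95%" pattern described above.

```python
# Illustration only: a sketch of the scoring arithmetic described above,
# not the survey's actual analysis code.

# Each range answer counts as its representative value (91-100% -> 95,
# 81-90% -> 85, and so on down to 0); "Don't Know" answers are dropped.
BUCKET_VALUES = {
    "91-100%": 95, "81-90%": 85, "71-80%": 75, "61-70%": 65, "51-60%": 55,
    "41-50%": 45, "31-40%": 35, "21-30%": 25, "11-20%": 15, "1-10%": 5, "0": 0,
}

def weighted_average(answers):
    """Average the representative values for one outcome (e.g. 'successful')."""
    values = [BUCKET_VALUES[a] for a in answers if a in BUCKET_VALUES]
    return sum(values) / len(values) if values else 0.0

def normalize(successful, challenged, failed):
    """Scale the three averages so they total 100%."""
    total = successful + challenged + failed
    return [round(100 * v / total) for v in (successful, challenged, failed)]

# The example from the text: raw averages of 60%, 30%, and 20% (110% total)
print(normalize(60, 30, 20))  # -> [55, 27, 18]
```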

Defining the Paradigms

The paradigms in this survey were defined as follows:

  • Ad hoc. On an Ad hoc software development project, the team does not follow a defined process.
  • Iterative. On an iterative software development project, the team follows a process that is organized into periods often referred to as iterations or time boxes. On any given day of the project, team members may be gathering requirements, doing design, writing code, testing, and so on. An example of an iterative process is RUP. Agile projects, which are defined as iterative projects that are performed in a highly collaborative and lightweight manner, are addressed later.
  • Agile. On an Agile software development project, the team follows an iterative process that is also lightweight, highly collaborative, self-organizing, and quality focused. Examples of Agile processes include Scrum, XP, and Disciplined Agile Delivery (DAD).
  • Traditional. On a traditional software development project, the team follows a staged process where the requirements are first identified, then the architecture/design is defined, then the coding occurs, then testing, then deployment. Traditional processes are often referred to as "waterfall" or simply "serial" processes.
  • Lean. Lean is a label applied to a customer-value-focused mindset/philosophy. A Lean process continuously strives to optimize value to the end customer, while minimizing waste (which may be measured in terms of time, quality, and cost). Ultimately, the Lean journey is the development of a learning organization. Examples of Lean methods/processes include Kanban and Scrumban.

Ad hoc development (no defined process) and the Traditional approach had statistically the same levels of success in practice. Our previous studies have also shown this result. However, when you take team size into account, Ad hoc is superior to Traditional strategies for small teams, yet the opposite is true for large teams. Agile and Iterative strategies had similar results on average, which we've also found to be true in the past, regardless of team size. For the first time, Lean strategies for software development, supported by both Kanban and the DAD framework, were notably more successful than the other four paradigms we explored. Food for thought.

Time to Get Lean?

For each paradigm, we also looked at several potential success factors. The questions we asked were:

  • Time/Schedule. When it comes to time/schedule, what is your experience regarding the effectiveness of [paradigm] software development teams?
  • Budget/ROI. When it comes to effective use of return on investment (ROI), what is your experience regarding the effectiveness of [paradigm] software development teams?
  • Stakeholder Value. When it comes to ability to deliver a solution that meets the actual needs of its stakeholders, what is your experience regarding the effectiveness of [paradigm] software development teams?
  • Product Quality. When it comes to the quality of the system delivered, what is your experience regarding the effectiveness of [paradigm] software development teams?

For each success factor, respondents were given options of Very Effective, Effective, Neutral, Ineffective, Very Ineffective, and Not Applicable. For each question, we calculated a weighted average by multiplying the answers by 10, 5, 0, -5, and -10, respectively (answers of "Not Applicable" were ignored). This lets us compare the relative effectiveness of each paradigm by success factor, which you can see in Table 2. It's interesting to note that Lean approaches were perceived to provide better results on average than the other paradigms. Also, the three development paradigms of Lean, Agile, and Iterative were significantly better across all four success factors than either Ad hoc or Traditional (waterfall). In the vast majority of cases, only cultural inertia can justify a Traditional approach to software development these days.
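
As a rough sketch of how those effectiveness scores are produced, the following Python snippet applies the weights described above (Very Effective = 10 down to Very Ineffective = -10, with Not Applicable ignored). The function and the sample responses are my own illustration, not the survey's analysis code or data.

```python
# Sketch of the effectiveness scoring described above (my own illustration).
# Very Effective = 10, Effective = 5, Neutral = 0, Ineffective = -5,
# Very Ineffective = -10; "Not Applicable" answers are ignored.
WEIGHTS = {
    "Very Effective": 10,
    "Effective": 5,
    "Neutral": 0,
    "Ineffective": -5,
    "Very Ineffective": -10,
}

def effectiveness_score(answers):
    """Weighted average of the ratings for one paradigm/success-factor pair."""
    scored = [WEIGHTS[a] for a in answers if a in WEIGHTS]  # drops "Not Applicable"
    return sum(scored) / len(scored) if scored else 0.0

# Hypothetical responses for one question (not real survey data):
sample = ["Very Effective", "Effective", "Effective", "Neutral", "Not Applicable"]
print(effectiveness_score(sample))  # -> 5.0
```

A score of 5 corresponds to an average answer of "Effective," which is roughly where the Lean, Agile, and Iterative rows land in Table 2.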

Paradigm       Time/Schedule   Budget/ROI   Stakeholder Value   Product Quality
Lean                5.7            6.0             5.5               4.8
Agile               4.3            4.2             5.6               3.8
Iterative           4.9            4.2             5.6               3.8
Ad hoc              0.0           -0.4             1.9              -1.4
Traditional        -0.7            0.5             0.1               1.9

Table 2: The effectiveness of each software development paradigm.

How Is Success Defined?

The survey also explored how people define success, which I argue is exactly where the Chaos Report falls apart with respect to measuring project success rates. Similar to what I found in the 2010 study, there was a robust range of opinions when it came to defining success:

  • Time/schedule: 16% prefer to deliver on time according to the schedule, 39% prefer to deliver when the system is ready to be shipped, and 42% say both are equally important
  • Budget/ROI: 13% prefer to deliver within budget, 60% prefer to provide good return on investment (ROI), and 23% say both are equally important
  • Stakeholder value: 4% prefer to build the system to specification, 86% prefer to meet the actual needs of stakeholders, and 10% say both are equally important
  • Product quality: 10% prefer to deliver on time and on budget; 56% prefer to deliver high-quality, easy-to-maintain systems; and 34% say both are equally important

