

How Successful Are IT Projects, Really?

How is Success Actually Defined?

The survey once again explored how people define success, although we evolved the questions this year to be more inclusive. In previous years, for each success criterion, we asked people to choose between two options. For time to market, for example, we asked whether they preferred to deliver on time according to the schedule or to deliver when the solution is actually ready to be shipped. This year we offered the same choices but added a third option indicating that both were equally important. I made this change based on feedback from several readers who felt the former binary question wasn't fair.

Here's how respondents, on average, define success:

  1. Time/schedule: 20% prefer to deliver on time according to the schedule, 26% prefer to deliver when the system is ready to be shipped, and 51% say both are equally important.
  2. Return on investment (ROI): 15% prefer to deliver within budget, 60% prefer to provide good return on investment (ROI), and 25% say both are equally important.
  3. Stakeholder value: 4% prefer to build the system to specification, 80% prefer to meet the actual needs of stakeholders, and 16% say both are equally important.
  4. Quality: 4% prefer to deliver on time and on budget, 57% prefer to deliver high-quality systems that are easy to maintain, and 40% say both are equally important.

My experience is that you should prefer to ship something only when it's ready (as 77% of respondents do), to provide a good ROI (85% of respondents), to meet the actual needs of stakeholders (96% of respondents), and to deliver high-quality systems (97% of respondents). Of the survey respondents, a full 67% (119 out of 178) also feel this way. I prefer to look at things from the point of view of stakeholders: It's doubtful they want you to ship something on time that isn't ready, that they want you to waste their money, that they want solutions that don't meet their needs, or that they want poor-quality solutions.
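The combined figures above follow directly from the per-criterion splits reported earlier: for each criterion, add the respondents who chose the outcome-focused option to those who said both options were equally important. A minimal sketch of that arithmetic (the dictionary layout and labels are my own illustration, not part of the survey data):

```python
# Per-criterion splits from the survey results:
# (outcome-focused %, "both equally important" %)
criteria = {
    "time/schedule":     (26, 51),  # ship when ready
    "ROI":               (60, 25),  # good return on investment
    "stakeholder value": (80, 16),  # meet actual stakeholder needs
    "quality":           (57, 40),  # high-quality, maintainable systems
}

# Combined preference = outcome-focused + "both equally important"
for name, (preferred, both) in criteria.items():
    print(f"{name}: {preferred + both}%")

# The 67% figure is simply 119 respondents out of 178
print(f"all four: {119 / 178:.0%}")
```

Running this reproduces the 77%, 85%, 96%, and 97% combined figures, and shows that 119/178 rounds to 67%.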

Having said that, each combination of success criteria was chosen by at least one respondent. This is a clear indication that there is, in fact, a wide range of definitions for project success being applied within IT organizations worldwide. As I indicated earlier, it would be clearly inappropriate for me to inflict a single definition of project success on respondents, contrary to other studies. When it comes to relatively on time, relatively on budget, and relatively to specification, only 12% of respondents chose all three of these values when defining success.

How Do the Paradigms Compare?

An interesting result of this series of surveys is that the differences in the success rates are modest and certainly don't reflect the histrionic rhetoric from some paradigm advocates. In 2008/2009 and again this year, we asked how each paradigm fared against each of the four success criteria. For all four success criteria, agile/iterative approaches did statistically better than either traditional or ad hoc, as happened in the 2008/2009 survey. Furthermore, the areas where traditional was better than ad hoc, and vice versa, were the same this year as in the previous survey. A significant difference this year is that agile appears to have pulled ahead of iterative a bit. Let's explore each success criterion individually:

  1. Time/schedule: Iterative and agile had the same score when it came to delivering a solution in a timely manner, very likely the result of their incremental approach to development. Traditional and ad hoc both fared poorly with statistically the same score, very likely the result of both paradigms pushing risky activities such as system integration to the end of the lifecycle.
  2. Return on investment (ROI): The survey showed that agile is better suited than iterative approaches when it comes to spending stakeholder money wisely, likely the result of the greater focus on streamlining development activities and on quality in agile development. Both traditional and ad hoc approaches did measurably worse when it comes to delivering ROI and were statistically the same. In the case of traditional, I suspect this was the result of the greater bureaucratic overhead; and in the case of ad hoc, the result of additional rework from not thinking things through well enough.
  3. Stakeholder value: Agile did a bit better than iterative approaches at delivering stakeholder value; both paradigms take an iterative and incremental approach, but the difference may be the result of agile strategies stressing active stakeholder involvement more than iterative methods. Ad hoc strategies trailed behind agile and iterative, but were more effective than traditional strategies at delivering stakeholder value. This is something that the 2008/2009 survey also showed and is yet more evidence questioning the value of detailed specifications prevalent in traditional projects.
  4. Quality: Agile and iterative approaches had statistically the same result, as they did in 2008/2009, supporting those paradigms' strategies of testing throughout the lifecycle. I was expecting agile methods to do a bit better at delivering quality than iterative methods due to their greater focus on quality-assurance techniques (such as regression testing, continuous integration, and refactoring), but once again, the survey didn't find this to be the case. As you'd expect, traditional methods fared better with respect to quality than ad hoc approaches; although testing late in the lifecycle is expensive, it does seem to have some benefit.

Where Do We Go from Here?

Once again, this survey has shown that agile and iterative software development methods appear to be more successful than either traditional or ad hoc strategies. This is consistent with previous surveys and is likely part of the explanation for the general trend in the IT community toward adopting agile strategies. The survey also found, yet again, that we don't value the on-time, on-budget, and to-specification mantra that is prevalent in a lot of the IT literature. I invite other methodologists to step back and observe this.

I will definitely continue to run future IT project success surveys. I will also run other surveys, some of which look at specific scaling factors and their effect on project success. Although I haven't found it yet, I have no doubt that there are some situations where traditional approaches have an average higher success rate than agile approaches. Previous surveys have explored the effect of team size and geographic distribution, for example, with agile and iterative doing as well as or better than traditional regardless of size or distribution. I haven't yet explored the effect of regulatory compliance, domain complexity, or technical complexity on the success rates of the various paradigms. Stay tuned for future results, and please consider filling out my surveys whenever you're invited to. Thanks in advance.


The results of the 2011 IT Project Success survey are available online free of charge, including the original questions, the source data, and a summary slide deck.

My analysis of the 2010 IT Project Success Rate Survey is captured in my Dr. Dobb's article sneakily entitled 2010 IT Project Success Rates.

The 2008/2009 IT Project Success Rates Survey, which ran in late 2008 and early 2009, is summarized in Software Development Success Rates.

I wrote up the results of the 2007 IT Project Success Rates Survey in Defining Success.

You may find my Bureaucracy Isn't Discipline blog post to be an interesting comparison of traditional and agile philosophies.

The Surveys Exploring the Current State of Information Technology Practices page links to the results of all the Dr. Dobb's surveys which I've run over the years.

My Agility@Scale blog discusses strategies for adopting and applying agile strategies in complex environments.

The Disciplined Agile Delivery (DAD) process framework is a hybrid approach that is people first, goals driven, enterprise aware, and addresses the full delivery lifecycle. There's still a chance to review work-in-progress chapters for the forthcoming book.

Please consider following Scott Ambler on Twitter.

Scott Ambler is the Chief Methodologist for Agile and Lean at IBM Rational.
