Dr. Dobb's is part of the Informa Tech Division of Informa PLC

How Agile Are We Really?

Scott W. Ambler is Chief Methodologist for Agile and Lean, for IBM Rational


For several years now I've been running surveys, both for Dr. Dobb's and for myself, to determine what's really going on out there. Many of these surveys focus on agile development, including how successful agile project teams are in comparison to teams following other development paradigms. Although I provide a definition for each paradigm in these surveys, they still rely on respondents to identify which of their teams are agile and which are not. But are these so-called agile teams really agile? This sounds like a simple question, but because there is no agreed-upon definition of agile, nor agreed-upon criteria for determining whether a team is agile, it proves to be quite daunting. So I thought I should look into it via a new survey.

I ran the survey during the last week of July 2010 and the first week of August. There were 293 respondents: 33% indicated that they were programmers or agile team members and 39% were managers or team leads/Scrum Masters. 64% had 10+ years in IT, 15% worked in organizations with 500 or more IT people, and 39% lived in North America, 29% in Europe, and 12% in Asia-Pacific countries. A link to the survey was posted on my IT Surveys home page at www.ambysoft.com/surveys/ and announced on my Twitter feed, my mailing list ([email protected]), the Agile Alliance discussion group on LinkedIn, and the Yahoo discussion groups for test-driven development (TDD), agile modeling, and agile databases. As usual, this is an open survey: you can download the source data, the original questions, and a summary slide deck free of charge from the same page.

How do you judge the "agileness" of a team, particularly when there is no real consensus over what it means to be agile to begin with? One strategy would be to look at the practices the team follows. The problem is that there are numerous agile practices, numerous agile methods, and numerous terms used for similar concepts. Another strategy is to determine whether a team is achieving certain goals or addressing certain issues. The problem then becomes one of determining which goals and issues are agile and whether they're truly being fulfilled. A third strategy is to combine the first two, which is exactly what I've done over the past few years with the following five criteria:

  • Validation. Developers on agile teams validate their work to the best of their ability throughout the lifecycle. They will at least perform their own continuous regression testing and better yet take a test-driven approach to development. Granted, they may still work with independent test teams, particularly in complex situations or in regulatory environments.
  • Self organization. Agile development teams are self-organizing, which means that the team members themselves plan and estimate their own work. Better yet, disciplined agile delivery teams also work within an appropriate governance framework that guides their decisions to the betterment of their organization.
  • Value. Disciplined agile teams provide a consumable solution on a regular basis that provides value to their stakeholders. Notice how this statement differs from Scrum's goal of "potentially shippable software" every sprint. First, "consumable" is more robust than "potentially shippable" because it also takes issues such as usability and timeliness into account. Second, "solution" is also more robust because agile delivery teams typically work on more than just software: they also deliver updates to hardware, they often change the business process around the usage of the system, and they may even affect the organization structure of the end users.
  • Stakeholder involvement. Agile delivery teams work closely with their stakeholders, or a stakeholder proxy such as a product owner, on a regular (ideally daily) basis.
  • Continuous improvement. Agile teams will regularly reflect on how they work together and then act to improve on these reflections in a timely manner. Disciplined agile delivery teams will also measure their progress, using that information to increase their improvement efforts.

I've used these five criteria at numerous organizations around the world to help flush out the ad-hoc delivery teams which claim to be agile but really aren't in practice. My experience is that I can often identify the "posers," the teams claiming to be agile but which really aren't, with the first two criteria and almost always get them with the first four. The latter observation has led me to believe that the fifth criterion, the one focusing on continuous improvement, although an important strategy, isn't a determining factor for the agileness of a team. Jumping ahead to the answer, the survey has in fact borne this observation out: 54% of respondents who claimed to be on an agile team, or had previously been on one, actually worked on teams which met the first four criteria, and 53% met all five criteria. Let's see what happened.

I explored each criterion with a "multiple selection" question. My strategy was to explore what people were actually doing on their teams that they believed to be agile, giving respondents options including agile strategies, non-agile strategies, and strategies that were neither (or both depending on your point of view). In four of the five criteria there were several options that were deemed acceptable from an agile point of view. In other words, as long as you chose at least one of the options I was willing to give you credit for addressing the goals of that criterion. For one criterion, self organization, you needed to adopt two techniques to be given credit.
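The scoring scheme just described can be sketched in a few lines of Python. To be clear, the option names, data structures, and minimum counts below are my own hypothetical reconstruction of the approach, not the survey's actual wording:

```python
# Hypothetical sketch of the article's scoring scheme: each criterion maps to
# a set of "creditable" options plus the minimum number a respondent must
# select (two for self organization, one for everything else).
CRITERIA = {
    "validation": ({"regression testing", "tdd design", "tdd requirements"}, 1),
    "self organization": ({"iteration planning", "daily stand-up"}, 2),
    "value": ({"working software each iteration",
               "non-delivery startup iterations"}, 1),
    "stakeholder involvement": ({"product owner", "as-needed experts",
                                 "daily access"}, 1),
    "continuous improvement": ({"retrospective each iteration",
                                "occasional retrospectives"}, 1),
}

def meets_criterion(selected, criterion):
    """True if the respondent selected enough creditable options."""
    options, minimum = CRITERIA[criterion]
    return len(selected & options) >= minimum

def agileness(selected):
    """Return the set of criteria a respondent's team fulfills."""
    return {c for c in CRITERIA if meets_criterion(selected, c)}

# A respondent meeting the first four criteria but not the fifth:
team = {"regression testing", "iteration planning", "daily stand-up",
        "working software each iteration", "product owner"}
print(agileness(team))
```

Note the asymmetry: selecting only one of the two self-organization practices earns no credit for that criterion, which is one reason it turns out to be the hardest to fulfill.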

When it came to delivering value regularly, 94% of agile respondents were given credit for doing so. I was looking for at least one of two options. In this case, 87% of respondents indicated that they were producing working software every iteration/sprint during construction, and 40% indicated that there were one or more iterations/sprints at the start of the project where they did not produce working software due to project initiation considerations. The second option is potentially a bit of a cheat because teams could do this and still be very non-agile during the construction phase of the project. However, I wanted to be as liberal as possible in my analysis of the data, and I didn't want people thinking they were non-agile if their team chose to do some strategizing at the beginning of the project before jumping into cutting code. Other value-oriented options, which didn't earn respondents any credit towards this criterion, included teams which claimed to be implementing business process improvements (58%), teams which identified stakeholder goals at the beginning of the project (71%), and teams which actively considered usability issues (69%).

Developers validating their own work was also common: 87% of respondents were on teams that did so. Indicating at least one of three options earned credit for this criterion: 71% indicated their team performed their own regression testing on a regular basis, 52% took a test-driven development (TDD) approach at the design level (e.g. via xUnit), and 44% took a TDD approach at the requirements level (e.g. via acceptance tests or story tests). Other validation strategies, which didn't earn credit towards this criterion, included demoing their solution at the end of each iteration (79%), occasionally running "all hands" demos during the project (42%), including static code analysis in their build (32%), and including dynamic code analysis in their build (21%).
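For readers unfamiliar with design-level TDD, a minimal illustration with an xUnit framework (Python's unittest here) may help: the developer writes a failing test first, then just enough production code to make it pass. The slug() function is a made-up example, not anything from the survey:

```python
# Design-level TDD sketch using Python's built-in xUnit framework, unittest.
# The tests were (notionally) written before the code they exercise.
import unittest

def slug(title):
    """Turn a title into a URL slug; 'just enough' code to pass the tests."""
    return "-".join(title.lower().split())

class SlugTest(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slug("How Agile Are We Really"),
                         "how-agile-are-we-really")

    def test_collapses_extra_whitespace(self):
        # Added as a regression test when this bug was first noticed; it now
        # runs on every build so the fix can't silently break later.
        self.assertEqual(slug("Agile  Teams"), "agile-teams")
```

Running such a suite automatically on every build (e.g. via python -m unittest) is what the "continuous regression testing" option in this criterion refers to.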

When it comes to stakeholder participation, 95% of respondents fulfilled this criterion. Once again, selecting at least one of three options earned you credit: 74% of respondents worked on a team which had a product owner who represented the stakeholder community, 63% worked with specific stakeholders (particularly domain experts) on an as-needed basis, and 62% had access to stakeholders (or their representatives) on a daily basis. The demoing strategies which were explored via the validation criterion are also aspects of stakeholder participation, but they don't count towards credit for this criterion as I don't consider attendance at demos to be sufficient evidence of active stakeholder involvement.

The most problematic criterion was self organization, perhaps because it was the hardest to earn, with only 56% of respondents on teams fulfilling this goal. In this case the respondent needed to indicate that they had adopted two strategies: each iteration/sprint they would hold a planning meeting where the team determines who will do what that iteration (71%), and they would also hold daily stand-up meetings to coordinate their activities (77%) throughout the iteration. A plausible explanation for the low adoption rate of self organization is that it requires a significant cultural shift in many organizations -- developers organizing their own work instead of following the directions of a project manager? Heresy! This is a shift which many managers find threatening, particularly those who focus on the technical aspects of management instead of the "fuzzier", and more valuable, leadership aspects. This issue is clearly something I need to look into in a future survey. Interestingly, when this criterion is dropped from the equation (something that I do not recommend doing), 72% of respondents claiming to be on agile teams potentially are (up from 53%). Furthermore, the 2008 Agile Practices and Principles survey found that iteration/sprint planning and daily stand-up meetings were rated as the two most effective agile management practices, indicating to me that many respondents on this survey were missing out on potential productivity improvements.

The fifth criterion focuses on reflecting on, and then improving, the process which the team follows. In this case 88% of respondents were given credit, either by indicating that they held a retrospective/reflection session at the end of each iteration/sprint to identify potential improvements for their team (68%) or that they held a retrospective session several times throughout the project, but not every iteration, to identify potential improvements for their team (22%). The first strategy reflects approaches common in agile methods such as Scrum or Disciplined Agile Delivery (DAD), but the second reflects lean methods such as Kanban. Related techniques, which didn't earn credit towards this criterion, included actively trying to address the issues identified in the retrospective sessions throughout the project (70%), measuring and tracking progress of adopting improvements to their process (42%), and having external auditors review what they were doing during the project to help identify potential improvements (14%).

There are several thoughts I want to leave you with:

  • Whenever someone claims to be agile, you clearly need to take it with a grain of salt. There isn't an official way to measure the agileness of a team, but even by my fairly liberal criteria barely half of the people claiming to be on agile teams potentially are. At the very least, that's an indication that you need to be discerning.
  • It may be that the existing management and governance culture within your organization is preventing project teams from fully benefitting from agile. I'm reluctant to jump on the "blame managers for everything that's wrong with the universe" bandwagon, but in this case there may be a need for some serious soul searching amongst the leadership within your organization.
  • The 2010 IT Project Success Rate survey showed once again that agile teams were doing measurably better than traditional teams. This survey revealed that those so-called agile teams may not be completely agile yet. Together, these results are more evidence that teams are benefitting from adopting some agile strategies, once again motivating us to question the "Scrum but" and "pure agile" rhetoric that is common within the agile community. Granted, teams could very likely benefit from adopting more agile strategies than they currently have, and for many agile practices there is research showing benefits from adopting that individual practice anyway. But it's nice to see more evidence supporting reasonable, middle-of-the-road adoption strategies.
  • At some point the agile community really does need to truly define what it means to be agile and, better yet, identify criteria against which to rate teams. Don't get me wrong, the Agile Manifesto is a great philosophical statement but it's not a very good definition -- we can and should do better.

Many experienced agile people will claim that you know agile teams when you see them, which works really well if you have years of real experience with agile teams. This article describes a straightforward technique for determining whether a team is agile and some industry numbers to compare your organization against. Nobody is going to give you an award for being agile, but they may give you a bonus for improving your overall productivity, and adopting agile strategies is one way to do so.
