Looking 10 years ahead in the crystal ball, agility has become the norm, but some surprises remain in store for us: New niche modeling tools, changing terminology and a Cobol revival.
By Scott W. Ambler
I’m reporting to you from the year 2013—June 14, to be precise—this column submitted using the new t-mail (time-mail) feature in the beta release of Microsoft Linux 23. Apparently, it works by transmitting messages via satellites that are constantly being slingshotted around the sun, thus breaking the time barrier in accordance with the Roddenberry principle. I’ve decided to risk muddling the space-time continuum to describe what’s been happening in software development over the last 10 years.
2004 was the year of the agile buzzword meltdown—the term Agile Big Design Up-Front (ABDUF) was the straw that broke the camel’s back. So agile was replaced by lithe, which was replaced by nimble, and for some reason, swanky has stuck since 2009. Clearly, we’ve given up on the name game.
The term test-driven development was supplanted in 2005 by security-driven development. This was the result of the nine-day Internet shutdown caused by a clone of 2004’s Raelian virus that took advantage of flaws in Microsoft’s .SecureNet (formerly .NET, but now known as .Schwing) Web services. Web services were renamed NetTransactions in 2007, when a workable standard for the full two-phase commit (2PC) protocol was finally agreed upon. Extreme Programming (XP) evolved into Really Extreme Programming when the words continuous or continuously were concatenated to the front of every XP practice. Scrum was renamed when its proponents finally admitted that Americans just weren’t going to understand a rugby analogy, although its new name, Cricket, isn’t much better.
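For readers who never sat through a distributed-transactions lecture, the 2PC protocol behind that NetTransactions standard works roughly like this: a coordinator asks every participant to vote in a prepare phase, and commits only on a unanimous yes. The sketch below is my own minimal illustration in Python; the Participant class and its method names are hypothetical, not part of any real (or fictional) standard.

```python
# Minimal two-phase commit (2PC) sketch. Participant and its methods
# are illustrative stand-ins for real resource managers.

class Participant:
    def __init__(self, name, will_commit=True):
        self.name = name
        self.will_commit = will_commit
        self.state = "init"

    def prepare(self):
        # Phase 1: vote yes or no, holding locks until the decision arrives.
        self.state = "prepared" if self.will_commit else "aborted"
        return self.will_commit

    def commit(self):
        self.state = "committed"

    def abort(self):
        self.state = "aborted"


def two_phase_commit(participants):
    # Phase 1: collect votes from every participant.
    if all(p.prepare() for p in participants):
        # Phase 2: unanimous yes, so everyone commits.
        for p in participants:
            p.commit()
        return "committed"
    # Any no vote forces everyone to abort.
    for p in participants:
        p.abort()
    return "aborted"
```

A single no vote in phase one aborts the whole transaction, which hints at why agreeing on a workable standard supposedly took the industry until 2007.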
To make things easy for you, I’ll use 2003 terminology to describe the current state of agile software development in 2013. The good news is that agile methodologies clearly dominate the landscape, although close to 30 percent of all projects still follow prescriptive methods such as the Rational Unified Process (RUP) and IEEE 12207–based processes. People still follow prescriptive processes for two reasons: First, the late majority and laggard organizations are just now trying agile processes, and it will likely take them another five to 10 years to fully adopt these techniques. The adoption curve was similar with object technology: Early adopters picked it up in the late 1980s, whereas the laggards did so in the early 2000s, so this delay hasn’t come as a surprise. Second, over the years, we’ve realized that one process isn’t right for every type of project, and that even non-agile processes have their place. For example, good candidates include projects that aim to redevelop an existing system or to implement a system based on government legislation. My guess is that we’ll eventually find that prescriptive processes make sense for between 10 and 15 percent of all projects.
The Proof Is Out There
The adoption of agile processes steadily gained velocity throughout the decade, with an incredible boost in 2005 and 2006, when the five-year “dot-bomb” recession ended. At the same time, several research studies supported the initial anecdotal evidence of the efficacy of agile processes. The first critical study, published in 2006 by CS researchers at the University of Toronto, confirmed that emergent or evolutionary modeling techniques were quantitatively superior to traditional “big design up-front” approaches. Although it still took several years to squeeze the air out of the traditionalists’ bubble, this seminal paper proved to be the turning point that spearheaded acceptance of agile software development techniques.
Subsequent research revealed that what were formerly considered “best practices” were little better than compensatory measures. For example, the quality assurance community had been telling us for years that reviews and inspections were best practices, and, to be fair, this held true for non-agile environments. However, agile developers quickly realized that these techniques were really “process smells” that indicated a fundamental mistake elsewhere. The XP community showed us that collective ownership and pair programming, combined with following coding standards and mercilessly refactoring, rendered code inspections superfluous activities that did little more than justify the existence of the QA staff. Similarly, the agile modeling community showed that collective ownership of models, modeling with others and active stakeholder participation ended the need for model reviews.
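For anyone who has forgotten what test-first development (by whatever name your decade prefers) actually looks like, here is a minimal illustration of my own in Python: the test is written before the production code exists, fails, and only then drives the implementation. The leap-year example is purely illustrative, not drawn from any of the studies above.

```python
# Test-first development in miniature: this test is written before
# is_leap exists, so it fails until the production code below appears.
def test_is_leap():
    assert is_leap(2000)        # divisible by 400: leap
    assert not is_leap(1900)    # divisible by 100 but not 400: not leap
    assert is_leap(2012)
    assert not is_leap(2013)

# Then just enough production code to make the test pass:
def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

test_is_leap()  # passes silently once the implementation is in place
```

The point is the ordering: the failing test defines the interface and the expected behavior before a single line of production code is written.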
In 2004, Tablet PCs began to make a huge impact on modeling. People started using them to sketch requirements-oriented artifacts freehand, sometimes producing business process models and even screen sketches. Project stakeholders discovered that tablet-based sketching software was versatile and accessible, enabling them to explain their ideas easily to developers, sketching as they described the requirements and producing an electronic diagram that could be preserved and updated later as needed. More importantly, collaboration software, in combination with telephone and video technology, enabled stakeholders to model with others working from different physical locations. By 2006, Tablet PCs were seen as the simplest tools available for gathering and documenting user requirements.
This boom in simple sketches revealed that the ballyhooed documentation-generation features of analysis-oriented tools did little more than cater to bureaucrats. Leading organizations discovered that if someone complained that sketches were inappropriate development artifacts, it was a sure sign that he added little value to the software effort and therefore could be safely let go. Both developer productivity and project success rates rose measurably during this period.
Surprisingly, the move to sketching software resulted in a renaissance in the modeling tool market. When it became clear that people were abandoning traditional analysis-oriented modeling tools in favor of tablet-based sketching software, the tool vendors quickly refocused on visual integrated development environments (VIDEs) oriented toward developers.
Furthermore, instead of trying to be everything to everybody, modeling tools quickly focused on individual niches: Some dealt with J2EE-based business applications, some with C#-based business applications, others with embedded software and so on. Although most tools had supported the Object Management Group (OMG)’s XMI “standard” for sharing models, this never worked in practice—each vendor implemented the standard differently, dropping critical information whenever a model was imported into another tool, which made real model sharing unworkable. This occurred in part because tool vendors had to fill in the gaping holes in the Unified Modeling Language. Unbelievably, the OMG didn’t add user interface and data-oriented models until 2009, forcing vendors to create their own incompatible standards. More importantly, tool vendors really didn’t want to make it easy for their competitors to import their models, thus enabling customers to switch to a competitor. In short, the marketplace proved XMI to be little more than an academic pipe dream. The OMG’s Model-Driven Architecture (MDA) vision died on the vine because modeling tools didn’t interoperate, and a single vendor was never able to build an über-tool that covered a sufficiently wide range of environments.
Cobol Rises Again
Believe it or not, there’s currently an agile renaissance in the Cobol world. Yes, Cobol! It began a few years back, as the original Cobol generation started to retire en masse and was replaced by Java and C# programmers now in their late 40s and early 50s. There’s still a significant Cobol code base that needs to be maintained and enhanced, and, strangely enough, many object developers are excited by the opportunity to work in a “new” environment—Cobol has become the sexy new technology.
These newcomers brought their own techniques and technologies, particularly agile ones, to the Cobol community. Until then, techniques such as pair programming, test-first development and evolutionary design had never really caught on with “traditional developers.” The new Cobol generation ported the wide range of open-source software tools that they’d been using for years to the Cobol environment—including unit testing, configuration management, build management, code refactoring browsers and database refactoring tools. Agile Cobol—go figure.
The development community of 2013 is very different from the one of 2003. Developers now need both technical and people-oriented skills—without this holistic blend, it’s tough to find work in the smaller job market that has resulted from large-scale adoption of agile techniques. Successful developers are generalizing specialists, possessing specialized skills in one or two aspects of development and general skills across the entire lifecycle. Developers are now skilled craftsmen, or are becoming so: people who actively seek to learn new skills and who act in a professional manner at all times. It’s an exciting era.
Scott Ambler is a senior consultant with Ronin International Inc. His latest
book is The Elements of UML Style (Cambridge University Press, 2002).