Over the past few years I've come to the realization that almost any technology can solve most problems given enough time and money. This is not to say that a technology choice is unimportant; it's just lower in the order of issues facing today's developers. For example, I would not want my high-speed computing team, who write in assembler, building a website in PHP: not because they could not do it, but because it is not what they are good at.
However, I'm beginning to think that the real barrier to getting things done in our profession is the tendency to add three requirements to projects: making them extensible, open, and standard. Let's look at these in greater detail.
Project definitions usually require that the application be extensible to other applications and be able to interact with other applications' data. The problem with putting this word in a project is that it is very hard to define. It could mean that I have to allow new modules to be added without disrupting the existing application, or it could mean that end users can extend the delivered content or application in some way. When most teams at ISVs see the word "extensible" in a project plan, they know that their timeline and cost have just grown substantially. Very few applications are truly extensible. Sure, a few come to mind: the Eclipse project, Delphi/C++Builder, and Visual Studio. But as you can see, those are all development tools. Others expose open APIs that let third parties expand the usefulness of the software.
However, in a general business software application, where does extensibility come in? Who tests the extensibility? When is extensibility added? I would submit that adding "extensibility" to an application should require a new project just for that feature. Sure, you could put in code that might help with extensibility in the future, but good programming practice says we don't put anything into the code that is not going to be used, because it adds overhead, maintenance, and complexity to an application that may never use that feature. [YAGNI —Ed.] So where extensibility is specified, add only the needed functionality, and only when it is really needed. This minimal, just-in-time approach significantly reduces the work and rework a team must do for what is, at best, a nebulous requirement.
An example illustrating the real cost of writing extensible code can be found in "How (not) to write Factorial in Java." A project requirement that simply states that a feature needs to be extensible, or worse yet, that the application needs to be extensible, fails to specify or anticipate what developers might do to make it happen.
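When a requirement does name a concrete extension, the YAGNI-friendly move is to add one narrow seam and nothing more. The sketch below uses hypothetical names (`DiscountRule`, `Checkout`) to show a single interface introduced only because a second concrete implementation already exists; there is no speculative plugin registry, configuration file, or reflection built ahead of need:

```java
import java.util.List;

// Hypothetical sketch: one narrow extension point, added only because
// a concrete second implementation (a seasonal discount) already exists.
interface DiscountRule {
    long applyCents(long priceCents);
}

class NoDiscount implements DiscountRule {
    public long applyCents(long priceCents) { return priceCents; }
}

class SeasonalDiscount implements DiscountRule {
    public long applyCents(long priceCents) { return priceCents * 90 / 100; } // 10% off
}

public class Checkout {
    // Callers depend only on the interface, so a future rule can be added
    // without disturbing this code -- and nothing speculative is built.
    static long totalCents(List<Long> pricesCents, DiscountRule rule) {
        long sum = 0;
        for (long p : pricesCents) sum += rule.applyCents(p);
        return sum;
    }

    public static void main(String[] args) {
        List<Long> cart = List.of(1000L, 2000L); // $10.00 and $20.00
        System.out.println(totalCents(cart, new NoDiscount()));       // prints 3000
        System.out.println(totalCents(cart, new SeasonalDiscount())); // prints 2700
    }
}
```

Adding a third rule later means writing one new class; nothing in `Checkout` changes. That is usually all the "extensibility" a business application needs until a real requirement says otherwise.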
What is an open architecture? Does it help you get the application done quicker, better, and more correctly? If the answer is "no," then who cares if the application is open? Of the applications out in the world today, most of those built on an "open" architecture never exercise that openness enough to deliver on it. As an industry, we adopt things that claim to be open in the hope of easier or better integration in the future. That point rarely arrives; and when it does, the openness is often of little use.
Like "extensibility," the concept of "open" does not need to be addressed until there is a clear and defined need: for example, when the application must accept a web service that transmits secure customer information for logging by another system. If, as developers, we worry about a requirement like this when we start development, even though that data is not yet ready to be emitted by our system, we end up wasting a lot of time building unneeded infrastructure that will most likely have to be modified when the requirement becomes real and the project is updated. All too often, the word "open" in a project makes the development and business teams do a lot of busy work defining an architecture that may some day need to be open, but at this time is not needed.
There is ample evidence that requirements are the key to both project success and failure. Infotech Research stated in 2005: "Flawed requirements trigger 70% of project failures." Just this one non-functional requirement, openness, can significantly reduce a project's chances of success.
The requirement for standards compliance is the cause of an incalculable amount of rework. People stand around the water cooler, executives set edicts, and teams try to figure out which features from a smorgasbord of accepted, draft, or proposed standards must be used in the next big project. Why? We all know by now that standards come and go in our industry.
Remember when it was important to support EJB 1.0, 1.1, 1.2; or back in 1998 when it was SOAP 1.0; or was that XML-RPC? The question that must be asked over and over again is, "what did adhering to those specifications do for your business?" I would surmise not much, because if you were using SOAP or XML-RPC as a protocol, you were still doing maintenance projects to get things working with each integration partner. In some cases, your development team worked very hard to be compliant with that 1.1 standard. And if you were 1.1 compliant, and the other integration partner was 1.1 compliant, it may have saved a few minutes of work; but in actuality, the difference between the "bad" 1.0 and "good" 1.1 back then was very minimal. It was mostly marketing hype and FUD (Fear, Uncertainty, Doubt) put out by the submitting vendor.
I knew of many companies that ripped out software that worked just to be on the latest standard. In actuality, these companies did everything needed to integrate a partner or system, but they still needed to go through, at a very minimum, a full regression testing cycle to ensure that things were working correctly. So even though both sides of the equation were compliant with the relevant standard, they still had to do a full project to verify the integration.
So developers spent countless hours making sure to support this "standard" and that "standard," and at the end of the day, the work to integrate partners was still a job that needed a full project and testing cycle to ensure the business activity was working correctly. Chasing after these constantly evolving specifications and standards is causing a lot of churn — much of it wasted if the application does not have the specific need.
This churn costs all of us real money. Estimates based on the 1995 Chaos Report indicated that companies would spend about US $81 billion on failed or canceled projects in 1995. Ten years later, the 2005 Chaos Report showed that we as an industry had become worse, not better; so that number, adjusted for inflation and a higher failure rate, means we are losing more investment dollars on almost every project today.
What Can Be Done?
Beyond keeping an eye out for these three attributes in project definitions and discussions, the key is to bring back focus on the important things in a project. One of the questions I ask when training developers is, "What is the most important thing the system you create has to do?" The answers I get back from students are rather telling (shown here with my responses):
- Your software needs to be future proof — really?
- Your software must be easy to read — ok, but is that the most important?
- Your software must be maintainable — ok, but again, is that the most important?
- The software must have good documentation — nice to have…
- The software must run fast — nice to have…
- It must be developed using Agile methods — really?
- It needs to run on the latest "x" framework — really?
- It must support both white and black box testing — really, really nice to have!
The list goes on, and on, and on…and few people can simply state that the software being developed must work. That is all; technically nothing more and nothing less. If I develop my latest project to be the most extensible, the most open, and the most standard software ever created, and it cannot calculate 2+2 correctly, none of the other answers matter.
Once we have software that works, I'm in favor of the other things. Of course I want to refactor my software for maintainability, of course I want it optimized, of course I would love for it to be easily maintained, and I always love working on an agile team; but if the team cannot make the product work, nothing else matters. Far too many people, however, focus on those other tasks before they have a workable solution.
So in my opinion, words do matter and they matter at all levels. Watch for executive management talking about the dreaded three requirements. Help educate them on how the software you are developing is going to work and save time, money, or personnel for the company. Help the team understand what the main goal of the application is; it needs to work first, then you can do the refactoring and add all the bells and whistles.
With all the work we have done as an industry, the latest statistics show that most of the world still runs on COBOL. Studies show that 90% of financial institutions use COBOL for the majority of their processing, despite it being a 50-year-old language. Some 200 billion lines of COBOL are still running the show, day in and day out. So how important is it that the next big application run the latest language, use the latest standards, and connect to all kinds of endpoints?
If nothing else, this should show that always moving to the next language, or setting this or that language as a standard, does not hold much water. Think about how many of you reading this work for a company with project standards that must be adhered to (Java, .NET, etc.), while the underlying part of your business still runs on COBOL, and it is getting the job done.
The sad fact is that most companies could likely do "whatever the requirement was" in their initial technology. That may be COBOL for company X, C for company Y, and Delphi for company Z. Keep in mind that languages do become extinct over time. Languages like PL/1 and RPG may still be used in some corner of a basement, but for the most part, they are dead. Then again, many languages over the past couple of decades have been pronounced dead, yet here they are, alive and kicking. Look at Objective-C, created back in the early 1980s. Did you think back in 1998, when you were worried about the latest SOAP standard, that in 2011 you would be writing your next piece of software in Objective-C? I doubt it!
— Mike Rozlog worked for many years at Borland and CodeGear. Today, he is the Director of Delphi Solutions at Embarcadero Technologies.