
Chronic Requirements Problems


There are three widespread problems with software requirements that need better solutions than are presently customary for software projects:

  • Many requirements are dangerous or toxic and should be eliminated.
  • Some clients insist on stuffing extra, superfluous features into software.
  • Requirements are never complete and grow at rates greater than 1% per calendar month.

Software engineers have an ethical and professional obligation to caution clients about these problems and to assist clients in solving them, if possible. In other words, software engineers need to play a role similar to that of physicians. We have a responsibility to our clients to diagnose known requirements problems and to prescribe effective therapies.

Once user requirements have been collected and analyzed, conformance to them should of course follow. However, before conformance can be safe and effective, dangerous or toxic requirements have to be weeded out, excess and superfluous requirements should be pointed out to the users, and potential gaps that will cause requirements creep should be identified and quantified. The users themselves will need professional assistance from the software engineering team, who should not be passive bystanders during requirements gathering and analysis.

Unfortunately, requirements defects cannot be removed by ordinary testing. If requirements bugs are not prevented from occurring, or not removed via formal inspections or other methods, test cases that are constructed from the requirements will only confirm the errors rather than find them. (This is why years of software testing never found and removed the Y2K problem.)

Another issue is that for some brand-new kinds of innovative applications, there may not be any users other than the original inventor. Consider the history of successful software innovations such as the APL programming language, the first spreadsheet, and the early Web search engine that later became Google.

These innovative applications were all created by inventors to solve problems that they themselves wanted to solve. They were not created based on the normal concept of "user requirements." Until prototypes were developed, other people seldom even realized how valuable the inventions would be. Therefore "user requirements" are not fully relevant to brand-new inventions until after the inventions have been revealed to the public.

Given that software requirements grow and change at measured rates of roughly 1% to 4% per calendar month during the subsequent design and coding phases, it is apparent that achieving a full understanding of requirements is a difficult task.
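To make those rates concrete, here is a minimal Python sketch of how monthly requirements creep compounds over a development schedule. The 1% to 4% monthly rates come from the measurements cited above; the 10,000-function-point starting size and the 18-month design-and-coding window are illustrative assumptions, not figures from this article.

def size_after_creep(initial_fp: float, monthly_rate: float, months: int) -> float:
    """Project application size after compounding monthly requirements creep."""
    return initial_fp * (1.0 + monthly_rate) ** months

if __name__ == "__main__":
    initial_fp = 10_000                  # size at end of requirements (assumed)
    for rate in (0.01, 0.02, 0.04):      # 1%, 2%, and 4% per calendar month
        grown = size_after_creep(initial_fp, rate, months=18)
        print(f"{rate:.0%}/month for 18 months: {grown:,.0f} function points "
              f"({grown - initial_fp:,.0f} added by creep)")

Even the low end of that range adds roughly 20% to the application before coding is finished, which is consistent with the requirements-creep row in the table below.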

Software requirements are important, but the combination of toxic requirements, missing requirements, and excess requirements makes simplistic definitions such as "quality means conformance to requirements" hazardous to the software industry.

After Delivery

The issue of "growing requirements" is frequently underappreciated. Once software applications have been delivered to clients or customers, requirements do not stop growing and changing. For most applications, growth continues for as long as the applications are in use, at rates of up to about 15% per calendar year. Because requirements continue to grow, application size increases too, whether measured with function points, logical code statements, or any other metric.

To illustrate this continuous growth, the following table shows typical growth of a large Java application, based on my research.

    Measurement Interval                        Function Points   Logical Code Statements in Java
 1  Size at end of requirements                          10,000                 530,000
 2  Size of requirements creep                            2,000                 106,000
 3  Size of planned delivery                             12,000                 636,000
 4  Size of deferred features                            -4,800                -254,400
 5  Size of first delivery to clients                     7,200                 381,600
 6  Size after year 1 usage                              12,000                 636,000
 7  Size after year 2 usage                              13,000                 689,000
 8  Size after year 3 usage                              14,000                 742,000
 9  Size after year 4 usage (mid-life kicker)            17,000                 901,000
10  Size after year 5 usage                              18,000                 954,000
11  Size after year 6 usage                              19,000               1,007,000
12  Size after year 7 usage                              20,000               1,060,000
13  Size after year 8 usage (mid-life kicker)            23,000               1,219,000
14  Size after year 9 usage                              24,000               1,272,000
15  Size after year 10 usage                             25,000               1,325,000

These numbers indicate larger-than-average growth at years 4 and 8 of usage. For commercial software, it is necessary to add significant new features periodically in order to stay current with competitive applications. These additions are called "mid-life kickers."

As can be seen, requirements growth never stops for as long as a software application is being used, unless the developer withdraws support in favor of a new product of the same type. Of course, some applications continue well past 10 years. For example, the U.S. air traffic control system has been in use for more than 30 years.
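For readers who want to experiment with this pattern, the following minimal Python sketch reproduces the post-delivery rows of the table. Its parameters - the 4,800 function points of deferred features restored in year 1, roughly 1,000 function points of ordinary annual growth, 3,000-function-point mid-life kickers in years 4 and 8, and 53 logical Java statements per function point - are simply read off the table above; treat them as illustrative assumptions rather than a general model.

JAVA_STATEMENTS_PER_FP = 53  # ratio implied by the table: 7,200 FP -> 381,600 statements

def project_growth(first_delivery_fp, deferred_fp, years, kicker_years=(4, 8)):
    """Yield (year, function points, logical Java statements) for each year of usage."""
    size = first_delivery_fp
    for year in range(1, years + 1):
        if year == 1:
            size += deferred_fp   # deferred features are delivered during the first year
        elif year in kicker_years:
            size += 3_000         # mid-life kicker: major competitive features
        else:
            size += 1_000         # ordinary annual growth
        yield year, size, size * JAVA_STATEMENTS_PER_FP

if __name__ == "__main__":
    for year, fp, sloc in project_growth(first_delivery_fp=7_200, deferred_fp=4_800, years=10):
        print(f"Year {year:2d}: {fp:6,} function points  {sloc:10,} Java statements")

Running the sketch reproduces the year-by-year sizes in the table and makes it easy to see how much of the ten-year total comes from the two mid-life kickers versus ordinary annual growth.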

So Then…

Software requirements have been a very weak link in the chain of software engineering technologies. Because requirements are always incomplete and always contain errors, it is the responsibility of the software engineering team to ensure that state-of-the-art requirements methods are used. Users are not trained in requirements methods and cannot provide requirements that are complete and error-free without assistance from trained requirements experts, plus state-of-the-art requirements tools. Most importantly, software engineers should expect — even embrace — that this dialog with users about requirements will go on long after applications are delivered and running.


Capers Jones is a Vice President and the Chief Technology Officer of Namcook Analytics LLC. He collects data on software quality and productivity topics. He has written more than a dozen books on software quality, best practices, estimation, and measurement.

Andrew Binstock is on vacation.



Comments:

2012-12-17T15:23:25

Knowing the scope of the project and what it fully entails would eliminate the need for any unwanted or side ideas that occur during development. There are going to be things that definitely come up and need to be addressed, but excess items should be skipped in favor of the original task at hand and its successful implementation. I believe that in today's age of technology and development, user requirements are one of the main points when developing a plan of action. I also think that anything built should be designed with the ability to grow, whether the company grows or not; that should just be good practice.

Paul Sprague
InformationWeek Contributor


2012-12-01T01:06:26

There can be no hard and fast rules, but non-functional requirements frequently fit that term - expectations as to performance that cannot really be met, for example. Often this is caused by network or Internet latency, which is outside the control of the software team. Requirements for one system may turn out to be dependencies on another system, and may not be possible.

Sometimes customers want real time updates which can be very difficult in some circumstances, and turn out not to be necessary - sometimes a batch update hourly or daily will do.

Sometimes a customer wants everything automated, but the cost of the last 10% is 100% of the original budget, and a small amount of manual processing is actually workable.

A key factor in getting good requirements is to get the customer focused on their objectives and the problem they are trying to solve. Then start guiding them through the requirements needed to meet the goals.


2012-12-01T00:57:44

Yes, and having rule #1 be "the customer always lies" is great communication. What a bozo that guy is.


2012-11-30T20:52:23

What is a "dangerous or toxic" requirement? How would I distinguish a "dangerous or toxic" requirement from one which is not "dangerous or toxic"?


2012-11-29T05:06:12

Certainly if you just ask business people what they want and call that "requirements", then those "requirements problems" will play havoc with development. The thing is, those are not actually requirements; they cannot be determined to be correct, complete, clear and unambiguous. To get that, you don't ask people what they want, you ask what they do and what information they need to do it. That is the basis of "state of the art" requirements methods. These are methods performed by business analysts, usually before the need for software is determined.

Have a look at http://www.iag.biz/business-an... and other sources to see what I am talking about.


dblake950
2012-11-28T16:57:38

Obviously, as with so many other things, good communication is the key.


2012-11-28T13:43:11

As you well stated, a good software engineer must be a good diagnostician. Many times, end-users present their issue, but it is actually only a symptom. But how do software engineers know whether it is just a symptom and not THE real problem? By asking questions!

Medical doctors have to be good diagnosticians also. They observe, ask questions, run tests, look for cause/effect. Software specification development is not much different.

We have a list of rules around here, and rule #1 is "the customer always lies". That doesn't mean they KNOWINGLY provide false information, but you have to realize that THEIR PERSPECTIVE is not necessarily the best. They also do not want to appear stupid.

That does NOT mean you ask 1 question, assume the customer is lying, and do the exact opposite. NO! It means you need to ask good questions that get the customer to describe the layers and layers of "symptoms" (in their mind, problems), until you see/understand their root problem. Is this trivial? Sometimes (rarely) yes, but most of the time, no.

I've learned that believing "the customer is always right" can be taken to extremes and you end up with a feature/product the customer THOUGHT they wanted, but once they actually have it, they HATE it.

The main point here is to view the customer's "problem" as an opportunity to better understand THEIR problem domain, NOT your solution domain.

The worst kind of customer is the one who tells you how to fix their problem (they won't even tell you what they think the issue is).

A good diagnostician should ask good questions to find out the "root cause" of the issue, and attempt to come up with an excellent solution, or even better, a set of possible solutions. Remember, though, to try to keep the discussion in the customer's domain/language (isn't that what good software does anyway? Solve customer domain problems?)

