Jeffrey has been programming for 40 years. He is currently doing freelance writing and Ruby on Rails development after decades of C/C++ programming, with side trips through university teaching. He can be reached at firstname.lastname@example.org.
Looking back over the last several decades in software development, I see several trends converging in Hypothesis-Driven Development (HDD). A hypothesis is just a theory before it's tested, and a hypothesis, like a theory, should have both explanatory and predictive value. HDD both unifies existing practice and suggests additional areas where the scientific method can be used in product and software development. To wit: Testing is no longer limited to the end of the project; customer discovery and lean startup test assumptions about customers, their needs, and their pain points before coding starts; Agile development [1, 2] is breaking down the walls between stakeholders (marketing, system architects, programmers, testers, and end-users); Just-In-Case, the model of primary and secondary education, is being replaced by Just-In-Time (JIT) ideas taken from lean manufacturing (e.g., lean software development [3] and Kanban [4, 5]); collections of introverted, asocial programmers are diversifying into teams with more varied skills, especially people skills; and the door between the Cathedral and the Bazaar [6, 7] is opening wider.
Inexpensive computers, open source libraries, and cloud computing have opened up software product development to small startups, and lean startups can outrun and out-maneuver massive bureaucracies full of employees busily avoiding the blame for any possible failure. (Note: Teams developing a new product within a large corporation are well served by behaving more like a startup.)
HDD in Action
Let's look at some areas where HDD can come into play immediately. Quality Assurance is testing system behavior against the specification, a hypothesis about desired system behavior. End-users will test the system specification against reality. So why not get something in front of them early to test your hypotheses about what they need, want, and will pay for?
Unit Testing [8] moved testing into coding. Test-Driven Development (TDD) [9, 10] moves it before coding. Behavior-Driven Development/Design (BDD) [11] moves it into design. And some BDD frameworks, such as RSpec [12], are readable by non-technical people, broadening the selection of who can do testing and allowing domain experts (such as an experienced end-user) to vet the tests themselves, a natural fit for an HDD approach.
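To show why readable specs matter, here is a toy, stdlib-only sketch of the describe/it style that RSpec popularized. A real project would use RSpec itself; the pricing rule being specified is invented for illustration.

```ruby
# Toy describe/it DSL: specs read as sentences a domain expert can
# compare against the written hypothesis. (Illustration only; use
# RSpec in real projects.)

$results = []

def describe(subject)
  $results << "#{subject}:"
  yield
end

def it(claim)
  yield
  $results << "  #{claim} -- PASS"
rescue => e
  $results << "  #{claim} -- FAIL (#{e.message})"
end

def expect_equal(actual, expected)
  raise "expected #{expected.inspect}, got #{actual.inspect}" unless actual == expected
end

# Hypothesis under test: volume discount logic for a subscription product.
def discounted_price(list_price, seats)
  seats >= 10 ? (list_price * 0.8).round(2) : list_price
end

describe "Volume pricing hypothesis" do
  it "charges list price below 10 seats" do
    expect_equal(discounted_price(100.0, 5), 100.0)
  end
  it "gives 20% off at 10 seats or more" do
    expect_equal(discounted_price(100.0, 10), 80.0)
  end
end

puts $results
```

An end-user who has never seen Ruby can still read "gives 20% off at 10 seats or more" and say whether that is actually the behavior customers were promised.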
Customer Development [13, 14, 15] encompasses interviews with potential customers to clarify the problems, validate a solution, and determine demand for different pricing and business models (freemium, free trials, subscription, seat/site licenses). For HDD, technical people need to be included in these interviews (and people skills are needed).
All of these hypotheses — about markets, customers, patterns of usage, how code should behave — need to be written down, dated, and visible to all stakeholders. This is the backbone of HDD: It keeps everyone honest and reduces "reality distortion" fields. The important and/or critical hypotheses need to be tested. Confirming and disproving hypotheses drives development.
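As a minimal sketch of such a record (the statuses, statements, and dates here are invented; real teams might keep this in a wiki or tracker), the key properties are that every hypothesis is dated, attributed, visible, and updated rather than deleted when tested:

```ruby
require 'date'

# One dated, attributed entry per hypothesis; status moves from
# :open to :confirmed or :disproven, never silently disappears.
Hypothesis = Struct.new(:statement, :author, :recorded_on, :status)

log = []
log << Hypothesis.new("Small agencies will pay $29/month for reporting",
                      "marketing", Date.new(2011, 3, 1), :open)
log << Hypothesis.new("95% of requests complete under 200 ms at 100 rps",
                      "engineering", Date.new(2011, 3, 8), :open)

# Testing a hypothesis updates the record in place.
log[1].status = :confirmed

log.each do |h|
  puts "#{h.recorded_on} [#{h.status}] #{h.statement} (#{h.author})"
end
```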
Getting Started with HDD
Hypothesis generation is the first step, and will recur throughout the project. Brainstorming is an excellent way to start. People will bring their own ideas to each session and ideas will play off each other, combining in new and interesting ways. Record them all — whether utilizing high tech (recording and/or remote whiteboards, audio/visual recording) or low tech (photographing whiteboards, drawing on huge pieces of paper, writing on index cards), all hypotheses need to be cataloged.
In a subsequent session, sort through the ideas/hypotheses. How can they be tested? How expensive is testing and how expensive could not testing them be? When can they be tested? Some hypotheses may be too hard or expensive to test, but it is good practice to state them anyway so all stakeholders can acknowledge and agree on them. And costs change. Markets change. Both tested and untested hypotheses may need to be revisited. Hypotheses that are too expensive for a project to test may make economic sense when the benefit is spread across many projects.
Hypothesis testing in the initial stages of a project will be mostly qualitative: talk to stakeholders, potential customers, and users (future stakeholders). Record these conversations in some way, preferably keeping both the raw notes and a summary. Ambiguities occasionally creep into summaries, so it's nice to have the raw data to go back to. It is best if the designers and developers do some of the interviewing.
The hypotheses that emerge from qualitative testing with a few dozen people can undergo quantitative testing (validation) with a few hundred people through surveys [16, 17]. Usability testing with wireframes and mockups can be used in both stages.
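As a rough sketch of what quantitative validation can look like, the snippet below computes a normal-approximation confidence interval for a survey proportion. The sample numbers are invented, and it assumes a simple random sample of respondents:

```ruby
# 95% confidence interval for a proportion, using the normal
# approximation p +/- z * sqrt(p(1-p)/n).
def proportion_ci(successes, n, z = 1.96)
  p = successes.to_f / n
  half = z * Math.sqrt(p * (1 - p) / n)
  [(p - half).round(3), (p + half).round(3)]
end

# Suppose 130 of 300 respondents said they would pay for the feature.
low, high = proportion_ci(130, 300)
puts "95% CI for willingness to pay: #{low}..#{high}"
```

If the interval comfortably clears the break-even rate from your business model, the pricing hypothesis survives this round; if it straddles it, you need more data or a cheaper product.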
There will always be more hypotheses than can reasonably be tested. Some will cost more to test than any foreseeable benefit could repay. Two complementary principles can help here: minimizing risk and maximizing learning. Committing to a technology that isn't ready for prime time can be an expensive mistake. Tests need not be polished to give good results; an ugly pop-up or a clumsy survey today may be enough to answer the question. Take to heart the Extreme Programming/Agile Development question, "What is the simplest thing that could possibly work?" when validating or disproving a hypothesis. Ken Thompson (of C and UNIX fame) is more blunt: "When in doubt, use brute force."
Staying Hypothesis Driven
Premature optimization is the root of all evil.
— Donald Knuth
HDD affects coding practices as well as process. HDD requires that you write code that is easy to test, profile, benchmark, and change. If intermediate results are interesting and might shed light on some hypothesis, make them easily accessible. Choose a usable format, even if it will not be the format used in the ultimate system.
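As a small illustration (the function names and data are invented), the sketch below returns an intermediate result alongside the final one so either can be inspected, and times the work with Ruby's stdlib Benchmark module:

```ruby
require 'benchmark'

def tokenize(line)
  line.split(/\s*,\s*/)
end

def parse_totals(lines)
  tokens = lines.map { |l| tokenize(l) }       # intermediate result
  total  = tokens.map { |t| t.last.to_i }.sum  # final result
  [total, tokens]                              # expose both for inspection
end

lines = ["widgets, 3", "gadgets, 7"]
total = nil
time = Benchmark.realtime { total, _tokens = parse_totals(lines) }
puts "total=#{total} in #{(time * 1000).round(3)} ms"
```

Returning the token list costs almost nothing now, but it lets you eyeball the parse when a load-related hypothesis (say, about malformed input rates) needs checking later.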
Monitoring (for example, CPU usage and transaction rates) is a beneficial real-time hypothesis-testing practice that will provide important information about actual loads, necessary capacity, and the robustness of your architecture.
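A minimal sketch, assuming a single-process Ruby service, of the kind of counter that feeds such monitoring; a real deployment would forward these numbers to a metrics tool or dashboard:

```ruby
# Thread-safe transaction counter reporting per-interval rates.
class RateMonitor
  def initialize
    @count = 0
    @mutex = Mutex.new
  end

  # Call once per completed transaction.
  def record!
    @mutex.synchronize { @count += 1 }
  end

  # Read and reset, e.g. once per minute from a timer thread.
  def flush
    @mutex.synchronize do
      c = @count
      @count = 0
      c
    end
  end
end

monitor = RateMonitor.new
120.times { monitor.record! }
puts "transactions this interval: #{monitor.flush}"
```

Comparing the flushed rates against the capacity you designed for is a direct, continuous test of your load hypotheses.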
Keep in mind that a hypothesis that was once valid may later be disproven as technology and markets change. Markets move, so you need to continually ask whether the business model is still valid.
Additionally, projects need ways to record contrary data, and "good enoughs" need to be revisited regularly. Annoyances, pain points, and ugly or clumsy things become opportunities in HDD. This is a hard point to get across and a harder one to act on. I've missed many opportunities to do something noteworthy because I accepted a "clumsiness" instead of revisiting it and working to improve it.
Pivots and mid-course corrections are ways to handle disproven hypotheses. Some hypotheses that once seemed too expensive to test may prove too expensive not to test, either because the tests have become cheaper or because the cost of being wrong turns out to be higher than estimated.
In the immortal words of John Johnson, "First, solve the problem. Then, write the code." But also remember to solve the correct problem, then solve the problem correctly. And with HDD, you can always optimize the solution later, if needed.
 "Manifesto for Agile Software Development" http://agilemanifesto.org/
 "Agile Alliance" http://www.agilealliance.org/
 Lean Software Development: An Agile Toolkit, Mary and Tom Poppendieck, Addison-Wesley Professional (May 18, 2003)
 "Kanban Development Oversimplified" http://www.agileproductdesign.com/blog/2009/kanban_over_simplified.html
 "Kanban" http://en.wikipedia.org/wiki/Kanban
 "The Cathedral and the Bazaar" http://catb.org/~esr/writings/homesteading/
 The Cathedral & the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary, Eric Raymond, O'Reilly Media (January 15, 2001)
 "JUnit.org Resources for Test Driven Development" http://www.junit.org/
 Test Driven Development: By Example, Kent Beck, Addison-Wesley Professional (November 18, 2002)
 "Introduction to Test Driven Design (TDD)" by Scott Ambler http://www.agiledata.org/essays/tdd.html
 "Behaviour-Driven Development" http://behaviour-driven.org/
 "Rspec.info" http://rspec.info/
 "The Four Steps to the Epiphany" by Steve Blank Cafepress.com
 "The Non-Dummies Guide to Customer Discovery" http://steveblank.com/2010/08/26/the-non-dummies-guide-to-customer-discovery/
 The Entrepreneur's Guide to Customer Development: A Cheat Sheet to the Four Steps to the Epiphany, by Brant Cooper and Patrick Vlaskovits, Cooper-Vlaskovits (July 29, 2010)
 "Lean Startup" http://en.wikipedia.org/wiki/Lean_Startup
 "Running Lean" by Ash Maurya (in preparation), first two chapters available at http://www.runningleanhq.com/