Alexander Pope once said about opinions: "[they are] as our watches, none go just alike, but each believes his own." This view is equally true of build systems. Everybody likes their own. Or, perhaps more accurately, dislikes it less than the alternatives. What is curious is that, fundamentally, build systems all have the same mission: transforming code and possibly other artifacts into a deliverable executable package. In theory, what should vary between build tools is the degree to which automation facilitates the process. Some tools, for example, automatically detect dependencies, check the modification date of files, and rebuild only the binaries that have changed. Other systems accelerate builds by executing them in parallel, a task that, in some languages, is supremely difficult.
While those differences do distinguish high-end from less feature-laden systems, they tend not to be the key influence on the choice of build systems. The overwhelming factor in choice is the principal language in which the software was implemented. If you use C or C++, make or one of its derivatives is your tool. In Java, it's Ant or Maven. In Ruby, it's Rake. And so on. Few developers, if any, use make to build Ruby apps, for example. We all tend to stick to the natural choices in our own ecosystem.
Some programmers I respect, who recognize that the fundamental nature of build systems is to simply run a series of utilities to create a final package, use hand-coded shell scripts. They argue this approach provides some advantages: a uniform syntax for all projects regardless of language and complete portability across platforms. This is all true, but the approach sacrifices automation. Every step and every dependency must be hand-coded. And desirable tasks like checking mod dates or dependencies become complex activities.
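To see what gets sacrificed, here is a sketch of a hand-rolled build step with a manual up-to-date check. The project layout is hypothetical, and the "compile" step is just a file copy so the sketch stays self-contained; a real script repeats this pattern, by hand, for every artifact and every dependency.

```shell
#!/bin/sh
# Hand-coded build step (hypothetical layout: one source, one output).
set -e
mkdir -p build
: > src.txt  # stand-in for a source file

compile() {
    # The "compile" step -- just a copy here, to keep the sketch runnable.
    cp src.txt build/out.txt
}

# "Rebuild only if the source changed" must be coded explicitly:
if [ ! -e build/out.txt ] || [ src.txt -nt build/out.txt ]; then
    compile
    echo "rebuilt"
else
    echo "up to date"
fi
```

The portability and uniform syntax are real, but note that even this trivial dependency check is logic a dedicated build tool would provide for free.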
The majority of dedicated build systems, until recently, have been target-and-dependency models. The build script identifies the dependencies and explains how to convert input to output. Ant and make (and their descendants) are archetypes of this approach. This design is cumbersome: The tools frequently lack the ability to test results and make intelligent decisions about what to do (as they could do in a shell script). Products like Gant (an elegant Groovy-based shell with scripting capabilities built around Ant) and Buildr (a similar concept with Ruby as the scripting language) can certainly help.
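The target-and-dependency model is easiest to see in a minimal Makefile (file names here are illustrative): each rule names a target, the inputs it depends on, and the commands that convert one to the other.

```makefile
# Each rule: target, its dependencies, and the conversion commands.
# File names are illustrative, not from any real project.
app: main.o util.o
	cc -o app main.o util.o

main.o: main.c util.h
	cc -c main.c

util.o: util.c util.h
	cc -c util.c

clean:
	rm -f app *.o
```

What the model lacks is a natural place for conditional logic, which is exactly the gap Gant and Buildr try to fill by wrapping the model in a scripting language.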
But the direction of modern build systems is toward the use of convention over configuration. Maven was the first widely used tool to exploit this approach. Code goes into known locations in a project; tests go in another standard directory; and simple instructions (alas, in XML) tell Maven what to do, but they don't identify most of the artifacts that are to be consumed. Conventions tell Maven where they are located. And Maven handles dependencies by downloading libraries automatically from Maven Central or other repositories.
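Convention is what keeps the build file small. A minimal POM (the coordinates below are placeholders) says almost nothing about where code lives, because Maven already knows:

```xml
<!-- Minimal pom.xml; group and artifact IDs are placeholders. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>demo</artifactId>
  <version>1.0-SNAPSHOT</version>
  <!-- No source paths: convention assumes src/main/java, src/test/java.
       Dependencies are fetched from Maven Central by coordinates alone. -->
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.13.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
```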
There's a lot to like in the Maven approach and there's no doubt that it continues to gain adherents, but it can be a frustrating tool when you want to do something that's not in the model Maven is expecting. An example of this comes in the form of Android projects, which have their own directory layout. A Maven plugin maps Android to Maven, but it's a contortion to satisfy Maven's convention, rather than allowing the fundamental convention to be modified. Also, because magic is being done by convention, fixing errors can be frustrating as there is little to guide you when things go wrong.
Ideally, there should be a tool that uses both convention over configuration and enables easy scripting, so that all needs and perhaps even all languages are accounted for. I'll come back to this in a moment, but I first want to look briefly at what the future will require.
We have come to the point in software development where most sites recognize the benefits of continuous integration (CI) and continuous delivery. The core principle is the need to build and test the deliverable with every code commit. In continuous delivery, the deliverable is then installed on a test server and run, and further tests are run against it to make sure that the organization always has a valid executable of the current project even if it's not feature-complete. Today, this activity is driven by CI servers. But increasingly, tasks assigned to those servers are migrating to the build scripts. This migration is most evident in the now standard test step in many build tools. Specifically, the successful running of all unit tests is required for the build to succeed. I expect that more of the deployment and downstream testing will move from the CI server to the build script, and eventually the CI server will disappear as a separate entity, save to provide some kind of dashboard regarding the results of the build.
To accelerate in this direction, build tools need to move far past the early model of make, scons, autotools, Ant, and the like. Convention over configuration and scripting will both need to be robust. The tool that, in my view, most fills the bill is Gradle. It uses a DSL to script activity that, in its principal direction, is driven by convention. It requires the JVM, as Gradle was developed in Java and Groovy, but it is more language-independent than most of the tools I've discussed here. (To wit, it is often used in Node.js projects.) In addition, Gradle's default lifecycle includes testing and deployment stages, so it's designed for the full DevOps/continuous delivery model. The tool itself is open source, and it currently has a commercial entity behind it, reflecting the traction it is gaining in development organizations.
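A minimal build.gradle for a Java project shows the combination: convention supplies the source layout and the build/test lifecycle, while the DSL remains a full scripting language. (The dependency coordinates and the extra task are illustrative.)

```groovy
// Convention supplies the layout and lifecycle; the DSL stays scriptable.
apply plugin: 'java'

repositories {
    mavenCentral()
}

dependencies {
    testCompile 'junit:junit:4.11'   // coordinates are illustrative
}

// Because the DSL is Groovy, arbitrary logic drops in naturally:
task buildInfo << {
    println "Built on ${new Date()}"
}
```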
If you're grappling with the issues I discussed here, I suggest looking at Gradle. Naturally, most organizations are not going to rewrite existing build scripts to use the product; you don't mess around with what works. But as the pressure mounts to have build systems do more and do it better, I expect that organizations will increasingly find Gradle to be a satisfactory solution for new projects.