We also tested running the tools from the command line and, in our tests, found that
nmake worked fine with makefiles from previous versions.
These features are welcome, of course, but we still found it distressingly slow for the code mechanisms to unflag errors once they'd been corrected. The instant resolution of such flags in other IDEs (such as IntelliJ IDEA for Java) makes us wonder why this can't be done in Visual Studio. It's certainly not for shortage of CPU power: we tested this release on a dual-processor workstation with eight cores ready to parse code on the spot.
Code analysis and IntelliSense error detection are different beasts. The former is triggered manually and generates a detailed list of suggested corrections and improvements. It is generally run on code that's already considered clean by IntelliSense and the compiler. Various rulesets are available and these can be further customized, although the process takes some doing. Specific rules can be turned off within the IDE, which is a helpful feature when tackling a long list of false positives. However, we were disappointed that not all rules had explanations beyond the assertion made in the error message.
Moreover, some error messages that resulted from builds were opaque and lacked any explanation even on Microsoft's own website. For example, on the port of one project, we received the enigmatic message: "The operation cannot be completed because BeginBuild has not yet been called." No explanation for this message appears anywhere on MSDN, nor even on the Web at large. Such oversights are not excusable in an established product.
Finally, Visual Studio has a code metrics feature that examines code in a software-engineering sense. It measures complexity and the size of files (in LOCs), and takes a flyer at estimating the maintainability of the code base. This is to be welcomed, as metrics of this kind are indeed helpful to developers attuned to them. Unfortunately, Microsoft botched the handling of the complexity measure, which is the old stand-by cyclomatic complexity (also known as McCabe). Microsoft adds up the complexity numbers for all the routines in a file and gives that sum as the file's complexity number. Complexity, though, is inherently a function-level metric: you want to know how complex the code in a given function is (by measuring the number of independent paths through it). Typically, you want complexity to be below 10 and always below 20. A tool like the one Visual Studio offers would be useful if it showed the maximum or average complexity of the methods in a file, so you'd know that something in there needs fixing. But by summing the complexity measures for all functions, you get a useless figure that depends almost entirely on the number of methods in a file (Figure 5).
Figure 5: Showing the invalid cyclomatic complexity figures for a code base.
While this review is mostly centered on the IDE, we must note that the support for various languages has improved. The C++ support now includes the full C++11 standard library, as promised earlier by Herb Sutter, along with C++11 language features such as range-based for loops, stateless lambdas, scoped enums, and SCARY iterators.
This release also includes much better support for parallelization. On the code generation side, the compiler now performs auto-vectorization, detecting loops that can be executed with the processor's SIMD registers and using them for parallel arithmetic when possible. Also bundled is the C++ AMP compiler, which enables use of the GPU for general-purpose computing. Not only that, but Microsoft even enables debugging threads running on the GPU (Figure 6), which is a remarkable and commendable feat.
Figure 6: Debugging threads on the GPU. (Courtesy Microsoft)
As we have remarked several times in the past, the future will increasingly be moving towards concurrency and parallel computation. It's clear that Microsoft has broadly embraced this new world and seeks to empower developers with tools to do this at the CPU, GPU, and application levels. Kudos here for a job well done. More so if you include Microsoft's recently released Parallel Patterns Library.
A Final Delight
While this review has focused exclusively on Visual Studio 2012 as an IDE, rather than including a review of new features in Team Foundation, one delightful feature rides between the two products and will surely appeal to many. Termed "My Work," it enables developers to pause their work and save everything they were doing: which files were open, cursor locations, breakpoints, and bookmarks. This pile of breadcrumbs is stored on TFS and can be recalled when work resumes, restoring the IDE to its state when the work stopped (see Figure 7).
Figure 7: Save breadcrumbs in My Work.
Visual Studio 2012 is a whole lot of software. If you can abide the colorless UI and its conflicting design schemes, you'll find a solid package with all the tools you need for developing Windows 8 applications. In addition, you'll get significant updates to the toolset for current platforms, particularly for developers looking to do more concurrent programming. If you're staying on Windows 7 or Windows Server 2008 for the foreseeable future, there is no need to upgrade immediately, especially if you're satisfied with your current work environment. But we expect that as Microsoft continues to push out new technologies, its products will depend increasingly on Visual Studio 2012. As a result, we suggest upgrading whenever a convenient migration point can be reached. Fortunately, the migration, at least within our limited testing, does not appear difficult.
Our companion article discusses the effects of Visual Studio 2012 and .NET 4.5 on ASP.NET development.