The recent announcement that VMware would soon match Microsoft Azure and provide a virtual machine (VM) version of its cloud hosting environment touches on a theme that has been near to my heart for several years: namely, that developers grossly underutilize virtualization on their development platforms.
The recent wave of interest in virtualization began nearly 10 years ago with VMware's release of its Workstation product. The tool enabled us all to test how code might run on a different OS. Because it was sold that way, I believe, cross-OS testing imprinted itself as the primary use case for VMs on the desktop and laptop. The sole exception might be power users on the Mac, who use Parallels to run Windows-only software.
There are, however, other important use cases. The first is testing software you're developing. A best practice is to not test your software on the same machine you're developing it on. This is even more important if that machine also hosts your mail and private data. If the software corrupts your system, you're dead in the water. As technical experts, we love to believe we can anticipate how software will behave and that this can't happen. This carefree approach works fine until some small upgrade requires a new DLL or an older version of a library; then, all of a sudden, other software you depend on doesn't work quite the same. Run your executables in a VM and you can tweak the environment all you want without ever worrying about disturbing your system.
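The tweak-and-revert cycle described above is mostly a matter of snapshots. As a minimal sketch, here is what that workflow looks like with VirtualBox's VBoxManage tool; the VM name "test-vm" is a hypothetical example, and VMware Workstation users can do the equivalent with the vmrun utility.

```shell
# Take a snapshot of the clean guest before installing anything.
VBoxManage snapshot "test-vm" take "clean-baseline"

# Boot the VM, then install your build and run your tests inside it.
VBoxManage startvm "test-vm"

# When you're done -- or the install has wrecked the guest -- power off
# and roll the VM back to its pristine state in seconds.
VBoxManage controlvm "test-vm" poweroff
VBoxManage snapshot "test-vm" restore "clean-baseline"
```

Because the snapshot captures the entire guest disk and settings, no amount of DLL churn inside the VM can leak out onto the host you develop on.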
A second good reason to do this is so you can test your code on a configuration profile that matches the actual delivery environment. For example, suppose you're writing a CRUD app for your IT department. If your company conforms to the best practice of delivering a standard configuration to users, you can create a VM with this configuration and then run your test in the VM. This step assures that you're duplicating the user's runtime context. It avoids silly situations such as releasing an app that unaccountably requires Cygwin when users go to install it because, ahem, you forgot that that package was already installed on your own system. ITIL-oriented shops will recognize a best practice in this enforcement of a standard deployment configuration. You should, too.
Evaluating new software products and releases is a regular activity for developers: new tools, new releases of an IDE, a new device driver. Each installation twiddles with the registry (on Windows) or configuration files (on Linux/UNIX-based systems). Installation and deletion leave behind files, settings, and other artifacts that clutter all systems, regardless of OS. Why accumulate junk bits that slowly but inexorably destabilize your system? Test all new products in a VM. Not only does this spare your system the mess, but you can save the VM on a NAS and re-examine the product later without having to reinstall it.
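Parking an evaluation VM on a NAS is a one-command export in most hypervisors. A sketch with VirtualBox's VBoxManage, assuming a hypothetical VM named "eval-tools" and a NAS share mounted at /mnt/nas:

```shell
# Shut the guest down cleanly before exporting (sends an ACPI power signal).
VBoxManage controlvm "eval-tools" acpipowerbutton

# Export the whole machine as a portable OVA appliance onto the NAS.
VBoxManage export "eval-tools" -o /mnt/nas/vm-archive/eval-tools.ova

# Months later, re-import it and resume the evaluation where you left off.
VBoxManage import /mnt/nas/vm-archive/eval-tools.ova
```

The OVA format bundles the disk image and machine settings into a single file, so the archived evaluation environment travels intact between machines and even between hypervisors that support the format.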
A final use case for VMs is rarely discussed, but is compelling when you consider it seriously: safe Web surfing. Like most readers, I surf the Web essentially unprotected save for the antivirus and firewall software on my laptop. This generally works because I most often go to sites that, in my imaginary world, are known safe. However, I periodically descend into Web searches far afield from my mental map of the safe Web neighborhoods. Try looking up anything that could be remotely linked to sex, or search for photos that might be disturbing to sensitive viewers, and you enter a warren of dark places in the underbelly of the Net. I have no desire to take my chances there, so I do such searches from within a VM. If some knucklehead has planted a drive-by virus on a Web page, I can back out and throw away the VM without fear that my personal data will be stolen or my system compromised by a rootkit.
In sum, there are many reasons to use VMs, especially for developers. Now that today's laptops typically ship with 4GB of RAM, quad-core processors, and huge disks, there is no good reason not to use VMs regularly in daily work.
— Andrew Binstock, Dr. Dobb's Executive Editor