Software Performance Versus Developer Productivity
Indeed, the first mainframes were programmed directly in machine code and, a bit later, in assembler. Severe memory limitations, the simplicity of early instruction sets (the PDP-11 is a classic example) and the relative simplicity of the programming tasks at hand resulted in highly efficient, if not bare-bones, code that, unfortunately, was difficult to write. When, in the late 1950s, computers became fast enough to relieve some of the coding burden from the shoulders of programmers, high-level languages began to appear: first Fortran and Algol, and later C and Ada. While sacrificing code efficiency big time, these high-level languages allowed us to write code faster and thus extract more productivity gains from computers.
As time passed we kept sacrificing software performance in favor of developer productivity gains, first by adopting object-oriented languages and more recently by settling for garbage-collected memory, runtime-interpreted languages and 'managed' execution. It is these "developer productivity" gains that kept the pressure on hardware developers to come up with ever faster processors. So one may say that part of the reason we ended up with gigahertz-fast CPUs was "dumb" (lazy, uneducated, expensive -- pick your favorite epithet) developers. Even now a major OS release (such as the upcoming Windows Vista) seems to be the main reason for computer upgrades, because new software runs slower doing the same tasks as the older software it replaces. Of course, the new software usually does much more than the old. So an objective reason for faster computers is higher expectations and the expanded feature set of new software. After all, there are some mission-critical applications that really demand performance. Database applications on the server side and games on the desktop side are good examples of apps that objectively drive hardware development toward faster performance.
Still, if you look at your desktop OS now, it is unbelievably bloated. For instance, when running Windows XP under normal circumstances you will easily count 50 processes, 500 threads and only about 5-10 percent CPU utilization. This is what I get as I type this article in Word 2003 running on an Athlon XP 3200 under Windows XP. Thus a CPU ten times less powerful would have satisfied my browsing and typing requirements equally well... Yet computers in general, and CPUs in particular, keep getting faster and faster, driven both by developer productivity needs and by the requirements of mission-critical applications.
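If you want to reproduce this kind of process/thread census yourself, here is a minimal sketch. It reads the Linux /proc filesystem rather than the Windows Task Manager the figures above came from, so the exact counts will differ, but the idea -- hundreds of threads sitting mostly idle -- is the same.

```python
# Count running processes and their threads by scanning /proc (Linux only).
# This is an illustrative equivalent of the Task Manager tally in the text,
# not the tool the author used.
import os

def count_processes_and_threads():
    procs = 0
    threads = 0
    for entry in os.listdir('/proc'):
        if not entry.isdigit():        # numeric directories are PIDs
            continue
        try:
            with open(f'/proc/{entry}/status') as f:
                for line in f:
                    if line.startswith('Threads:'):
                        procs += 1
                        threads += int(line.split()[1])
                        break
        except OSError:
            continue                   # process exited while we were scanning

    return procs, threads

if __name__ == '__main__':
    p, t = count_processes_and_threads()
    print(f"{p} processes, {t} threads")
```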
Incredibly, a new factor has kicked in that threatens to curb raw clock speed increases -- runaway power consumption. It is not unusual for a modern CPU to dissipate in excess of 100 watts, which in the case of data centers translates into tens of millions of dollars in direct power and cooling costs. So on one hand we have a habit (but rarely a need) for higher performance, and on the other hand we have a looming fossil fuel crisis, global warming and rising energy prices. Shall we finally stop racing the clock speed?
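To see how 100 watts per CPU adds up to "tens of millions," here is a back-of-the-envelope calculation. The fleet size, electricity price and cooling overhead below are illustrative assumptions of mine, not figures from this article.

```python
# Back-of-the-envelope annual power + cooling cost for a large server fleet.
# All input values are illustrative assumptions, not measured data.

def annual_power_cost(num_cpus, watts_per_cpu, price_per_kwh, cooling_overhead):
    """Yearly electricity bill, with cooling modeled as a simple multiplier."""
    hours_per_year = 24 * 365
    kwh = num_cpus * watts_per_cpu * hours_per_year / 1000.0
    return kwh * price_per_kwh * cooling_overhead

# Assume 200,000 CPUs at 100 W each, $0.10/kWh, and 2x overhead for cooling:
cost = annual_power_cost(200_000, 100, 0.10, 2.0)
print(f"${cost:,.0f} per year")  # on the order of $35 million
```

Even with modest assumptions, a large fleet of 100-watt chips lands squarely in the tens-of-millions range each year, which is why power draw started to matter as much as raw speed.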
Apparently, the trend toward higher clock speeds has already been reversed: AMD's cooler-running yet efficient Athlon processor managed to win sizeable market share from Intel's hotter, higher-clocked Pentium 4. So how are we going to keep up with performance demands without liberal increases in CPU clock frequency?