Read James Reinders' counter to this article.
Now, I'm not saying that seven years is the basic cycle in software; indeed, if anyone has a scientific study about that topic, I'm eager to read it. The only support I've found for my thesis is a paper entitled "Fundamental Stocks of Knowledge and Productivity Growth" (Adams, The Journal of Political Economy, August 1990): "Knowledge is found to be a major contributor to productivity growth. Furthermore, a lag in effect of roughly 20 years is found between the appearance of research in the academic community and its effect on productivity in the form of knowledge absorbed by an industry. Academic technology and academic science filtered through inter-industry spillovers exhibit lags of roughly 10 and 30 years each. Thus implied search and gestation times far exceed developmental periods in studies of R & D."
So, where are we on the concurrency curve? Have there been significant, paradigm-shifting technology introductions every seven years in the parallel programming world, or has too much time gone by while academia and commerce went separate ways? Do OpenMP and Intel's Threading Building Blocks represent significant technological evolution, in your opinion? Granted, these are 20-20 hindsight questions; we won't know some of the answers until clear winners emerge in the tools space. But am I the only one surprised that Amdahl and Gustafson and their respective laws (really two sides of the same coin, as James observes) are frequent topics of conversation as developers debate the merits of various approaches to concurrent programming, or concurrency itself?
"History shows that the commercial world is very slow to accommodate academic advances," says author, developer and blogger Larry O'Brien. "Look at OOP: You had Simula 67, but it was not mainstreamed until C++ 20 years later. Programming theory on concurrency: dataflow, functional, message-passing; all this stuff goes back at least to the '60s and '70s. The '80s saw tons of work on business rules and knowledge representation that is virtually unused today." O'Brien and other members of the software cognoscenti agree that concurrency is the next big wave. Will the masses follow?
This month, James suggests that, in keeping with Gustafson's wisdom, we embrace larger problems and understand the flaw in Amdahl's logic that the serial component of a program distributed across multiple cores will always disproportionately throttle linear speedups in performance. Apparently, however, there are plenty of Amdahl acolytes out there; it may take more than this humble series of articles to sway a generation of developers.
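The disagreement between the two laws comes down to a few lines of arithmetic. The sketch below is illustrative only; the 5 percent serial fraction is an assumed example value, not a figure from either columnist.

```python
# Illustrative sketch of the two laws; the serial fraction used here is an
# assumed example value, not a figure from the article.

def amdahl_speedup(serial_fraction, cores):
    """Amdahl: fixed problem size; the serial part caps speedup at
    1/serial_fraction no matter how many cores are added."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

def gustafson_speedup(serial_fraction, cores):
    """Gustafson: the problem scales with the machine, so the parallel
    part grows with the core count and speedup stays near-linear."""
    return serial_fraction + cores * (1.0 - serial_fraction)

if __name__ == "__main__":
    s = 0.05  # assume 5% of the work is inherently serial
    for n in (4, 32, 1024):
        print(f"{n:5d} cores: Amdahl {amdahl_speedup(s, n):8.1f}x, "
              f"Gustafson {gustafson_speedup(s, n):8.1f}x")
```

With a 5 percent serial fraction, Amdahl's formula can never exceed 20x however many cores you add, while Gustafson's scaled-speedup view reaches roughly 30x on 32 cores because the problem itself grows with the machine; the same serial fraction, viewed from a fixed or a scaled workload, gives the "two sides of the same coin" James describes.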
Several experts, O'Brien included, have described threading maturity models (Alan Zeichick started the trend, followed by David Dossot and Larry O'Brien). O'Brien does a nice job of naming some of the basic techniques developers need to grasp as their concurrency skills and experience grow. Casual threaders (stage 2 of 5), in his hierarchy, exhibit "Conscious use of high-level abstractions (components, libraries) to solve specific problems" such as heavy-duty calculations or IO operations; flexible threaders (stage 4) "attempt to maximize use of cores, use of lock-free algorithms and data structures" and are beginning to switch over to "hardware-based thinking."
Unfortunately, until we begin identifying design patterns where concurrency makes the most sense, simply exhorting developers to solve bigger problems will only move that small, eager camp of edge-bleeders. The mainstreamers and die-hards won't dive in until there are problems they can't solve any other way. As Herb Sutter so eloquently put it, the free lunch is over, but too many developers are still hoping for leftover pizza.
James' 200-Word Response:
Applications shipping this year will run on many types of systems, including quad-core processors. Assuming, conservatively, a doubling of cores every two years, there will be 32-core machines in five years (the quad-core came out late last year). Does all software being shipped today fully take advantage of the quad-core? Of course not; some does and some does not. For many machines, only the collective of software running on the quad-core justifies it. Assuming that in 2011 some users will still use software released in 2007, how will these programs run on a 32-core box? This isn't far-fetched; it is the same as using Office 2003 next year, as many people will be. The question is not "Are you ready for the future?" It's "Are you ready for today, including the 32-core machines our software will run on in only five years?"