Parallel Evolution, Not Revolution

This guest editorial is a rebuttal to my editorial last month, in which I argued that fine-grained parallelism will never become a standard part of programming skills. –Ed.

When Herb Sutter told us in 2005 that our free lunch was gone, he helped spur a rash of predictions that parallelism would take over and serial programming heretics would soon be burned at the stake. Backed by the evidence of multicore processors becoming ubiquitous, the air of truth in urgent calls to action for parallelism led to me being quoted saying "Parallel or Perish." (It made for a very popular t-shirt.)

Seven years later, it is not unreasonable to ask, as Dr. Dobb's editor Andrew Binstock did last month, whether parallel code will ever be embraced. Since it seems little has changed, we might even suggest that the earlier pronouncements were all a case of irrational exuberance. That suggestion then degenerates into the conclusion that parallelism on a universal scale is a failure and was never meant to be.

Hogwash. Such seven-year-itch predictions have all the accuracy of the original sky-is-falling calls to action for parallelism. It's an evolution, not a revolution. It's silently creeping over everything, and resistance is indeed futile. Relax! Let's enjoy the ride. The benefits of parallelism are everywhere, and the ability to harness it has become commonplace.

Multicore Was the Frontal Attack; the Flank Attack Was the Cloud. Battle Won.

Parallel programming means using the multiple capabilities of a computing system in harmony to get a result faster or better than we could with a single computing resource. You can also keep a multicore or many-core system busy with concurrency. The hardware does not care whether the threads know each other or not; concurrency versus parallelism matters not.

Multicore processors are a localized parallel computing system. The vision of connecting the whole world via the Internet gave rise to "cloud" computing. What about "grid" computing? It lost the contest for which "hyped" word will dominate, and there isn't enough difference between "grid" and "cloud" for the world to need two names. They are both forms of parallel computing. Every cloud programmer contributes to providing parallel work for hungry cores, whether through their own parallel program or concurrency.

Hold On There, Please Define "Parallel Programmer"

Parallel programming orchestrates the actions of multiple processing devices to solve a problem collectively. Compared to having one "do-it-all" processing device to solve a whole problem, several challenges emerge: sharing, decomposition, and coordination — more specifically, data sharing, problem (algorithm/program) decomposition/refactoring, and coordination (synchronization) around the sharing of data and the program.
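To make those three challenges concrete, here is a minimal sketch (my illustration, not from the article) of summing a list with Python threads: the data is decomposed into chunks, the running total is shared, and a lock coordinates updates to it. Python's global interpreter lock means this illustrates the structure of the problem rather than a real speedup.

```python
# Illustrative only: decomposition, sharing, and coordination in one place.
from threading import Thread, Lock

def parallel_sum(data, workers=4):
    total = 0                      # shared data
    lock = Lock()                  # coordination around the sharing
    chunk = (len(data) + workers - 1) // workers   # decomposition

    def worker(part):
        nonlocal total
        s = sum(part)              # independent computation on one chunk
        with lock:                 # synchronize before touching shared state
            total += s

    threads = [Thread(target=worker, args=(data[i:i + chunk],))
               for i in range(0, len(data), chunk)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total

print(parallel_sum(list(range(100))))   # 4950
```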

Parallel programming means dealing with communication as well as computation. Recalling the "good old days" of sequential programming, our memories recall worrying only about computation. It seems that parallel programming requires that communication be part of the programming job now, and that additional work must mean it is harder.

Cloud programmers worry about communication a lot, but it is often so obvious we do not talk about it. Imagine the difference in the following pseudo-code queries to see what Apple II computers are on eBay this week:

Version 1 (c. 1996)

Display_EBay_ForSale( "Apple II" ) =>
    all_items = download( EBay );
    my_items = match( "Apple II", all_items );
    display my_items;

Version 2 (c. 2004)

Display_EBay_ForSale( "Apple II" ) => 
    my_items = match_on_server("Apple II", EBay ); 
    display my_items;

Version 1 downloads all listings and then scans them locally; Version 2 asks the server to do the scan and return only the matches. Version 2 therefore both reduces communication and offloads the work to the more powerful (and almost certainly multicore-processor-based) systems in the "cloud." Parallel computing appears here twice: once in using the cloud, and again within the cloud, in a highly parallel eBay system full of clever parallel programming ready to do the parallel computing for our two-line program.
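The two versions can be sketched in runnable form. Here, `fetch_all_listings`, `server_search`, and the `LISTINGS` sample data are hypothetical stand-ins for eBay's real interfaces, invented purely to show the communication difference:

```python
# Hypothetical stand-ins for eBay's interfaces -- illustrative only.
LISTINGS = ["Apple II plus", "Commodore 64", "Apple II clone", "TRS-80"]

def fetch_all_listings():
    """Version 1's costly step: ship every listing over the wire."""
    return list(LISTINGS)

def server_search(query):
    """Version 2: the (highly parallel) server side does the matching."""
    return [item for item in LISTINGS if query in item]

# Version 1 (c. 1996): download everything, match locally.
v1 = [item for item in fetch_all_listings() if "Apple II" in item]

# Version 2 (c. 2004): ask the server, move only the answer.
v2 = server_search("Apple II")

assert v1 == v2   # same result; Version 2 moves far less data
print(v2)         # ['Apple II plus', 'Apple II clone']
```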

Is this client-side parallelism or server-side? The server operates in parallel, and the client benefits from that. But it was the client programmer who chose Version 2 over Version 1. Therefore, the client programmer is a parallel programmer. Such programmers are better when they understand parallelism well enough to make good choices and produce effective programs.

Is That Really Parallel Programming?

It is parallel programming — because parallel programming is just about using multiple compute devices intelligently. "Intelligently" means putting the right work at the right place and not doing anything overwhelmingly stupid with the data (such as copying the entire Library of Congress to your desktop in order to locate and print out a single sentence).

In this sense, we all need to be parallel programmers. Here is why writing such code makes you a parallel programmer, and makes that code parallel: parallel programming means thinking differently. "Think Parallel." The most important topics are program decomposition and data location/sharing decisions (often lumped together simply as "communication"). As soon as you decide that "Display_EBay_ForSale" needs to do the query in the cloud and move only the answer to the local machine, you have made critical parallel programming decisions. Call yourself a parallel programmer, with all the rights and privileges that accompany this honor.

Multicore Parallelism Accessed in Many Ways

All this talk of clouds was not meant to discount multicore parallelism. It does suggest that we need to look for more than OpenMP directives or Threading Building Blocks (TBB) calls to detect parallelism in our code. We need to look for the decisions programmers make about code and data placement in a parallel world. This includes scale-up versus scale-out decisions, too.
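As one sketch of the "many ways" beyond OpenMP or TBB, a process pool makes the same kind of placement decision: hand independent chunks of work to the local cores. This is a Python analogue I am adding for illustration, not an OpenMP or TBB example:

```python
# Coarse-grained multicore parallelism via a process pool (illustrative).
from concurrent.futures import ProcessPoolExecutor

def work(n):
    # An independent, CPU-bound task: sum of squares below n.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # Scale up: place each task on a local core.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(work, [10, 100, 1000]))
    print(results)   # [285, 328350, 332833500]
```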

We all become parallel programmers because, if we do not, we have a blind spot to one of the most important aspects of modern computing. We should already all be data structure programmers, and numerical programmers, and database programmers, GUI programmers, and Internet programmers. While we are likely to be experts in a narrower field, very few programmers should be able to say, "I know nothing at all about GUIs or the Internet." Would you hire a programmer today who knew nothing about the Internet? A decade ago, that was not a universal expectation, and a decade before that, the Internet was unimportant to almost all programming jobs. Likewise, text-only displays are hardly the expectation of most programmers today. Parallel programming will have this status within the decade, and already does for many developers today.

I have a love for deep, detailed parallel programming, especially the very precise wring-all-the-performance world of HPC (high-performance computing, also known as supercomputing). That makes me a pretty serious parallel programmer. Most programmers will never devote as much of their life to parallel programming. I know programmers who enjoy graphics at least as much as I enjoy parallel programming. Others feel the same about database programming, and others about writing compilers. Each is a specialty. All of us need to know enough to get what we need of these specialties.

College Students: Get Your Money's Worth

These days, if you study programming and never write a parallel program during your time in college, you need to demand your money back. Seriously.

Core concepts of parallelism need to be part of every programming curriculum. Not to create deep HPC programmers out of all of us, but definitely to avoid people writing Version 1 of Display_EBay_ForSale and thinking they have done their job well.

All of us need to know this. And every program needs to be written by someone who understands these aspects, even if the decision is that a sequential program is just fine. Every once in a while, a text-based program will do. It's just much less common than twenty years ago. Similarly, parallel programs will gradually take over, but perhaps the same way GUIs did…sometimes as a wrapper and other times through a rethinking and rewrite.

Scale up or scale out, it is better when we know enough about the key concepts of parallel programming to make a decision. Relax and enjoy the ride.

Yes, I agree with Andrew's editorial: fine-grained parallel programming is a specialty, but most programmers should have enough parallel/concurrency knowledge to program coarse-grained parallelism well. No one escapes entirely, but most will escape being fine-grained parallelism code jocks.

James Reinders is Intel's leading spokesperson on tools for parallelism, and author of the O'Reilly Nutshell book on Intel Threading Building Blocks.
