
Structured Patterns: An Overview




Parallel Pattern 1: Superscalar Sequences and Task Graphs

Parallel Pattern 2: Selection

Parallel Pattern 3: Map

Parallel Pattern 4: Gather

Parallel Pattern 5: Stencil

Parallel Pattern 6: Partition

Parallel Pattern 7: Reduce

Parallel Pattern 8: Scan

Parallel Pattern 9: Pack


A high-level standard for parallel programming is needed that addresses the needs of large-scale software development. Existing standards and standard proposals do not provide the required levels of reliability and maintainability, so an alternative approach is required. A top-down approach based on the composition of structured, deterministic patterns of parallelism can satisfy the needs of large-scale software development while still providing high levels of scalability and performance.

Recently there has been a move towards standards for many-core computing. However, almost all systems and standards in parallel computing to date have been very mechanism-oriented. A mechanism-oriented standard is designed bottom-up, putting a thin layer of abstraction over features that are motivated primarily by existing hardware architectures.

In contrast, our philosophy has been application-oriented. An application-oriented system considers the needs of application algorithms first, along with the natural ways for these to be expressed. Large-scale applications have requirements that mechanism-oriented systems and standards proposals are not addressing: the need for safety, structure, and determinism.

We have demonstrated that it is possible to satisfy these needs in a high-performance system. Our approach has been based on the identification of structured patterns of computation in application workloads that can be reliably mapped onto high-performance implementations by an automated system. Many of these patterns also happen to be deterministic, and their composition results in an implementation that is safe and deterministic yet still high-performance.

Programming models and platforms can be based on these patterns, and can provide a level of automated support for efficient implementation and optimization. Programming platform interfaces that can allow structured patterns of parallel computation to be expressed also allow the programming system to better capture the intent of the software developer, especially with regard to memory coherence.

This more application-oriented, structured approach to designing and implementing parallel algorithms via a supporting platform is particularly relevant for many-core processors with a large amount of parallelism. While improving the productivity of experts, specific patterns and fused combinations of patterns can also guide relatively inexperienced users toward developing efficient algorithm implementations with good scalability.

This approach to parallelism can include a unified approach to developing parallel software. Deterministic patterns include both collective "data-parallel" patterns such as map and reduce and structured "task-parallel" patterns such as superscalar task graphs. The structured-pattern-based approach, like data-parallel models, addresses issues of both data access and parallel task distribution in a common framework. Optimization of data access is important both for many-core processors with shared memory systems and for accelerators with their own memories not directly attached to the host processor.

Ultimately, the industry needs to see new or existing standards evolve to address the parallelism challenge from the application point of view. With this approach, both data and task parallel algorithms can be mapped onto patterns that can then be executed efficiently on both multi-core processors and accelerators.

We have identified on the order of a dozen patterns that are useful for deterministic parallel computing. Considering the complexity and importance of the problem, this is not actually a large set, but it is too much for a single article. I will therefore be posting about each of these patterns in turn, along with examples of their usage. I will also be discussing how these simple patterns can form the basis for a solid high-level standard for highly productive and modular parallel computing.

