Our Pattern Language
Software architecture defines the components that make up a software system, the roles played by those components, and how they interact. Good software architecture makes design choices explicit, and the critical issues addressed by a solution clear. A software architecture is hierarchical rather than monolithic. It lets the designer localize problems and define design elements that can be reused in other architectures.
The goal of Our Pattern Language (OPL) is to encompass the complete architecture of an application, from the structural patterns (also known as "architectural styles") that define the overall organization of an application [8, 9], to the basic computational patterns (also known as computational motifs) for each stage of the problem [10, 1], to the low-level details of the parallel algorithm. With such a broad scope, organizing our design patterns into a coherent pattern language was extremely challenging.
Our approach is to use a layered hierarchy of patterns. Each level in the hierarchy addresses a portion of the design problem. While a designer may in some cases work through the layers of our hierarchy in order, it is important to appreciate that many design problems do not lend themselves to a top-down or bottom-up analysis. In many cases, the pathway through our patterns will be to bounce between layers, with the designer working at whichever layer is most productive at a given time (so-called "opportunistic refinement"). In other words, while we use a fixed layered approach to organize our patterns into OPL, we expect designers will work through the pattern language in many different ways. This flexibility is an essential feature of design pattern languages.
In Figure 1, we organize OPL into five major categories of patterns. Categories 1 and 2 sit at the same level of the hierarchy and cooperate to create one layer of the software architecture.
- Structural patterns: Structural patterns describe the overall organization of the application and the way the computational elements that make up the application interact. These patterns are closely related to the architectural styles discussed in [8, 9]. Informally, these patterns correspond to the "boxes and arrows" an architect draws to describe the overall organization of an application. An example of a structural pattern is Pipe-and-Filter, described in the sidebar.
Structural Pattern: Pipe-and-Filter
Solution: Structure an application as a fixed sequence of filters that take input data from preceding filters, carry out computations on that data, and then pass the output to the next filter. The filters are side-effect free; i.e., the result of their action is only to transform input data into output data. Concurrency emerges as multiple blocks of data move through the Pipe-and-Filter system so that multiple filters are active at one time.
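To make this concrete, here is a minimal sketch of a Pipe-and-Filter pipeline in Python (our own illustration, not part of OPL), using standard-library threads as the filters and queues as the pipes. Concurrency emerges because each filter runs in its own thread, so several blocks of data can be in flight at once:

```python
import queue
import threading

SENTINEL = object()  # marks end-of-stream in a pipe

def filter_stage(fn, inbox, outbox):
    # A filter is side-effect free: its only action is to transform
    # input data from its inbound pipe into output on its outbound pipe.
    while True:
        item = inbox.get()
        if item is SENTINEL:
            outbox.put(SENTINEL)
            return
        outbox.put(fn(item))

def run_pipeline(filters, data):
    # Wire a fixed sequence of filters together with queues (the pipes),
    # one thread per filter, so multiple filters are active at one time.
    pipes = [queue.Queue() for _ in range(len(filters) + 1)]
    threads = [
        threading.Thread(target=filter_stage, args=(f, pipes[i], pipes[i + 1]))
        for i, f in enumerate(filters)
    ]
    for t in threads:
        t.start()
    for item in data:      # stream multiple blocks of data through
        pipes[0].put(item)
    pipes[0].put(SENTINEL)
    results = []
    while (item := pipes[-1].get()) is not SENTINEL:
        results.append(item)
    for t in threads:
        t.join()
    return results

print(run_pipeline([lambda x: x + 1, lambda x: x * 2], [1, 2, 3]))  # [4, 6, 8]
```

Because each stage is a single thread reading and writing FIFO queues, output order matches input order even though the stages overlap in time.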
- Computational patterns: These patterns describe the classes of computations that make up the application. They are essentially the 13 motifs made famous in  but described more precisely as patterns rather than simply computational families. These patterns can be viewed as defining the "computations occurring in the boxes" defined by the structural patterns. A good example is the Dense-Linear-Algebra pattern described in an earlier sidebar. Note that some of these patterns (such as Graph-Algorithms or N-Body-Methods) define complicated design problems in their own right and serve as entry points into smaller design pattern languages focused on a specific class of computations. This is yet another example of the hierarchical nature of the software design problem.
In OPL, the top two categories, the structural and computational patterns, are placed side-by-side with connecting arrows. This shows the tight coupling between these patterns and the iterative nature of how a designer works with them. In other words, a designer thinks about his or her problem, chooses a structural pattern, and then considers the computational patterns required to solve the problem. The selection of computational patterns may suggest a different overall structure for the architecture and may force a reconsideration of the appropriate structural patterns. This process, moving between structural and computational patterns, continues until the designer settles on a high-level design for the problem.
Structural and computational patterns are used in both serial and parallel programs. Ideally, the designer working at this level, even for a parallel program, will not need to focus on parallel computing issues. For the remaining layers of the pattern language, parallel programming is a primary concern.
Parallel programming is the art of using concurrency in a problem to make the problem run to completion in less time. We divide the parallel design process into the following three layers:
- Concurrent algorithm strategies: these patterns define high-level strategies to exploit concurrency in a computation for execution on a parallel computer. They address the different ways concurrency is naturally expressed within a problem by providing well-known techniques to exploit that concurrency. A good example of an algorithm strategy pattern is the Data-Parallelism pattern.
Concurrent Algorithm Strategy Pattern: Data-Parallelism
Solution: An algorithm is organized as operations applied concurrently to the elements of a set of data structures. The concurrency is in the data. This pattern can be generalized by defining an index space. The data structures within a problem are aligned to this index space and concurrency is introduced by applying a stream of operations for each point in the index space.
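As an illustrative sketch (ours, not OPL's; it assumes Python's standard thread pool), a SAXPY-style computation expresses Data-Parallelism directly: the index space is the range of element indices, the arrays are aligned to it, and the same stream of operations is applied at each index point:

```python
from concurrent.futures import ThreadPoolExecutor

def saxpy(a, x, y):
    # Index space: range(len(x)). The data structures x and y are
    # aligned to it, and the same stream of operations (multiply, then
    # add) is applied at every index point. The concurrency is in the
    # data: points can be computed independently and in any order.
    def at_index(i):
        return a * x[i] + y[i]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(at_index, range(len(x))))

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))  # [12.0, 24.0, 36.0]
```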
- Implementation strategies: These are the structures that are realized in source code to support (a) how the program itself is organized and (b) common data structures specific to parallel programming. The Loop-Parallel pattern is a well-known example of an implementation strategy pattern.
Implementation Strategy Pattern: Loop-Parallel
Solution: An algorithm is implemented as loops (or nested loops) that execute in parallel. The challenge is to transform the loops so that iterations can safely execute concurrently and in any order. Ideally, this leads to a single source code tree that generates a serial program (by using a serial compiler) or a parallel program (by using compilers that understand the parallel loop constructs).
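A minimal Python sketch of the Loop-Parallel idea (our own; in C or Fortran this is more commonly written with OpenMP parallel-loop directives): the loop body is transformed so iterations are independent, and a single source tree yields either the serial or the parallel program:

```python
from concurrent.futures import ThreadPoolExecutor

PARALLEL = True  # one source tree: flip to False for the serial program

def body(i):
    # The loop body, transformed so each iteration reads only its own
    # index and produces only its own result: iterations can now safely
    # execute concurrently and in any order.
    return i * i

def squares(n):
    if PARALLEL:
        with ThreadPoolExecutor() as pool:
            return list(pool.map(body, range(n)))
    return [body(i) for i in range(n)]  # identical semantics, serial

print(squares(5))  # [0, 1, 4, 9, 16]
```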
- Parallel execution patterns: these are the approaches used to support the execution of a parallel algorithm. This includes (a) strategies that advance a program counter and (b) basic building blocks to support the coordination of concurrent tasks. The single instruction multiple data (SIMD) pattern is a good example of a parallel execution pattern.
Parallel Execution Pattern: SIMD
Solution: An implementation of a strictly data parallel algorithm is mapped onto a platform that executes a single sequence of operations applied uniformly to a collection of data elements. The instructions are executed in lockstep by a set of processing elements, each operating on its own stream of data. SIMD programs use specialized data structures, data-alignment operations, and collective operations to extend this pattern to a wider range of data parallel problems.
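The lockstep execution model can be sketched in a few lines of Python (an emulation for illustration only; real SIMD hardware issues each instruction once to many processing elements): a "program" is a single sequence of operations, and each step applies one operation uniformly to every lane:

```python
def simd_step(op, lanes):
    # One instruction, issued once, executed in lockstep: every
    # processing element ("lane") applies the same operation to its
    # own data element.
    return [op(x) for x in lanes]

def simd_run(program, lanes):
    # A strictly data parallel algorithm is a single sequence of
    # operations; the data differs per lane, the control flow does not.
    for op in program:
        lanes = simd_step(op, lanes)
    return lanes

program = [lambda x: x + 1, lambda x: x * x]
print(simd_run(program, [1, 2, 3, 4]))  # [4, 9, 16, 25]
```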
Patterns in these three lower layers are tightly coupled. For example, software designs using the Recursive-Splitting algorithm strategy often utilize a Fork/Join implementation strategy pattern, which is typically supported at the execution level by the Thread-Pool pattern. These connections between patterns are a key point in the text of the patterns.
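A small sketch of that coupling, in Python with standard-library futures (our own illustration; the function names are hypothetical): Recursive-Splitting provides the algorithm strategy, Fork/Join the implementation strategy, and the thread pool underneath the execution support:

```python
from concurrent.futures import ThreadPoolExecutor

def recsum(pool, data, cutoff=4):
    # Recursive-Splitting: split the problem in half until it is small
    # enough to solve directly.
    if len(data) <= cutoff:
        return sum(data)
    mid = len(data) // 2
    # Fork: the left half becomes a task for the pool's worker threads;
    # the right half is computed by the current thread.
    left = pool.submit(recsum, pool, data[:mid], cutoff)
    right = recsum(pool, data[mid:], cutoff)
    # Join: wait for the forked task, then combine the partial results.
    return left.result() + right

# The thread pool is the execution-level support for Fork/Join.
# (max_workers must exceed the recursion depth here, or forked tasks
# blocked waiting on their children could exhaust the pool.)
with ThreadPoolExecutor(max_workers=8) as pool:
    total = recsum(pool, list(range(32)))
print(total)  # 496
```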
OPL draws from a long history of research on software design. The structural patterns of Category 1 are largely taken from the work of Garlan and Shaw on architectural styles [8, 9]. That these architectural styles could also be viewed as design patterns was quickly recognized by Buschmann . We added two structural patterns that have their roots in parallel computing to Garlan and Shaw's architectural styles: Map-Reduce, influenced by , and Iterative-Refinement, influenced by Valiant's bulk-synchronous-processing pattern . The computational patterns of Category 2 were first presented as "dwarfs" in , and their role as computational patterns was only identified later . The identification of these computational patterns in turn owes a debt to Phil Colella's unpublished work on the "Seven Dwarfs of Parallel Computing." The lower three categories within OPL build on earlier and more traditional patterns for parallel algorithms by Mattson, Sanders, and Massingill . This work was somewhat inspired by Gamma's success in using design patterns for object-oriented programming . Of course, all work on design patterns has its roots in Alexander's ground-breaking work identifying design patterns in civil architecture .