Motion Blur Effects


Dr. Dobb's Journal, July 1997

Tim is a systems architect at the Star Tribune in Minneapolis and author of Visual Special Effects Toolkit in C++ (John Wiley & Sons). He can be contacted at [email protected].


Most of us have taken pictures that ended up looking blurry. Usually, the cause of the blur is the subject moving through the camera's field of view, the camera shutter being set on a long exposure, the lens aperture not letting in enough light, or the film's ISO rating (film speed) being too low.

Photographers know that the rate of motion a film can capture without blur depends on a combination of factors within the camera, such as those just mentioned. The same phenomenon occurs with a video camera: The video circuitry, regardless of the underlying technology, has an effective exposure, and so does the human eye. The exposure characteristics of a sensor (a human eye or a camera, for instance) place a limit on the rate of motion that sensor can capture without blur. Stated simply, when an object's rate of motion exceeds the exposure limit of the sensor, blur results.

In contrast, computer-generated imagery has no exposure limit associated with it. In effect, the computer's "camera" has an infinitely short exposure, while film, video, and the human eye do not. Consequently, rapidly moving objects generated by computer during special-effects sequences can appear "too crisp" if motion blur is not added to the sequence. In this article, I'll present a method for blurring the appearance of moving objects. This method is applied after a sequence of images has been generated. The approach is essentially temporal averaging over the rendered frames: Each pixel in a blurred output image is the average of the corresponding pixels in a number of surrounding frames. The number of surrounding frames is called the "blur depth." Smaller amounts of blurring can be produced by keeping the blur depth small, say, a value of 2 frames; larger amounts can be produced by increasing the blur depth. Figure 1 illustrates how this method works. The sequence of images produced during animation is shown on the left side of Figure 1, labeled "Original Sequence" and numbered 1 through 7.

The blur depth indicates how many input frames are averaged to produce each blurred output frame in the sequence. In Figure 1, a blur depth of 2 is used. The illustration shows a blurring window being passed over the sequence of input frames: the first blurred frame is the average of input frames 1 and 2, the second is the average of frames 2 and 3, and so on. Figure 2 shows an example sequence with a blur depth of 4, in which each blurred output frame is the average of four consecutive input frames -- the second blurred frame, for example, is the average of input frames 2 through 5.
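To make the windowed average concrete, the following sketch computes one blurred output frame from a window of input frames. It assumes each frame is held in memory as a flat array of 8-bit gray values; the memImage class used later in Listing One does the equivalent work for BMP images.

// Minimal sketch of the temporal average: each output pixel is the rounded
// mean of the corresponding pixels in blurDepth consecutive input frames.
// The flat 8-bit gray frame layout is an assumption made for illustration.
void blurWindow(unsigned char **inFrames, unsigned char *outFrame,
                int blurDepth, int width, int height)
{
  for (int i = 0; i < width * height; i++){
    int bucket = 0;
    for (int b = 0; b < blurDepth; b++)
      bucket += inFrames[b][i];                        // accumulate across the window
    outFrame[i] = (unsigned char)((float)bucket / blurDepth + 0.5);  // rounded mean
  }
}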

Using the method just described, you can create blurred sequences in which the amount of motion contributing to the blur in each frame is a whole multiple of the original frame interval. In other words, at 30 frames per second, blur windows of 4 and 6 produce sequences in which the motion contributing to the blur spans 4/30 and 6/30 of a second, respectively. This method can vary the amount of motion contributing to the blur only in whole increments of the original frame interval. Next, I'll present an enhancement that provides finer control over the amount of blurring added to each frame in a sequence.

Controlling the Amount of Blur

The following technique provides greater flexibility in the amount of blur that can be added to computer-graphic imagery models that are composited into a special-effects scene. The additional control comes from oversampling the original sequence. Instead of producing 30 frames per second, the same amount of motion can be spread over, say, 150 frames per second -- five times the original rate. Each frame in the oversampled sequence then represents 1/5 of the motion contained in each frame of the original sequence. By varying the blur-window setting and applying the blurring operation to the oversampled sequence, each resulting blurred frame can contain varying fractions of an original frame's motion.

Suppose, continuing with the example, that blurring is now applied to the oversampled sequence with a blur window of 7 frames. Each frame in the blurred sequence now contains 7/5 the original rate of motion. What is needed next is a way to get back to the original frame rate of 30 frames per second. The original frame rate can be achieved by undersampling the blurred sequence appropriately. If every fifth frame is now taken from the blurred sequence, the result is a sequence in which the rate of motion is again equal to the rate of motion in the original sequence -- 30 frames per second. However, the amount of blur in each frame corresponds to 7/5 the rate of motion in each frame of the original sequence. From this example, it is apparent that the rate of motion blur in each final frame relative to the original is a fraction in which the denominator is controlled by the oversampling rate and the numerator is controlled by the blur depth. Figure 3 illustrates these relationships.
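The bookkeeping behind this oversample/blur/undersample scheme reduces to simple index arithmetic. The following sketch (a hypothetical helper, not part of Listing One) prints which oversampled frames contribute to each final output frame for a given oversampling rate and blur depth; with a rate of 5 and a depth of 7, each final frame carries 7/5 of an original frame's motion, as in the example above.

#include <stdio.h>
// For final frame n (0-based), with oversampling rate R and blur depth D,
// the contributing oversampled frames are n*R through n*R + D - 1, and the
// blur in that frame spans D/R of one original frame interval.
void printBlurPlan(int numFinalFrames, int overSampleRate, int blurDepth)
{
  for (int n = 0; n < numFinalFrames; n++){
    int first = n * overSampleRate;          // undersample: keep every R-th blurred frame
    int last  = first + blurDepth - 1;       // window of D oversampled frames
    printf("final frame %d: average oversampled frames %d..%d (blur = %d/%d frame)\n",
           n, first, last, blurDepth, overSampleRate);
  }
}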

Software Corner: Blurring Sequences

Listing One is the motionBlur function, which implements the blurring procedure in Figure 1. This function takes an image sequence as input, applies the blurring algorithm just described, and produces a blurred output sequence of images. The firstImagePath argument is the pathname of the first image in the sequence to be blurred. The outputDir argument is an absolute pathname of the directory into which the blurred images are to be placed. The numFrames argument indicates the number of frames in the image sequence to blur, and blurDepth specifies the number of images that participate in each blurring operation.


Listing One

#define MAXIMAGES 16     // upper limit on the blur depth (assumed value)

int motionBlur(char *firstImagePath, char *outputDir, int numFrames,
  int blurDepth){
  memImage *images[MAXIMAGES];
  char directory[MAXPATH], fileName[MAXPATH], prefix[MAXPATH],
    inSuffix[MAXPATH];
  char currentPath[MAXPATH], inPath[MAXPATH];
  int frameNum, i, j, status;
  if(blurDepth > MAXIMAGES){
    statusPrint("motionBlur: blurDepth is larger than the limit.");
    return -1;
  }
  // the directory includes the drive letter
  status = getPathPieces(firstImagePath, directory, fileName, prefix, 
                       &frameNum, inSuffix);
  if(status != 0){
    statusPrint("motionBlur: Check the first image pathname");
    return -2;
  }
  int imHeight, imWidth, bpp, frameCounter, row, col;
  status = readBMPHeader(firstImagePath, &imHeight, &imWidth, &bpp);
  if(status != 0){
    sprintf(g_msgText, "motionBlur: Cannot open: %s", firstImagePath);
    statusPrint(g_msgText);
    return -3;
  }
  for (frameCounter = frameNum; frameCounter < frameNum + numFrames; frameCounter++){
    //  Open and close the appropriate images
    if(frameCounter == frameNum){
      for(i = 0; i < blurDepth; i++){ 
        makePath(currentPath, directory, prefix, 
          frameCounter + i, inSuffix);
        images[i] = 
          new memImage(currentPath, 0, 0, RANDOM, 
            'R', RGBCOLOR);
        if(!images[i]->isValid()){
            sprintf(g_msgText, 
             "motionBlur: unable to open image: %s",
              currentPath);
            statusPrint(g_msgText);
            return -4;
        }
      }
    }
    else{
        delete images[0];                      //close the oldest image
        for (j = 0; j < blurDepth - 1; j++)    //shift the window forward
            images[j] = images[j+1];
                                               //open the newest image
        makePath(currentPath, directory, prefix, 
          frameCounter + blurDepth - 1, inSuffix);
        images[blurDepth-1] = new memImage(currentPath, 0,
          0, RANDOM, 'R', RGBCOLOR);
        if(!images[blurDepth-1]->isValid()){
            sprintf(g_msgText, 
             "motionBlur: unable to open image: %s", currentPath);
            statusPrint(g_msgText);
            return -4;
        }
    }
    //  Blur the images
    char outPath[MAXPATH], outSuffix[MAXPATH];
    memImage *outImage;
    int blur;
    sprintf(outSuffix, "b");
    makePath(outPath, outputDir, prefix, frameCounter, outSuffix);
    outImage = new memImage(imHeight, imWidth, bpp);
    BYTE red, green, blue;
    for (row = 1; row < imHeight; row++){
      for (col = 1; col < imWidth; col++){
        int bucket = 0;
        int redBucket = 0;
        int greenBucket = 0;
        int blueBucket = 0;
        //  Accumulate the corresponding pixel from each image in the window
        for (blur = 0; blur < blurDepth; blur++){
          switch (bpp){
            case 8:
              bucket += images[blur]->getMPixel(col, row);
              break;
            case 24:
              images[blur]->getMPixelRGB(col, row, &red, &green, &blue);
              redBucket   += red;
              greenBucket += green;
              blueBucket  += blue;
              break;
            default:
              break;
          }  //end switch
        }    //end blur loop
        //  Average the accumulated values and write the output pixel
        if(bpp == 8){
          float avgBucket = (float)bucket / blurDepth;
          outImage->setMPixel(col, row, (BYTE)(avgBucket + 0.5));
        }
        if(bpp == 24){
          float avgRedBucket   = (float)redBucket / blurDepth;
          float avgGreenBucket = (float)greenBucket / blurDepth;
          float avgBlueBucket  = (float)blueBucket / blurDepth;
          outImage->setMPixelRGB(col, row, 
            (BYTE)(avgRedBucket + 0.5),
            (BYTE)(avgGreenBucket + 0.5),
            (BYTE)(avgBlueBucket + 0.5));
        }
      }      //end column loop
    }        //end row loop
    //  Save the blurred image
    sprintf(g_msgText, "Saving: %s", outPath);
    statusPrint(g_msgText);
    outImage->writeBMP(outPath);
    delete outImage;
  }   //end sequence loop
  //  Close the remaining images
  for(i = 0; i < blurDepth; i++)
    delete images[i];
  return 0;
}
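
A call to motionBlur might look like the following; the pathnames are placeholders, and the function expects the first frame of a numbered BMP sequence plus an existing output directory.

// Blur a 30-frame sequence starting at frame0001.bmp, averaging 4 input
// frames per output frame. The pathnames here are illustrative only.
int status = motionBlur("d:\\anim\\frame0001.bmp", "d:\\blurred", 30, 4);
if(status != 0)
  statusPrint("motionBlur failed");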


DDJ


Copyright © 1997, Dr. Dobb's Journal

