Software Development Goes to the Movies

Solving software problems for moviemaking has paid off for software development in general.


April 11, 2008
URL:http://www.drdobbs.com/global-developer/software-development-goes-to-the-movies/207200079

The relationship between cinema and technology is a complex and fascinating one.

—from the Google review of Technology and Culture, the Film Reader, by Andrew Utterson

As director of software development at Digital Productions, Larry Yaeger produced the first photo-realistic computer graphics in a feature film—1984's The Last Starfighter. This may have been one of the first explicit indications that software development was becoming an integral part of movie production.

This year's Academy Awards even more explicitly recognized the work of software development in movies, and specifically of some code-crunching water wizards at Stanford University and Industrial Light and Magic in solving some tough problems in fluid dynamics, problems posed by the third Pirates of the Caribbean movie, At World's End.

Nick Rasmussen, Ron Fedkiw, and Frank Losasso Petterson were recognized for the development of the ILM fluid simulation system. "This production-proven simulation system," the Academy said, "achieves large-scale water effects within ILM's Zeno framework. It includes integrating particle level sets, parallel computation, and tools that enable the artistic direction of the results." Not explicitly acknowledged in the award was PhysBAM (www.physbam.com), the C++ library for computational fluid dynamics problems, of which Fedkiw is one of the developers.

Water is one of the toughest things to simulate in movies, a fact well documented in—what else—a movie called Perfect Water (www.pbs.org/kcet/wiredscience/video/310-perfect_water.html). Water is computationally expensive, which is why, when you want to simulate a maelstrom, it takes a render farm—and the right equations. But if you get it right, maybe you get an Oscar. And maybe something more.
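The "right equations" are the particle level-set methods the award citation mentions: the water surface is tracked implicitly as the zero crossing of a signed-distance field that is advected with the fluid. Here is a minimal one-dimensional sketch in Python, a toy illustration of the advection step only (the function names are mine, and this is nothing like ILM's Zeno or PhysBAM code):

```python
# Toy 1D level-set advection: the zero crossing of phi marks the
# water surface; upwind differencing keeps the interface stable.
# Illustrative sketch only, not production fluid-simulation code.

def advect_level_set(phi, velocity, dx, dt, steps):
    """Advance phi under a constant velocity with first-order upwinding."""
    phi = list(phi)
    n = len(phi)
    for _ in range(steps):
        new = phi[:]
        for i in range(1, n - 1):
            if velocity > 0:
                dphi = (phi[i] - phi[i - 1]) / dx   # upwind: look left
            else:
                dphi = (phi[i + 1] - phi[i]) / dx   # upwind: look right
            new[i] = phi[i] - velocity * dt * dphi
        phi = new
    return phi

def interface_position(phi, dx):
    """Locate the zero crossing of phi by linear interpolation."""
    for i in range(len(phi) - 1):
        if phi[i] <= 0.0 < phi[i + 1] or phi[i] >= 0.0 > phi[i + 1]:
            t = phi[i] / (phi[i] - phi[i + 1])
            return (i + t) * dx
    return None

# Signed distance: the interface starts at x = 1.0 on a [0, 4] domain.
dx, dt = 0.04, 0.02
phi0 = [x * dx - 1.0 for x in range(101)]
phi = advect_level_set(phi0, velocity=1.0, dx=dx, dt=dt, steps=50)
print(interface_position(phi, dx))  # interface has moved to roughly x = 2.0
```

Production systems do this on enormous three-dimensional grids, seed particles near the surface to correct the interface where the grid smears it, and periodically reinitialize the distance field; that, multiplied by millions of cells and thousands of frames, is where the render farm comes in.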

Yeah, I'm a legend.

—Ellen Page

So right now, you are probably doing some creative visualization of your own...

The scenario: Michael Cera and Ellen Page awkwardly read from the teleprompter, fumble with the envelope, and finally announce the Academy Award for Best Software Development Effort in a Motion Picture. Running the gauntlet of hugs and kisses, backslaps, and high-fives like Stephen Colbert welcoming a guest, you mount the stage, step up to the microphone, and humbly but eloquently accept your Oscar, thanking the Academy, your mother, and Donald Knuth.

It's pure fantasy, of course. Not gonna happen. I'm not suggesting that you don't richly deserve the accolade, and the Academy does give out Oscars for technical achievements. They just don't give them out during the big ceremony with the red carpet where Jack Nicholson slumps in the front row like it's a Lakers game; no, you'll get your technical Academy Award the week before in a Motel 6 in Oxnard.

But if those who write the software behind the 21st century's movies don't get all the glitter and glitz, there are other rewards. Solving software problems for moviemaking can pay off for software development generally, and in less "frivolous" applications.

But let's step back a second to note what we're not talking about here. It is true that all manner of cutting-edge technology is crucial to moviemaking today: at Sundance this year, according to CNet's Michelle Meyers, "indies and techies [were] one and the same." But technology in the movies is nothing new: Movies are technology, and the Academy has been recognizing technology at the Oscars almost since there have been movies. Just after World War II, the technology of movies seemed to ratchet up a notch, with innovations like the Acme Tool optical printer for manipulating film, and blue screen technology.

But no software. Until the 1970s, technological advances in movies generally didn't challenge the assumption that movies were made using cameras, microphones, cranes, and actors. Technology for special effects basically involved manipulations in front of the camera: fake clouds, fog, moving water, and falling snow that were then photographed. The parting of the Red Sea in 1956's The Ten Commandments? Water pouring out of tanks, photographed conventionally and then run backwards. The artful special effects in Kubrick's 1968 science fiction masterpiece 2001? All done without benefit of computers. There was essentially no software development involved in the making of movies until the 1970s.

Movies Get Softer

It's just a movie, Ingrid.

—Alfred Hitchcock

In the past 30 years, software developers have changed how movies are made. But any list of notable examples gets us nowhere: the entries would be almost random selections from a list too long to attempt, one that would include RenderMan, Maya, and many other software advances.

Visualization In Movies and In Science

In the movies, you get spaceships and planets. But those same techniques applied to scientific data yield scientific visualizations that let researchers see things they never could have before.

—Larry Yaeger

Is there a pattern to these software innovations in moviemaking? Yes. Many of the technologies come under the heading of fundamental processes of synthesizing images, or visualization. And the connection between movie special effects and scientific visualization was there right from the start.

In the early 1970s, Bernard Chern at NSF launched a program to support work on computer systems for modeling objects in three dimensions. This was a discipline in which, according to Herbert Voelcker, who started the computer model program at the University of Rochester, "there were no mathematical and computational means for describing mechanical parts unambiguously... There were no accepted scientific foundations, almost no literature, and no acknowledged community of scholars and researchers." Under the impetus of the NSF (and maybe the hope of winning an Oscar), this situation was changing rapidly. By the mid-1980s, NSF funded four organizations specifically to help scientists visualize data. One of these organizations was Digital Productions, a company better known for producing special effects for television and the movies. But under NSF encouragement, DP became responsible for some of the first really good three-dimensional visualizations of scientific data.

"Everything from the world of movie special effects was pressed into service," according to Yaeger, "hidden surface removal, lighting and shading, texture mapping, transparency, bump mapping, you name it."

Visualization was changing the way scientists thought about their work. "Scientists started being able to see the output of their simulations of galaxy formations, black holes, and the like," Yaeger says. And just as movie techniques were being adopted in scientific visualization, techniques from scientific visualization were feeding back to the movies. "[C]omputational fluid dynamics, such as drove those scientific simulations, drove the motion of the atmosphere in the planet Jupiter seen in the movie 2010." (Yaeger, who is one of the leading researchers in Artificial Life and teaches at Indiana University, worked on 2010, and was later the technical consultant on Terminator 2.)

By 1991, the field of computer visualization was exploding. "The field had gotten so big, with so many specialties, that no one could know it all. No single research lab could do it all. Graphics hadn't just become broad—it was increasingly interdisciplinary," explains Andries van Dam, director of NSF's Science and Technology Center for Computer Graphics and Scientific Visualization.

Movies As Science

In the past, I have always thought of visualization as primarily a mental process: You receive some knowledge (from any of various sources) and, when you understand it thoroughly, you can 'create a picture of it' in your mind. Nowadays, computer graphicists are trying to place this picture more directly in the mind by creating the pictures with a computer.

—Jim Blinn

It is not simply that solving the technical problems of state-of-the-art movie special effects and animation can produce scientific side effects. The problems in such movies are often the same as the problems in computer visualization. In fact, there is now the notion of movies as science:

In the early 1980s, JPL decided, for purely scientific reasons, to produce LA—the Movie, a fly-over of Southern California, based on multispectral satellite image data.

By 1990, it was becoming clear what one use of supercomputers would be: Crunching numbers to produce scientific cinema. From a 1990 supercomputing conference paper: "The collapse of an unstable cluster to a supermassive black hole at the center of a galaxy may explain the origin of quasars and active galactic nuclei. By means of a supercomputer simulation and color graphics, the whole process can be viewed in real time on a movie screen."

And then there's Jim Blinn. Blinn did the graphics for Carl Sagan's Cosmos and much-viewed simulations of Voyager visiting Jupiter and Saturn.

But Blinn is also known for fundamental work in scientific visualization. He came up with new methods to represent the interaction of objects and light in a three-dimensional virtual world. He is now a graphics fellow at Microsoft Research. In the words of Alvy Ray Smith, "Jim is one of the pioneers...everything he did helped establish the field as we know it today."

Bump mapping is one of the methods Blinn developed. "You can emboss a surface or give it a texture like leather or what not," he explains at the Microsoft site. "It's the sort of thing that shows up on the skins of dinosaurs in Jurassic Park." Also from the Microsoft site: "Blinn's early work in computer graphics drew the attention of several Hollywood movie producers, who began calling Blinn to request demonstrations of the special effects he was incorporating into his videos." Blinn: "A lot of people went out and started their own computer graphics groups and special effects houses after seeing the demos I'd done."
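The idea behind bump mapping can be sketched in a few lines: keep the geometry flat, but tilt the shading normal by the gradient of a height map before computing the usual N·L lighting term, so the lighting suggests relief that isn't really there. A toy Python version follows (the function name and setup are mine, an illustration of the principle rather than Blinn's original formulation over parametric surfaces):

```python
# Minimal bump-mapping sketch: a flat +z surface whose shading normal
# is perturbed by the local height-map gradient, then lit with a
# simple Lambertian (N . L) term.
import math

def bump_shade(height, i, j, light, strength=1.0):
    """Shade one sample of a flat surface, tilting the normal by the
    central-difference gradient of the height map at (i, j)."""
    dhdx = (height[i + 1][j] - height[i - 1][j]) / 2.0
    dhdy = (height[i][j + 1] - height[i][j - 1]) / 2.0
    # Perturbed normal: flat (0, 0, 1) tilted against the gradient.
    nx, ny, nz = -strength * dhdx, -strength * dhdy, 1.0
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / norm, ny / norm, nz / norm
    lx, ly, lz = light
    return max(0.0, nx * lx + ny * ly + nz * lz)  # clamped N . L

# A one-cell ridge in the height map running along y at x = 2.
height = [[1.0 if x == 2 else 0.0 for _ in range(5)] for x in range(5)]
light = (0.707, 0.0, 0.707)               # light from the +x side, 45 degrees
lit   = bump_shade(height, 3, 2, light)   # slope tilted toward the light
flat  = bump_shade(height, 2, 2, light)   # ridge top: gradient is zero
dark  = bump_shade(height, 1, 2, light)   # slope tilted away from the light
```

The surface itself never moves; only the shading responds to the "texture," which is why the effect is so cheap compared with modeling the bumps as real geometry.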

Visualization is at the heart of modern moviemaking. It also allows scientists to investigate fields where other research techniques fail to deliver useful insight. "One of the early visualization success stories," according to an NSF website on the history of scientific visualization, "was a model of smog spreading over southern California, a model so informative and realistic that it helped to influence antipollution legislation in the state."

Being able to create a perfect storm for a movie is cool. Being able to perfectly model a real storm, that's important. The fact that the same research might solve both problems? Brilliant.
