A Conversation with John Knoll

As a visual-effects supervisor for Industrial Light & Magic, John Knoll lives on the bleeding edge of computer graphics. With his brother Tom, he also created the PhotoShop image-processing software.


July 01, 1998
URL: http://www.drdobbs.com/a-conversation-with-john-knoll/184410606


Rick works for Sequoia Advanced Technologies. He can be contacted at [email protected].


As a visual effects supervisor for George Lucas's Industrial Light & Magic, John Knoll has lived on the bleeding edge of computer graphics for over a decade. He has worked on ground-breaking feature films such as The Abyss (which earned an Academy Award for Best Visual Effects), Mission: Impossible, and Star Trek: First Contact, among many others. He is currently working on the next Star Wars film, codenamed Episode I. In addition, John and his brother Tom are the creators of Adobe's PhotoShop image-processing software. John recently took time from his duties at Industrial Light & Magic in Marin County, California, to chat with Rick Tewell.

DDJ: John, from what I understand, you transitioned from model-making into computer graphics. Can you tell us about that?

JK: Sure. When I was a kid, model-making was a hobby of mine. I got to be reasonably good at it and decided to go into visual effects as a career. I moved to Los Angeles to attend the University of Southern California film program. At USC, I tried to make contacts so that when I graduated, I wouldn't be going into an entry-level position. I was trying to get some of those entry-level-position years behind me while in school. So I started doing freelance model work.

DDJ: Creatures or vehicles?

JK: Mostly the hard-surface kinds of things. The first guy I worked for was Greg Jein, who has a model shop. Since he runs a low-budget operation, he was happy to hire newbies and train us.

When a model was done, I'd take it out to the stage and fix things -- during rigging, they'd need a hole here, or something had to move, or I had to repaint something because it didn't look good enough for camera. Somebody has to be around to do those sorts of things. So I would be on the stage a lot of the time when my models were being shot, which meant I got familiar with motion-control cameras. That was something that interested me. How do you get started doing that sort of thing? They didn't teach that at USC, which was mostly a live-action school. My last year at USC, I took an advanced animation class, and we had a couple of manual, hand-cranked animation stands. For my final project, I decided to build a simple four-channel motion-control system. This was in 1984. I bought a used Apple II and a four-channel serial-controlled CNC milling-machine controller, which ran four stepper motors. And I bought a bunch of surplus stepper motors from C&H Sales and various bits and pieces. Although the camera got booked in two-hour blocks during the week, it was free during the weekend. Consequently, after the last session on Friday night, I could go in there, take the handcranks off, bolt my motors on, set up the computer, and shoot as long as I had it all cleared off by the first scheduled block on Monday. It was a lot of fun.

DDJ: This was an Apple II?

JK: An Apple II Plus with a whopping 64K of RAM. I had a digital I/O board so I could control various relays.

DDJ: So primarily, you were using the Apple II to do the motion control, and the camera was just a regular film camera?

JK: Yeah. What I was shooting was slit scan. It was a process I read about and was fascinated with and I wanted to try it. You really need a computer to control that stuff.

DDJ: Did you write the software for the Apple II?

JK: Yes.

DDJ: So you were familiar with programming at that time?

JK: A little. Actually, before I started at USC (in 1980), my dad got an Apple II as part of his university research work. After dinner, he'd go work on his research, but he encouraged my brother Tom and me to play with it. This was in 1978.

The wonderful thing about the Apple II was that it had a BASIC interpreter built into ROM, so all you had to do was turn the computer on and start typing in lines. That was a lot of fun. I feel privileged that one of my first exposures to computers was when they were so simple. There was only so much that those really primitive computers could do, so it didn't take a lot to learn everything there was to know about them. As the computers became more complicated, you could learn gradually. I've had 20 years of exposure to it; I can only imagine what it's like to dive into programming now. Today, it's incredibly complicated for somebody just coming out of school.

DDJ: At what point did it occur to you that the computer could actually be a tool for more than motion control or camera control -- that the computer could actually be used to generate computer images suitable for film?

JK: A lot of people saw it coming. I read about computer graphics and had friends who were members of SIGGRAPH, so I saw the tapes and was fascinated by it. It wasn't really interesting enough to me at that point in the early '80s -- I thought it was neat, but not ready for feature films. But then, as it started getting close to being ready, I became one of the first people pushing for it. I was the computer graphics designer on The Abyss [circa 1989], which featured one of the first realistic pieces of computer graphics in a feature film. At least that was our intent.

DDJ: When did PhotoShop come into play?

JK: Actually, it was somewhat accidental. As I said, when I was a kid, one of my hobbies was model making. I got to be fairly good at that and it got me into the industry. But when model making turned into a profession, it sort of killed it as a hobby. It's not much fun to build models all day, then go home and build more models.

Since I was interested in motion control, I got a computer and started building motion-control systems for it. That became my new hobby. Because I knew people who were shooting motion-control elements with the models I was building, I began getting work as a camera assistant on motion-control stages. Then I got hired as a motion-control camera assistant at Industrial Light & Magic (ILM). Pretty soon I was doing motion control full time and its appeal as a hobby was greatly diminished.

I started at ILM in 1986 and had just gotten a Macintosh, my first sophisticated computer, and started writing little graphics programs as my new hobby. ILM was the first place I ever worked that had a computer-graphics department and, when I wasn't working in motion control, I'd go there to see what they were up to. They had this laser film scanner, where you could scan in a piece of negative and generate a digital image. They had the Pixar Image Computer, a nice high-quality frame buffer where you could do manipulations to a picture and film it back out. I had a demo of something so trivial now, you hardly even think of it. This guy brought up an image on the screen and simply sharpened it. That actually seemed miraculous at the time and made a huge impression on me.
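That sharpening demo is easy to sketch today. The following is a generic convolution-based sharpen using a small Laplacian-style kernel -- an illustration only, not the Pixar Image Computer's actual implementation:

```python
import numpy as np

# Sharpen a grayscale image by convolving with a 3x3 kernel that
# boosts the center pixel relative to its neighbors. Illustrative
# sketch; not the Pixar Image Computer's implementation.
SHARPEN = np.array([[ 0, -1,  0],
                    [-1,  5, -1],
                    [ 0, -1,  0]], dtype=float)

def sharpen(img):
    """Sharpen a 2D uint8 array; borders are handled by edge-clamping."""
    h, w = img.shape
    padded = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(padded[y:y+3, x:x+3] * SHARPEN)
    return np.clip(out, 0, 255).astype(np.uint8)

# A flat region is unchanged; contrast at an edge gets exaggerated.
flat = np.full((4, 4), 100, dtype=np.uint8)
print(sharpen(flat)[1, 1])  # 100
```

On a real image, the effect is exactly the "miraculous" demo described: edges pop because the kernel pushes each pixel away from its local average.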

About that time, my brother Tom was at the University of Michigan working on his doctoral thesis. He had pursued computer programming much more seriously; that's what he had wanted to do for his career.

He was trying to get his doctorate in computer vision, and the first part of any computer-vision work is image processing. He was doing his thesis work on a Mac Plus, writing these image-processing algorithms as MPW shell tools. That was much like how the Pixar Image Computers worked: you typed command-line arguments at a UNIX prompt, running C-shell scripts on a Sun to control the frame buffer on the Pixar. That was sort of the same thing Tom was doing on his Mac.

I saw a lot of the similarities. Then the Mac II came out. It had a math coprocessor. It had color. It was faster. It had more memory. I had to have it because I thought it was so neat. When that machine first came out, displaying a color image on it from a programming standpoint was a big deal. I wasn't terribly interested in the mechanics of the palette manager, window manager, and all the things that were required to display a color picture. What I was interested in was the code that figured out how bright a pixel should be. One of the hobby things I was doing was writing a little ray tracer. Tom told me to do the math, figure out how bright the pixel ought to be, and just write it to disk as a raw image. He said I could use his tools, which could read a raw block of bytes on the disk and display it as a picture and do various transforms to it.
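The raw-image workflow Tom suggested -- renderer writes bare bytes, viewer reinterprets them knowing only the dimensions -- can be sketched like this. The file name and dimensions are made up for illustration:

```python
import numpy as np

# Sketch of the raw-image round trip: a "renderer" computes a
# brightness per pixel and dumps the bytes headerless to disk; a
# "display" tool reads them back, knowing only width and height.
WIDTH, HEIGHT = 8, 4

# Renderer side: a simple left-to-right brightness ramp.
ramp = np.fromfunction(lambda y, x: x * 255 // (WIDTH - 1),
                       (HEIGHT, WIDTH)).astype(np.uint8)
with open("render.raw", "wb") as f:
    f.write(ramp.tobytes())

# Display side: reinterpret the raw block of bytes as an image.
with open("render.raw", "rb") as f:
    img = np.frombuffer(f.read(), dtype=np.uint8).reshape(HEIGHT, WIDTH)

print(img[0, 0], img[0, -1])  # 0 255
```

Because the file has no header, the viewer must be told the geometry out of band -- exactly the cumbersome step that motivated building display code into an application.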

I did this for a while, but it was cumbersome, and I thought it would be neat if we built the display portion into an application so that I wouldn't have to fire up the whole MPW environment and run the shell tools. One weekend, Tom spent a few hours bundling some of those functions into this program called "Display." Once he had that working, I started bugging him for more stuff. It was like nothing was ever good enough. So we kept adding features until it struck me that we should sell this. We could get an ad in the back of MacWorld and sell it for 50 bucks. Tom was really skeptical.

DDJ: Did you ever sell the product?

JK: No. Mostly what Display did was conversions. We had gotten it so that it could read several different image file formats and write several different image file formats, and there were a couple of things you could do to the image in the meantime -- you could convert a color image to black and white, for instance.

I was completely full of naïve optimism. I showed it to a friend of mine at SuperMac, which was in alpha with a program called "PixelPaint." SuperMac was seriously considering making us an offer to bundle Display with PixelPaint as a file-format conversion utility. They had already run all their spreadsheets about how many units of PixelPaint they thought they were going to sell and what kind of deal they would want to make with us on bundling. That added up to a number that made this seem worth doing.

I called Tom and said SuperMac was interested, so he scheduled two days a week to work on it. After two or three months, it really did a lot of things. It didn't fit in my mind as a utility anymore; it was a program in its own right that wanted to be sold as its own product. One day I called Tom up and told him that I didn't think an opportunity like this would ever be thrown at our feet again. We just had to drop everything to make it happen.

Tom estimated he was six months from finishing his doctoral thesis. In a supreme act of faith, he stopped working on his thesis and started programming full time. We greatly underestimated how much work this was going to be. When Tom stopped school, he figured he had about six months of programming and we'd wrap up Version 1.0 of this program, and he could start next semester and finish his thesis. Meanwhile, we'd be making some money.

From the time he decided to stop school until Version 1.0 shipped was almost two years. It became much bigger than we thought it would, but it kept getting better and better. Tom is really a superb programmer. He's one of the best engineers I know. He just wrote this terrific, great code.

Around that time, I moved from motion control over to computer graphics, so I was doing a lot of work on the Pixar Image Computers -- running composites and doing image-processing scripts. That drove a lot of my input as to what kinds of features ought to be in PhotoShop. I would try to do more and more of my work in PhotoShop and try stuff. That's sort of how "feathering" got born. It was me using it for little projects that helped define the feature set.

Version 1.0 was a usable tool largely because I was trying to use it to solve real-world problems. I would run into something that would just stymie me -- there had to be a way of doing this -- and Tom would scratch his head and go, "That would be hard." He would think about it for a while. I would talk to him a few days later, and he would say, "I was thinking about that, and I had this great idea."

I was goading him a little bit, too. I would say, "You know what I really want to do? I want to make one of these selections so that I can like select some area and then the paint only affects just the area selected." Tom would say, "Oh, that's going to be impossible to make that go real time. It's going to be really slow." I'd say "Oh, come on, Tom. I'll bet you can do that." About a week later he would say, "I was thinking about it, and I think I've got a way." It was often a whole lot of exchanges like that where at first Tom thought it would be really hard, but he would keep thinking about it. He's brilliant that way, and he would come up with a clever solution to the problem.
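The selection idea John describes -- and the "feathering" mentioned earlier -- both reduce to treating the selection as a per-pixel weight: paint fully affects pixels with weight 1.0, not at all at 0.0, and blends proportionally in between. A minimal sketch, with illustrative names and data:

```python
import numpy as np

# Sketch of painting through a selection mask: the mask is a
# per-pixel weight in [0, 1]. A hard selection uses 0 or 1; a
# "feathered" selection uses fractional weights at its border so
# paint fades out smoothly. Illustrative, not Photoshop's code.
def paint(img, mask, color):
    """Blend `color` into `img`, weighted per pixel by `mask`."""
    out = img.astype(float) * (1.0 - mask) + color * mask
    return np.clip(out, 0, 255).astype(np.uint8)

img = np.zeros((1, 4), dtype=np.uint8)
mask = np.array([[0.0, 0.5, 1.0, 0.0]])  # feathered edge, then full
print(paint(img, mask, 200))  # [[  0 100 200   0]]
```

The real-time difficulty Tom worried about comes from doing this blend at interactive rates over large images, not from the arithmetic itself.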

DDJ: When PhotoShop was born, the industry was in some interesting transitions in computer graphics.

JK: Yes. We started on PhotoShop in September of 1987, and I think 1.0 shipped in January of 1990. A lot of things happened in that time. I didn't start working in computer graphics until late 1988, I think. The first thing I did in computer graphics was a Pacific Bell Smart Yellow Pages commercial.

DDJ: With a Pixar?

JK: Yeah. A Pixar Image Computer is basically a frame buffer. The Lucasfilm Computer Division was working on what became the Pixar Image Computer, and "Pixar" became the name of the company after George [Lucas] sold the division to Steve Jobs.

DDJ: So that was something that was invented and not available anywhere else except for here?

JK: Right. We had two of them here that we used for composite work and various image-processing things. On all the old Pixar films like The Adventures of André and Wally B. [circa 1984], they would render different parts of the shot as separate passes, so the character in the foreground would be rendered separately from the background. Then they would composite them together, and the tool they used to do it was the Pixar Image Computer.

DDJ: When The Abyss was created, what was the state of computer graphics?

JK: In general, no one thought of computer graphics as something you could use for real on a feature film to do something that looked realistic. The one exception was the stained-glass man [from Young Sherlock Holmes, circa 1985], which was a pretty remarkable achievement; nothing quite like it had been done to that point. Stuff like The Last Starfighter [circa 1984], nobody really considered realistic. But I was impressed with the stained-glass man because it had things like depth of field.

Right after I started, our computer-graphics department had done this Star Trek IV [circa 1986] dream sequence with the floating heads. It didn't look very realistic. It was intended to be a stylized thing. I don't know if anybody thought that our tools in house were ready to do something super realistic.

I remember when we got the storyboards on The Abyss -- they were these beautiful shaded drawings, really fascinating. The imagery was really neat. "Wow, these are going to be really cool shots -- whoever does this and however it gets done." A lot of different approaches were being bandied about, things even as weird as stop-motion animation with clay with images of water projected onto it -- things that almost certainly never would have worked.

We had just gotten an SGI with Alias, and Jay Riddle in the computer-graphics department did a little test making some sort of a water tentacle thing. It was not a sophisticated test, but he did it really quick. He did it, I think, overnight and showed it to Jim Cameron [Titanic writer/director] the next day. Jim was really surprised how quickly that had been done because the reputation was that computer graphics was really, really slow and very expensive and the complete antithesis of interactivity. You'd talk to these guys and they'd disappear for months, and then they would come back with something you didn't want. "I want it to be more like..." "Well that will be another six months."

DDJ: But they felt this was an integral part of the film?

JK: Jim's position was that if the water tentacle sequence -- while it was a bold thing to attempt -- didn't work or ended up looking terrible, he could cut it out of the movie and he could still make the movie. He wasn't hinging the success of this picture on this effect working. It was only like 25 shots. This seemed like a huge number to us at the time, but it is hardly anything now. So we started this R&D project to do this thing, and we wrote a bunch of new software to do it. We switched over from Rays to RenderMan, which Pixar had just gotten going.

DDJ: There was nothing on the street that could do this at the time?

JK: No. We used the RenderMan renderer, but we wrote custom shaders to do the fake refraction and get the right amount of reflection, fog, and that sort of thing. We had to write the software to do the rippling of the surface and to "skin" it. The way it was actually done was, we animated a spline in space -- a 3D path -- and we had a bunch of cross sections. They were animated separately, so it was just a bunch of circles, and we scaled them. Then there was a piece of software called "Skin" that would take all of the circular cross sections, place them perpendicular to the spine at particular points, and skin the surface.
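The "Skin" step can be sketched in a few lines. This simplified version places circular cross sections along a straight spine running up the z axis, rather than orienting each circle perpendicular to an arbitrary spline; all names and numbers are illustrative, not ILM's tool:

```python
import numpy as np

# Sketch of skinning: for each point along the spine, emit a ring of
# points on a circle of the given radius. Stitching adjacent rings
# together yields the tentacle surface. Here the spine runs along +z,
# so every ring lies in an xy plane.
def skin(spine_points, radii, n_sides=8):
    """Return an array of shape (len(spine), n_sides, 3) of surface points."""
    angles = np.linspace(0, 2 * np.pi, n_sides, endpoint=False)
    rings = []
    for (cx, cy, cz), r in zip(spine_points, radii):
        ring = np.stack([cx + r * np.cos(angles),
                         cy + r * np.sin(angles),
                         np.full(n_sides, cz)], axis=1)
        rings.append(ring)
    return np.array(rings)

spine = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0), (0.0, 0.0, 2.0)]
surface = skin(spine, radii=[1.0, 0.5, 0.25])  # tapering tentacle
print(surface.shape)  # (3, 8, 3)
```

Animating the spine and scaling the radii per frame, as described above, is what gave the tentacle its shape over time.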

Then there was another program that would let you place a bunch of 3D noise generators in the world; it would take the patches, subdivide them into smaller patches, and perturb all the control vertices according to the sums of all the sine waves from the 3D noise generators. So the model was created anew per frame by this program, which meant some extra work was involved: how do you do motion blur when you're changing the model from frame to frame rather than taking one model and moving it? Some hacks were made. Actually, it's the same model -- what we're doing is moving these vertices from here to here. You would write two root files containing all the same CVs [control vertices], and then a script called JR2R would take the two root files and make them look like one model moving from this frame to that frame.
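The motion-blur hack -- treating two per-frame models with identical CV ordering as one moving model -- amounts to interpolating matched vertices across the shutter interval. A minimal sketch under that assumption (not the actual JR2R script):

```python
import numpy as np

# Sketch of the two-model motion-blur trick: because the frame-N and
# frame-N+1 models share control-vertex ordering, the renderer can
# sample each CV's position at several times while the shutter is
# open and blur along those paths.
def shutter_samples(cvs_a, cvs_b, n_samples):
    """Linearly interpolate matching CVs across the shutter interval."""
    ts = np.linspace(0.0, 1.0, n_samples)
    return [cvs_a + t * (cvs_b - cvs_a) for t in ts]

frame_n  = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])  # CVs at frame N
frame_n1 = np.array([[0.0, 2.0, 0.0], [1.0, 2.0, 0.0]])  # same CVs at N+1
mid = shutter_samples(frame_n, frame_n1, 3)[1]  # shutter midpoint
print(mid[0])  # [0. 1. 0.]
```

The trick only works because the per-frame rebuild preserves CV count and ordering; otherwise there is no correspondence to interpolate.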

DDJ: Then comes Terminator 2 [circa 1991], which has something not quite like the water tentacle -- the mercury guy -- and that was also from James Cameron.

JK: Yeah. Jim said it was a big gamble on The Abyss: if it didn't work, he could always cut it out of the picture. But based on that experience, he went much bolder on Terminator 2, making a character that had to be done with computer graphics. And it had to work, because if you cut that out of the movie, you've got nothing left -- all the things of being able to change shape from this to that, to melt and then reform itself. The effect has to work or you don't have a movie. It was a sign of his faith in the technology.

DDJ: In Jumanji [circa 1995] we have the first computer-graphics hair that flows and moves, and the depth is there, and it is so stunningly realistic that it was an amazing achievement for computer graphics. Did that require custom tools, or was there a point where you could use off-the-shelf components to do this?

JK: We try to use off-the-shelf software wherever we can, but a lot of things we're called upon to do just can't be done with off-the-shelf software. So we have a pretty good size software-development staff just to develop these tools; otherwise, we would just have to say, "No, we can't do that."

DDJ: Do you still do that today?

JK: Yeah.

DDJ: Do producers come in and say, "We want to produce a film and here are the special effects that we want" and you just go, "I don't think so."

JK: Well, no. We gulp and say, "Okay, we can do that. Here's the budget." Then they gulp.

You can usually spend your way out of just about any hole there is. If you put enough time and man-hours into something, there's usually a way to do it, and I can think of very few exceptions where we just have to give up and say, "No, that just can't be done." There are some things that would be extremely difficult and that we could never do realistically, at least not yet. But most of the things we're asked to do are at least within some amount of R&D of what we're capable of. George [Lucas], on this new Star Wars picture, wrote a lot of things into the script without worrying about how the hell we're going to do them. He just writes things he thinks are neat.

DDJ: Martin Hash has created a product called Animation Master and is trying to make a film, Telepresence, for $2 million -- which positively could not be made for $2 million if a studio did it, based on the effects he wants to put in there.

Do you see a trend coming where independent filmmakers can use off-the-shelf components to have "big budget" special effects in films? Up to now, independent films have pretty much lacked special effects; they're just sort of character driven.

JK: It's already happening. A bunch of friends of mine are starting up these garage operations -- little one-man digital facilities -- and they do things for TV shows or low-budget features. They're able to do that kind of work now just at home with PCs. It used to be that you had to have the whole full-blown production mechanism here for it, and now you can do some pretty good-looking stuff.

DDJ: Like Electric Image?

JK: Yeah. With Electric Image, After Effects, and PhotoShop, you've got a little production facility there.

DDJ: Speaking to a peer programming audience, what do you see as the next generation of products for computer graphics?

JK: Well, I don't think there are any real specifics that are easy to predict. But I think the general trend is to try and eliminate as much machinery between the artist and the art as possible.

One of the things that has been really liberating about moving to digital-production techniques is that huge amounts of effort used to go into just the mechanics -- not getting a matte line, or not getting the wrong color in a shot, for instance. That's where a lot of your energy went: just trying to get rid of the really obvious problems. Now, you can take that stuff more for granted. Today, an artist spends more time working on the aspects that make the shot look good or not, and not so much on the mechanical. I see that trend continuing.

Right now, my biggest complaint about the way a lot of these digital tools work is that they're still kind of awkward, and the artist spends too much time on things that have nothing to do with the shot looking good or not -- editing exclusion lists, making sure your aliases are pointing to the right directories. There's a lot of machinery the artist still has to deal with. As software gets better, they're going to spend less of their time on that and more of their time focused on the real art of it.

DDJ: What about these new digital interfaces like FireWire? Do you see that again liberating artists so that digital images can go straight into the machine?

JK: I think all these technologies are wonderful. I spend a lot of my time living on the bleeding edge, where we're just trying to get something done almost no matter how painful it is. We work with these kinds of kludgy, custom-written things that just barely work well enough to get through the shot -- you really wouldn't want to use them a whole lot more. And what happens is that, like five years down the line, the commercial applications end up with a lot of functionality that we very painstakingly hand crafted -- morphing, for example.

Back on Willow [circa 1988], Doug Smythe wrote the first morphing program that worked well for what we did, and it let us do shots that were sort of impossible otherwise. We made good use of it. I used it on The Abyss to do the face animation with morphing. We used it on Terminator 2. Then Elastic Reality hit the market, and once that capability was present in a commercial program that was at least as good as our morph program -- in some ways better -- there was no reason to keep working on ours.
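A morph has two halves: warping each image so features line up, then cross-dissolving between the warped pair. The warp is the hard part that the in-house program solved; the dissolve half is just a weighted blend, sketched below with illustrative data:

```python
import numpy as np

# The dissolve half of a morph: blend two (already feature-warped)
# images by a weight t in [0, 1]. A full morph animates t from 0 to 1
# while also animating the warps. Sketch only; the warp is omitted.
def dissolve(a, b, t):
    """Return (1 - t) * a + t * b, clipped back to uint8."""
    out = a.astype(float) * (1.0 - t) + b.astype(float) * t
    return np.clip(out, 0, 255).astype(np.uint8)

a = np.full((2, 2), 0, dtype=np.uint8)    # source image
b = np.full((2, 2), 200, dtype=np.uint8)  # destination image
print(dissolve(a, b, 0.25)[0, 0])  # 50
```

Without the warp, this degenerates to an ordinary film dissolve -- which is exactly why the warping machinery was the part worth hand crafting.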

A commercial application now had the same functionality, and you could buy it for next to nothing. That's a good example of something we sort of suffer through getting the first version of; then people see the results and go, "Oh, man, I want this." A bunch of commercial developers can jump in and say, "We can provide that." They put a good interface on it -- on something that's actually debugged, with appropriate error messages and all those things that commercial software brings to the equation. And then it's available to everybody.

DDJ


Copyright © 1998, Dr. Dobb's Journal
