A Windows 3D Model Viewer for OpenGL


Dr. Dobb's Journal, July 1998

Jawed studies computer science at the University of Illinois at Urbana-Champaign. He works part-time at the National Center for Supercomputing Applications, and can be contacted at [email protected].


OpenGL is known in the UNIX world as the 3D API behind high-powered scientific applications. It has recently gained attention in the PC sector, thanks to the computer-game industry, which has embraced OpenGL as an API standard for 3D game programming. Furthermore, 3D hardware acceleration for PCs has extended the range of applications for OpenGL even further.

The OpenGL API is intuitive, easier to use (in my opinion) than Microsoft's Direct3D API, and portable across platforms. In this article, I'll present a model viewer for use with OpenGL on Windows 95/NT. Specifically, I'll describe the important parts of a Quake2 model viewer -- an OpenGL-based program written in C/C++ -- that displays wire-frame and texture-mapped models (see Figure 1) from Quake2 and provides a basic interface to modify their appearance. In the process, I'll focus on file formats (MD2 files for models, and PCX files for textures), passing the data contained in the files to OpenGL for rendering, and interfacing Win32 with OpenGL using an API called "WGL." The archive Q2M-SRC.ZIP contains the Quake2 Model Viewer source code, while Q2M-BIN.ZIP is the Quake2 Model Viewer EXE file. Both are available electronically; see "Resource Center," page 3.

Reading the MD2 File Format

The only official source of information about Quake2's MD2 format is code by John Carmack of id Software; this code writes 3D polygon mesh data to an MD2 file (available at ftp://ftp.idsoftware.com/). Anyone who has looked at this source code will notice that some of the structs in Quake2 Model Viewer's md2.h (available electronically) are derived from it. Writing the MD2 reader basically involves converting John's code from writing MD2 files to reading them. Figure 2 illustrates the binary structure of an MD2 file.

To display the textured Quake2 models, four specific types of information are needed (see Figure 3):

  • 3D vertex coordinates.
  • A list of triangles consisting of those vertices.
  • 2D texture vertex coordinates (one for each 3D vertex).
  • The texture image.

All of the 3D vertices in the model are stored in one array. When the triangles (which are made up of those vertices) are defined, all that has to be stored for each vertex of a triangle is an index into the big vertex array. The reason for this is simple: Since many of the vertices are shared between triangles, storing each vertex once saves memory. In addition, linear transformations can be performed on the entire array at once, thereby speeding up rendering. Since the texture image itself is not a part of the MD2 file, it can be read in from a conventional PCX file.

Before starting, you must know how much data to expect. The file's header section tells you the number of vertices, triangles, and texture coordinates contained in the file. Once you know how much to read, you can go into a loop and read the information in chunks. To store all the data, use the vertex structure in Listing One.
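
To give a feel for this read loop (the viewer's actual reader lives in the source archive), the sketch below reads one section of the file in a single chunk. The md2_triangle layout follows id's original on-disk triangle record -- three short vertex indices plus three short texture-coordinate indices -- but treat it as an assumption and defer to md2.h.

#include <cstdio>

/* One on-disk triangle record (an assumption here; the authoritative */
/* layout is in md2.h): three vertex indices and three texture-       */
/* coordinate indices.                                                 */
struct md2_triangle {
    short index_xyz[3];
    short index_st[3];
};

/* Seek to the triangle section and read all records in one chunk.    */
/* The offset and count both come from the file header.               */
md2_triangle *ReadTriangles(FILE *fp, long offset, int count)
{
    md2_triangle *tris = new md2_triangle[count];
    fseek(fp, offset, SEEK_SET);
    fread(tris, sizeof(md2_triangle), count, fp);
    return tris;
}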

Each triangle is defined by its corners, a, b, and c. These values are indices into an array of type make_vertex_list, which is a list of all 3D vertices in the entire model. The remaining six integers are the 2D (s, t) texture coordinates, one pair for each of the three vertices. Listing Two is an example of a structure for holding this data. Using such a structure, the coordinates of the three vertices of the first triangle in the model can be referenced (see Listing Three).

In a Quake2 model, the only things that differ from one frame to the next are the 3D coordinates of the triangle vertices; the vertex indices and texture coordinates remain the same. From frame to frame, each triangle still consists of the same three vertices -- only the vertices undergo linear transformations. To hold the frames, you create an array of type make_frame_list (Listing Four), each element of which contains an array of vertex coordinates; there is one such element for each frame. Having filled all of the data structures, you can look up the coordinates of any polygon in any frame; see Listing Five (the coordinates of polygon P in frame F).

Texturing the Object

Quake2's model textures reside as separate PCX files, either in the pak0.pak file or in the quake2/baseq2 directory. Since OpenGL itself does not provide a way to read the binary PCX graphics file format, you must read the PCX file yourself and then pass its data to OpenGL.

Figure 4 describes the PCX format. The three basic sections in the file are the header, the pixel data, and the palette data. You can use two arrays of type unsigned char to store the last two sections. The header contains some basic information about the particular file, such as the PCX version and the image dimensions. If the file is actually a PCX Version 5 file, the first two bytes in the file must be equal to 10 and 5, respectively. Having determined the image dimensions from the header section, you dynamically allocate an array of type unsigned char of size (width*height) for the pixel data and read it into the buffer byte-by-byte. Because a Version 5 PCX file supports exactly 256 colors, the size of the palette section is always 768 bytes (3*256, or RGB*256).
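
The viewer's reader is CImage::Read() in the source archive; the following is only a rough sketch of the header check and dimension calculation, assuming the standard Version 5 layout (a 128-byte header with manufacturer byte 10, version byte 5, and 16-bit window coordinates at byte offsets 4 through 11).

#include <cstdio>

/* Sketch: verify the PCX header and compute the image dimensions.   */
/* Offsets and field meanings assume the standard Version 5 layout.  */
bool ReadPCXHeader(FILE *fp, int &width, int &height)
{
    unsigned char header[128];                /* PCX headers are 128 bytes  */
    if (fread(header, 1, 128, fp) != 128)
        return false;
    if (header[0] != 10 || header[1] != 5)    /* manufacturer 10, version 5 */
        return false;
    int xmin = header[4]  | (header[5]  << 8);
    int ymin = header[6]  | (header[7]  << 8);
    int xmax = header[8]  | (header[9]  << 8);
    int ymax = header[10] | (header[11] << 8);
    width  = xmax - xmin + 1;
    height = ymax - ymin + 1;
    return true;
}

The caller can then allocate width*height bytes for the pixel buffer and 768 bytes for the palette, which occupies the last 768 bytes of the file.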

When the CImage::Read(char filename[]) function is finished, the m_pixel_buffer array is filled with all the pixels in the image, and m_palette_buffer contains consecutive RGB values for each of the colors.

How do you get the color of a specific pixel in the image? The pixel buffer simply contains index values into the palette buffer. Listing Six illustrates this in three steps. At first glance, the R, G, and B components of the first pixel (pixel zero) in the image appear to be those in Listing Six(a). However, because the palette array contains consecutive RGB values (RGBRGBRGBRGB...) for all the colors, the individual R, G, and B values at pixel position P are obtained by properly offsetting the array index; see Listing Six(b). Finally, to be able to reference color values at specific (X,Y) coordinates in the texture, P is substituted by X+Y*Width, where Width is the width of the texture; see Listing Six(c).

OpenGL

Once the necessary data is organized and stored in memory, you can start rendering using OpenGL. But first, some of OpenGL's texturing options must be set. In particular, you must specify how to treat textures when wrapped and indicate the "minification" and magnification filters (Listing Seven).

In addition, back-face culling and texturing have to be explicitly enabled. Since you won't be looking at the backsides of polygons, you only have to enable front-side filling of polygons. Lastly, you specify the texture function (Listing Eight).

OpenGL's glTexImage2D() is the function that actually defines the texture image. It expects to be passed, among other parameters, a pointer to an array containing successive RGBA values for each pixel in the texture (for example, RGBARGBARGBA...).

Thus, before calling glTexImage2D(), two changes must be made:

  1. The pixel and palette data read from the PCX file must be copied into another array, in a format that glTexImage2D() can accept as a parameter.
  2. Because OpenGL requires the dimensions of a texture to be powers of two, the texture first has to be rescaled using gluScaleImage().

Both of these steps are accomplished in CImage::Image2GLTexture(), which first creates a new array called unScaled, fills it with RGBA components, and then rescales that data into a second array of an appropriate size. The loop in Listing Nine fills the new array with the RGBA components of each pixel in the texture, again offsetting the array indices as in the PCX code.

Now the texture contained within unScaled can be rescaled to dimensions that are powers of two. To keep the quality loss small while keeping performance at a reasonable level, the next power of two greater than or equal to the original dimension is used. For example, if the original width is greater than 256 pixels, the new width should be 512 pixels; if the original width is 128 or greater (but less than 256), the rescaled width should be 256. After a series of if statements has determined a good fit for the new dimensions, a call to gluScaleImage() rescales the texture (Listing Ten).
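
The if statements live in CImage::Image2GLTexture(); an equivalent way to express the same rule is a small round-up-to-the-next-power-of-two helper. The sketch below is hypothetical (NextPowerOfTwo is not the viewer's actual code) but produces the same dimensions as the examples above.

/* Round a texture dimension up to the next power of two. */
static int NextPowerOfTwo(int n)
{
    int p = 1;
    while (p < n)
        p <<= 1;       /* 1, 2, 4, ..., 128, 256, 512, ... */
    return p;
}

For example, NextPowerOfTwo(173) returns 256 and NextPowerOfTwo(300) returns 512, matching the rescaled dimensions described above.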

Finally, the glTexture array can be passed to OpenGL with glTexImage2D(GL_TEXTURE_2D, 0, 4, scaledWidth, scaledHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, glTexture). Table 1 provides a quick explanation of the parameters.

Creating an OpenGL Rendering Context

WGL provides an interface between the Win32 API and OpenGL. It sets up a palette for your rendering window and handles such things as double buffering. To do this, you usually need only four or five of the fewer than 20 WGL functions. I have written a basic C++ wrapper class for these functions that is easy to use. Most of the code in the COpenGLWindow class is taken from Silicon Graphics' OpenGL Developer Tools CD-ROM for Windows 95/NT, which interestingly has become a collector's item since SGI's "Fahrenheit" deal with Microsoft. (SGI is cooperating with Microsoft on the next generation of OpenGL. Since the agreement, SGI's OpenGL drivers for Windows 95/NT have disappeared from the SGI web site, and the SGI OpenGL Developer CD-ROM for Windows 95/NT is hard to come by. However, there are several web sites mirroring its contents, including http://jawed.ncsa.uiuc.edu/.)

The dimensions of the rendering window are passed to the constructor, but its window handle must be passed to the OpenGLWindow::Create() class member function to actually create the rendering context.

WGL does not physically create a window for you; that is Win32's responsibility. WGL creates an OpenGL rendering context for a window that has already been created. If you want a window to create and destroy its OpenGL rendering context as the window is created and destroyed, simply catch the WM_CREATE and WM_DESTROY messages in the window's window procedure. Then call OpenGLWindow::Create() and OpenGLWindow::Destroy(), respectively, as has been done in inter.c's GraphicsProc function (available electronically). The only other time you really need to use WGL is for a system palette change. Windows will indicate that such a change has been made by sending a WM_PALETTECHANGED message to every window, and then OpenGLWindow::RedoPalette() will take care of the change.
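
As a sketch of that pattern (the viewer's real handler is GraphicsProc in inter.c; the class interface shown here is reconstructed from the description above, so treat the exact signatures as assumptions):

#include <windows.h>

/* Hypothetical interface for the wrapper class described in the text. */
class COpenGLWindow {
public:
    void Create(HWND hWnd);   /* create an OpenGL rendering context for hWnd */
    void Destroy();           /* delete the rendering context                */
    void RedoPalette();       /* handle a system palette change              */
};

static COpenGLWindow g_glWindow;   /* hypothetical instance used below */

/* Minimal window procedure following the pattern described above. */
LRESULT CALLBACK GraphicsProcSketch(HWND hWnd, UINT msg,
                                    WPARAM wParam, LPARAM lParam)
{
    switch (msg) {
    case WM_CREATE:
        g_glWindow.Create(hWnd);    /* build the rendering context      */
        return 0;
    case WM_DESTROY:
        g_glWindow.Destroy();       /* tear the context down            */
        return 0;
    case WM_PALETTECHANGED:
        g_glWindow.RedoPalette();   /* react to a system palette change */
        return 0;
    }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}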

Drawing the Entire Model

Inter.cpp's redraw() function (available electronically) redraws the entire model in its current state by specifying all of the triangle vertex coordinates and texture-mapping coordinates between glBegin(GL_TRIANGLES) and glEnd(). This requires three calls to glTexCoord2f() (two parameters) and glVertex3f() (three parameters) for every triangle. One thing to note about glTexCoord2f() is that OpenGL expects texture-mapping coordinates to be relative, not absolute. To obtain these coordinate values, divide the original texture-mapping coordinates from the model by their maximum range in the texture. In other words, divide the S component by the texture map's width and divide T by the texture map's height. These values will fall between 0 and 1 and remain unchanged when the texture is resized. For instance, (0.5, 0.5) will always point to the center of the texture, no matter whether the texture dimensions are 173×233 or 256×256. Of course, doing floating-point divides inside the rendering loop is inefficient; by precomputing these values, the loop's efficiency could be improved greatly.
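
As a sketch of that inner loop (the real code is redraw() in inter.cpp), using the structures from Listings One, Two, and Four; the names num_triangles, skin_w, and skin_h are assumptions standing in for the counts and texture dimensions read earlier.

#include <GL/gl.h>

/* Assumes the make_* types from Listings One, Two, and Four, plus the */
/* global arrays filled in while reading the MD2 file.                 */
extern make_index_list *index_list;   /* one record per triangle       */
extern make_frame_list *frame_list;   /* one vertex array per frame    */

void DrawFrameSketch(int F, int num_triangles, float skin_w, float skin_h)
{
    glBegin(GL_TRIANGLES);
    for (int P = 0; P < num_triangles; P++) {
        const make_index_list  &t = index_list[P];
        const make_vertex_list *v = frame_list[F].vertex;

        glTexCoord2f(t.a_s / skin_w, t.a_t / skin_h);   /* relative (s, t) */
        glVertex3f(v[t.a].x, v[t.a].y, v[t.a].z);

        glTexCoord2f(t.b_s / skin_w, t.b_t / skin_h);
        glVertex3f(v[t.b].x, v[t.b].y, v[t.b].z);

        glTexCoord2f(t.c_s / skin_w, t.c_t / skin_h);
        glVertex3f(v[t.c].x, v[t.c].y, v[t.c].z);
    }
    glEnd();
}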

Between frame redraws, the rendering window's window procedure keeps track of mouse movements and mouse-button activity by listening for WM_MOUSEMOVE and WM_*BUTTON(UP/DOWN) messages. The movement increments are temporarily stored in two arrays -- one for translational movements and another for rotations. At the beginning of each frame redraw, the linear transformations are carried out using glTranslate() and glRotate().
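
A hypothetical sketch of applying those stored increments at the start of a redraw might look like the following; the array names trans and rot are assumptions, not the viewer's actual variables.

#include <GL/gl.h>

extern float trans[3];   /* accumulated translation along x, y, and z    */
extern float rot[3];     /* accumulated rotation (degrees) about x, y, z */

/* Reset the modelview matrix, then replay the accumulated mouse input. */
void ApplyMouseInputSketch(void)
{
    glLoadIdentity();
    glTranslatef(trans[0], trans[1], trans[2]);
    glRotatef(rot[0], 1.0f, 0.0f, 0.0f);
    glRotatef(rot[1], 0.0f, 1.0f, 0.0f);
    glRotatef(rot[2], 0.0f, 0.0f, 1.0f);
}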

Conclusion

Although OpenGL is straightforward to use, simply knowing the API is not sufficient. Since OpenGL does not provide functions to read 3D model and texture files of your preferred format, a basic understanding of 3D concepts and some amount of manual data manipulation is also required. Combining Win32 with OpenGL makes it possible to develop applications with user-friendly interfaces and impressive 3D graphics.

Keep in mind that one of OpenGL's bonuses is portability. Porting your Win32 OpenGL applications to X under UNIX should not be much more difficult than cutting and pasting some of the graphics code. Of course, creating another interface from scratch will be necessary.

DDJ

Listing One

typedef struct{
     float x, y, z;  /* coordinates */
} make_vertex_list;

Listing Two

typedef struct{
    int a, b, c;    /* array indices */
    int a_s, a_t,   /* (s, t) texture coordinates */
    b_s, b_t,
    c_s, c_t;
} make_index_list;

Listing Three

(a)
(vertex_list[index_list[0].a].x, vertex_list[index_list[0].a].y, 
                                       vertex_list[index_list[0].a].z)

(b)
(vertex_list[index_list[0].b].x, vertex_list[index_list[0].b].y, 
                                        vertex_list[index_list[0].b].z)

(c)
(vertex_list[index_list[0].c].x, vertex_list[index_list[0].c].y, 
                                        vertex_list[index_list[0].c].z)

Listing Four

typedef struct{
     make_vertex_list *vertex;
} make_frame_list;

Listing Five

(a)
frame_list[F].vertex[index_list[P].a].x
frame_list[F].vertex[index_list[P].a].y
frame_list[F].vertex[index_list[P].a].z

(b)
frame_list[F].vertex[index_list[P].b].x
frame_list[F].vertex[index_list[P].b].y
frame_list[F].vertex[index_list[P].b].z

(c)
frame_list[F].vertex[index_list[P].c].x
frame_list[F].vertex[index_list[P].c].y
frame_list[F].vertex[index_list[P].c].z

Listing Six

(a)
R: m_palette_buffer [ m_pixel_buffer[0]]
G: m_palette_buffer [ m_pixel_buffer[1]]
B: m_palette_buffer [ m_pixel_buffer[2]]

(b)
R: m_palette_buffer [3 * m_pixel_buffer[P]+0]
G: m_palette_buffer [3 * m_pixel_buffer[P]+1]
B: m_palette_buffer [3 * m_pixel_buffer[P]+2]

(c)
R: m_palette_buffer [3 * m_pixel_buffer[X + Y*Width]+0]
G: m_palette_buffer [3 * m_pixel_buffer[X + Y*Width]+1]
B: m_palette_buffer [3 * m_pixel_buffer[X + Y*Width]+2]

Listing Seven

glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

Listing Eight

glEnable(GL_CULL_FACE);
glEnable(GL_TEXTURE_2D);
glPolygonMode (GL_FRONT, GL_FILL);
glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL);

Listing Nine

GLubyte *unScaled = new GLubyte [m_iWidth * m_iHeight * 4];
for (j = 0; j < m_iHeight; j++) {
  for (i = 0; i < m_iWidth; i++) {
    unScaled[4*(j * m_iWidth + i)+0] = 
             (GLubyte) m_palette_buffer[3*m_pixel_buffer[j*m_iWidth+i]+0];
    unScaled[4*(j * m_iWidth + i)+1] = 
             (GLubyte) m_palette_buffer[3*m_pixel_buffer[j*m_iWidth+i]+1];
    unScaled[4*(j * m_iWidth + i)+2] = 
             (GLubyte) m_palette_buffer[3*m_pixel_buffer[j*m_iWidth+i]+2];
    unScaled[4*(j * m_iWidth + i)+3] = (GLubyte) 255;
  }
}

Listing Ten

/* allocate memory for the new rescaled texture */
glTexture = new GLubyte [m_iscaledWidth * m_iscaledHeight * 4];

/* use the OpenGL function to rescale */
gluScaleImage (GL_RGBA, m_iWidth, m_iHeight, GL_UNSIGNED_BYTE, unScaled, 
             m_iscaledWidth, m_iscaledHeight, GL_UNSIGNED_BYTE, glTexture);

/* reclaim memory of the unscaled texture */
delete [] unScaled;


Copyright © 1998, Dr. Dobb's Journal
