OpenGL and Mobile Devices: Round 2

It's been a couple of years since Richard wrote about the intersection of OpenGL and mobile devices. And my word—the world has changed, thanks in part to devices such as Apple's iPhone.


July 24, 2008
URL:http://www.drdobbs.com/mobile/opengl-and-mobile-devices-round-2/209600498

Richard is a developer for Software Bisque, lead author of The OpenGL SuperBible, and a professor at Full Sail University in Winter Park, Florida, where he teaches OpenGL programming in the game design degree program. He can be reached at [email protected].


In my June 2006 Dr. Dobb's article "OpenGL and Mobile Devices" (www.ddj.com/mobile/187203532), I examined OpenGL ES, a subset of OpenGL for mobile devices. Since then, the world of mobile devices has radically changed. Mobile devices with 3D graphics are readily available, as are downloadable SDKs from Sony, Nokia, and others. And then there's Apple's iPhone and iPod Touch.

When first announced, what caught my attention was Steve Jobs saying that the iPhone was running a version of OS X. Like every other Mac developer, the first thought that ran through my mind was "I can write software for that thing."

The second thing that went through my mind was "If it's OS X, then it has to be using OpenGL" because OpenGL, the de facto standard for cross-platform real-time 3D graphics, is an integral part of the Macintosh operating system. All we needed was an iPhone SDK. Well, the SDK is finally available and it's a complete XCode suite—compiler and debugger, performance analysis tools, a desktop simulator, sample code, and really good documentation. Frameworks are available for all the core OS X technologies, plus iPhone/iPod Touch specific features such as a touch interface, camera, accelerometer, and (my favorite) OpenGL ES.

The iPhone and iPod Touch both support OpenGL ES 1.1. Of particular interest to me are Vertex Buffer Objects, at least one user-defined clipping plane, point sprites, some point parameters (distance attenuation), mipmapping, vertex skinning, and an extension for using a texture as a background image. This is a very capable and feature-rich graphics API, and it's 100-percent hardware accelerated.
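
To give a flavor of the API, here is a quick sketch (my own, not from the SDK samples) of how two of these features are typically turned on in OpenGL ES 1.1; the header paths are the ones the iPhone SDK uses:

#include <OpenGLES/ES1/gl.h>
#include <OpenGLES/ES1/glext.h>

void EnableClipPlaneAndPointSprites(void)
{
    // One user-defined clipping plane: discard everything below y = 0
    // (plane equation Ax + By + Cz + D = 0)
    const GLfloat plane[4] = { 0.0f, 1.0f, 0.0f, 0.0f };
    glClipPlanef(GL_CLIP_PLANE0, plane);
    glEnable(GL_CLIP_PLANE0);

    // Point sprites with distance attenuation: point size shrinks with
    // distance according to the constant/linear/quadratic coefficients
    glEnable(GL_POINT_SPRITE_OES);
    glTexEnvi(GL_POINT_SPRITE_OES, GL_COORD_REPLACE_OES, GL_TRUE);
    const GLfloat attenuation[3] = { 1.0f, 0.0f, 0.01f };
    glPointParameterfv(GL_POINT_DISTANCE_ATTENUATION, attenuation);
    glPointSize(16.0f);
}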

The Hardware

The graphics hardware behind the iPhone and iPod Touch is a PowerVR MBX Lite, which uses Tile-Based Deferred Rendering. (Sony Ericsson uses this same chip for some of its cell phones.) In addition to Apple's SDK, you should get the PowerVR developer's notes that include useful performance tips (www.imgtec.com).

There are a few limitations you should know from the start:

The PowerVR chip uses a full floating-point pipeline throughout. The OpenGL lighting model is fully hardware accelerated, and there is no need to use fixed-point values for lighting and material values, or for vertex data. For best performance, use directional lights instead of point lights when possible, and try to always use indexed strips for geometry submission. To minimize bandwidth, you can use unsigned bytes for colors, and either unsigned bytes or shorts instead of floats for texture coordinates.
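
For example, a vertex layout along these lines (a sketch of my own, not taken from the article's sample code) follows those guidelines: float positions, unsigned byte colors, and short texture coordinates, drawn as an indexed strip:

#include <OpenGLES/ES1/gl.h>

typedef struct {
    GLfloat position[3];   // floats are fine; the pipeline is floating point
    GLubyte color[4];      // unsigned bytes keep color bandwidth down
    GLshort texCoord[2];   // shorts instead of floats for texture coordinates
} Vertex;

void DrawIndexedStrip(const Vertex *verts, const GLushort *indices, GLsizei indexCount)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_COLOR_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);

    glVertexPointer(3, GL_FLOAT, sizeof(Vertex), verts->position);
    glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(Vertex), verts->color);
    glTexCoordPointer(2, GL_SHORT, sizeof(Vertex), verts->texCoord);

    // A single indexed strip per batch keeps vertex reuse high on the MBX
    glDrawElements(GL_TRIANGLE_STRIP, indexCount, GL_UNSIGNED_SHORT, indices);
}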

Mipmaps (prefiltered, progressively smaller copies of a texture that improve rendering speed and reduce aliasing) were not available with OpenGL ES 1.0, but are with 1.1. On the PowerVR hardware, bilinear filtering is considered "free", but you should limit the mipmapping mode to GL_LINEAR_MIPMAP_NEAREST for best performance. Loading textures with an internal format of 565 instead of 888 speeds things up, and (if possible) you may want to make use of the PowerVR's native texture compression formats, PVRTC2 and PVRTC4.
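
As an illustration (again a sketch of my own, not from the downloadable source), uploading a 16-bit 565 texture with hardware-generated mipmaps and the recommended filter modes looks something like this, assuming pixels565 is already packed as RGB 5:6:5:

#include <OpenGLES/ES1/gl.h>

void Upload565Texture(GLuint texture, GLsizei width, GLsizei height,
                      const GLushort *pixels565)
{
    glBindTexture(GL_TEXTURE_2D, texture);

    // Ask OpenGL ES 1.1 to build the mipmap chain as the texture is loaded
    glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels565);

    // Bilinear filtering within a mip level is essentially free; full
    // trilinear (GL_LINEAR_MIPMAP_LINEAR) is what you want to avoid
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}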

One of the more interesting aspects of the PowerVR chip is that it uses Tile-Based Deferred Rendering (TBDR), in which all rendering commands are queued and no actual rendering occurs until all the commands for a given frame have been submitted. The commands are then analyzed, the geometry is sorted and converted into strips, and only then does rendering take place. You still need to sort by alpha for transparency, but sorting front-to-back to eliminate overdraw is no longer necessary. (This is somewhat like the "early-z" feature some desktop graphics hardware supports.)

While this has many performance advantages, there is one performance gotcha for desktop OpenGL developers coming to this platform—state changes can be very expensive. While reducing state changes has always been a staple of OpenGL performance techniques, many desktop systems suffer only moderately from excessive texture or other state changes. Nongaming applications typically ignore this advice completely, yet are still able to maintain relatively fast and interactive frame rates.

Not so on the iPhone. One of my personal rendering "magic tricks" involves swapping between mipmapped and nonmipmapped texture modes using the same texture object. When I tried this on my iPod Touch, it ran so slowly at first that I thought I was getting a software fallback (which, it turns out, does not exist).

To get best results in a complex rendering (especially if you are making a game), you should architect your rendering code to sort geometry based on rendering state, and not worry about front-to-back sorting as is typical on desktop systems.
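
A minimal sketch of that idea follows (the DrawCall structure and names here are illustrative, not from my framework): batches are sorted by texture so each expensive state change happens once per frame rather than once per object:

#include <algorithm>
#include <vector>
#include <OpenGLES/ES1/gl.h>

struct DrawCall
{
    GLuint  texture;      // the expensive state we sort on
    GLuint  vertexBuffer; // VBO holding the geometry
    GLsizei vertexCount;
};

static bool CompareByTexture(const DrawCall &a, const DrawCall &b)
{
    return a.texture < b.texture;
}

void RenderSorted(std::vector<DrawCall> &drawCalls)
{
    // Group by texture; no front-to-back sort is needed on a TBDR chip
    std::sort(drawCalls.begin(), drawCalls.end(), CompareByTexture);

    glEnableClientState(GL_VERTEX_ARRAY);
    GLuint boundTexture = 0;

    for (size_t i = 0; i < drawCalls.size(); ++i) {
        const DrawCall &dc = drawCalls[i];
        if (dc.texture != boundTexture) {      // change state only when it differs
            glBindTexture(GL_TEXTURE_2D, dc.texture);
            boundTexture = dc.texture;
        }
        glBindBuffer(GL_ARRAY_BUFFER, dc.vertexBuffer);
        glVertexPointer(3, GL_FLOAT, 0, 0);    // positions at offset 0 in the VBO
        glDrawArrays(GL_TRIANGLE_STRIP, 0, dc.vertexCount);
    }
}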

Cocoa and Objective C

Probably the biggest perceived hurdles for entry into Mac OS X programming are Objective C ("C with objects") and Cocoa (Apple's object-oriented application framework). On the iPhone, Cocoa is extended to "Cocoa Touch", which is designed around the touch interface for user interaction. I use Objective C when necessary, but I like to share a lot of code between the Mac and Windows versions of my software. So an early trick I learned was how to bridge Objective C to C++ and vice versa.

If you're new to the Mac development environment, the most important thing I can tell you is that you do not have to use Objective C for everything. Not only can you use C and C++ for desktop and iPhone development, but you can code C and even C++ classes right in an Objective C source module (for C++, you must rename the files from *.m to *.mm). Even better, it is easy to link Objective C-based Cocoa code with C or C++ code modules. It is also a common misunderstanding that all OS X APIs are exposed only through Cocoa. Many APIs, such as OpenGL, are in fact C APIs that can be called from just about any programming language that supports the C calling convention.

Building a Bridge

XCode 3.1 comes with a new template for Cocoa Touch OpenGL applications for the iPhone. When creating a new project, just select this template, name your project, and set the folder where you want the source files. Figure 1 shows the default skeleton OpenGL ES application running on the desktop in the iPhone Simulator. It is basically nothing more than a spinning colored square (actually, a short triangle strip). Of course, the application is all Objective C and Cocoa. In time, third-party frameworks or open-source projects akin to GLUT or SDL will spring up to enable OpenGL and/or game development on the iPhone without having to master the details of OS X, Objective C, or Cocoa Touch. (I'm hoping Trolltech will bring Qt to the iPhone.)

In the meantime, I have started to create my own C++ bridging framework for iPhone OpenGL game development. My associates and I at Full Sail University (www.fullsail.edu) are planning to create an iPhone game programming lab and include it in our curriculum. With a little Cocoa knowledge, it is trivial to wire in a C++ bridge class so that programmers who are more comfortable with C++, or who have other technical reasons for using it, can get going with the iPhone SDK.


Figure 1: OpenGL ES app running on the desktop in the iPhone Simulator.

Listing One is a basic portable OpenGL C++ rendering framework. (The source code for the complete framework is available online at www.ddj.com/code/.) This is just from the header of a simple C++ class that defines the four basic things all OpenGL programs need to do: initialize the rendering state, respond to a change in window size, draw the scene, and shut down cleanly.

This simple framework keeps portable OpenGL and C++ code far away from system internals, Objective C, and Cocoa. I've used it to move code easily not only from OS to OS, but between different application frameworks: Carbon, Cocoa, and Qt on the Mac, and low-level C/C++ SDK code, MFC, Qt, wxWidgets, GLUT, and SDL on Windows, among others. The trick is to know where in the calling framework to create the class and call these four member functions (see the GLUT-style sketch after Listing One).

   class MyGLView
   {
   public:
     MyGLView(void);
     void ResizeGL(int nWidth, int nHeight);   // Set viewport and projection when the window size changes
     void InitializeGL(void);                  // One-time OpenGL state setup after the context is created
     void ShutdownGL(void);                    // Clean up before the context is destroyed
     void PaintGL(void);                       // Render a single frame
   };
Listing One
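
To show where those four calls typically land in a host framework, here is a GLUT-style sketch (my own illustration; only the MyGLView class itself comes from the listing above):

#include <GLUT/glut.h>   // <GL/glut.h> on Windows
#include <cstdlib>
#include "MyGLView.h"

static MyGLView glView;  // created before the context; GL work waits for InitializeGL

static void Reshape(int w, int h) { glView.ResizeGL(w, h); }
static void Display(void)         { glView.PaintGL(); glutSwapBuffers(); }
static void Shutdown(void)        { glView.ShutdownGL(); }

int main(int argc, char *argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(480, 320);
    glutCreateWindow("MyGLView host");

    glView.InitializeGL();          // the context exists once the window is created
    atexit(Shutdown);

    glutReshapeFunc(Reshape);
    glutDisplayFunc(Display);
    glutMainLoop();
    return 0;
}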

For this article, I ported from my book The OpenGL SuperBible an example program that loads and displays a model of a Thunderbird (thanks to Ed Womack for providing the model). This example, running on an iPod Touch in Figure 2, uses Vertex Buffer Objects and indexed triangles. In the book, I used a cube map to make the cockpit glass look glassy. Alas, the one compromise I had to make for the iPhone is that cube maps are not supported by OpenGL ES 1.1.


Figure 2: Sample application running on an iPod Touch.

In the XCode-generated project, a Cocoa class EAGLView (derived from UIView) sets up the OpenGL context and does all the dirty work. It is then straightforward to wire in my class. The first step is to rename all the .m files in the project to .mm to allow C++ classes to be used in Objective C code. The XCode template stubs all the necessary OpenGL hooks into EAGLView, and this is the class in which I bridge to my own C++ implementation.

In EAGLView.h, add the include for MyGLView.h and declare an instance of the class (Listing Two). Note the OpenGL ES include files in Listing Two: leave EAGL.h (as well as UIKit.h, which is Cocoa GUI code) out of any C++-only source files, but you will need gl.h, and potentially glext.h, in any source module that calls OpenGL ES functions.

   #import <UIKit/UIKit.h>
   #import <OpenGLES/EAGL.h>
   #import <OpenGLES/ES1/gl.h>
   #import <OpenGLES/ES1/glext.h>
   #include "MyGLView.h"

   @interface EAGLView : UIView {
   @private
    ...
     MyGLView myGLView; // Instance of our C++ class
   }
   @property NSTimeInterval animationInterval;
   - (void)startAnimation;
   - (void)stopAnimation;
   - (void)drawView;
   @end
Listing Two

In EAGLView.mm, you find the initWithCoder method. Here, after the context creation, you place the code to tell your class to do its initialization, and set up the projection. Because you can't change the window size on the iPhone, you don't need to watch for changes and call ResizeGL again:


myGLView.InitializeGL();
myGLView.ResizeGL(backingWidth, backingHeight);
animationInterval = 1.0 / 60.0;

You can also set the desired target frame rate. The buffer swap is always synced and there is never any tearing on the iPhone. Sixty fps will always be the top frame rate you can attain.

In drawView, I call the PaintGL method of the C++ class:


- (void)drawView {
  [EAGLContext setCurrentContext:context];
  glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
  myGLView.ResizeGL(backingWidth, backingHeight);  // New line
  myGLView.PaintGL();                              // New line
  glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
  [context presentRenderbuffer:GL_RENDERBUFFER_OES];
}

Finally, when the view is destroyed (the application is terminated), dealloc is called and I call ShutdownGL to do any cleanup before the OpenGL context is destroyed:

- (void)dealloc {
   [self stopAnimation];
   myGLView.ShutdownGL(); 	// Do our shutdown
   if ([EAGLContext currentContext] == context) {
      [EAGLContext setCurrentContext:nil];
   }
   [context release];
   [super dealloc];
}

With a successful bridging of C++ code to the Cocoa Touch rendering framework, you can now create C++-based OpenGL ES rendering code for the iPhone.

Files and Resources

The last loose end is file loading. At the very least, you will want to load textures. To demonstrate the opposite of what we have just done, the GLTextureHelper class has a C++-style header that can be included in any C++ code. The implementation of the class is, however, in an Objective C .mm file. While the class itself is a C++ class, it makes use of Cocoa to read any Quartz-supported image file format and load it as an OpenGL texture. This class is implemented in the source code files GLTextureHelper.h and GLTextureHelper.mm. Using the class is simple. First declare an instance of the class:


GLTextureHelper helper(1024, 1024);
	

The only parameters to the constructor are the maximum expected width and height of any texture you plan to load. Here, I use the maximum texture size supported by the iPhone. Rather than allocate memory repeatedly for each texture loaded, the class reuses a single block of memory to avoid repeated malloc/free cycles.
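
For reference, the public face of the class, reconstructed from how it is used in this article, looks roughly like this (a guess at the interface; the real declaration is in GLTextureHelper.h in the downloadable source, and the return type of LoadTexture is an assumption):

class GLTextureHelper
{
public:
    // Preallocate one scratch buffer large enough for the biggest expected texture
    GLTextureHelper(int maxWidth, int maxHeight);
    ~GLTextureHelper();

    // Read a Quartz-supported image file and feed it to glTexImage2D
    // for the currently bound texture object
    bool LoadTexture(const char *szFileName);
};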

The LoadTexture method then loads the specified texture file and calls glTexImage2D with the appropriate parameters. For example, to initialize a texture object and create mipmaps for the body.png texture:

	
// Load the body texture
glBindTexture(GL_TEXTURE_2D, textureObjects[BODY_TEXTURE]);
glTexParameteri(GL_TEXTURE_2D, GL_GENERATE_MIPMAP, GL_TRUE);
helper.LoadTexture("body.png");    
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
    GL_LINEAR_MIPMAP_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

Textures are placed in the project's Resources folder. Right-click it, select "Add Existing Files...", and you can bring in your own textures.

Conclusion

It's exciting that independent developers can now get their hands on OpenGL ES devices and immediately start writing code. iPhones represent only one exciting opportunity to take advantage of the growing prevalence of mobile 3D devices.

The example code accompanying this article is a great place to start with iPhone (and iPod Touch) development. While not a complete development framework, it brings within reach GLUT-style technology demos and proof-of-concept prototypes.
