Transforming Technologies for Mobile 3D Computing

New open standards and software technologies promise to fundamentally change the type of content and applications that can be deployed on mobile devices.


May 17, 2005
URL:http://www.drdobbs.com/transforming-technologies-for-mobile-3d/163105163

In the first part of this two-part series we examined the emerging mobile 3D market and considered several of the many forms that three-dimensional content may take as it begins to appear on a new generation of hardware-accelerated mobile devices.

In Part 2 of this series we'll examine a number of the software technologies that developers will use to create mobile 3D applications. Specifically, we'll focus on a core group of open standards and technologies, namely Extensible 3D (X3D), Java Mobile 3D Graphics APIs, and OpenGL for Embedded Systems (OpenGL ES). Along the way we'll describe the organizations and key corporate members behind these technologies in hopes of shedding some light on the size and scope of these industry-led efforts.

Web3D and MPEG-4

The term "Web3D" describes a variety of Web- and Internet-based 3D technologies designed and developed by (or otherwise endorsed by) the Web3D Consortium. As a nonprofit organization established to provide a forum for the creation of open Web3D standards and specifications and to accelerate the worldwide demand for products based on these standards through the sponsorship of market and user education programs, the Web3D Consortium traces it roots back to 1994 and the Virtual Reality Modeling Language (VRML).

Originally known as the VRML Consortium, the organization was renamed the Web3D Consortium in 1998 to reflect an expanded charter that governs the organization's activities today. The charter delineates five goals:

1. Foster the ongoing development of Web3D specifications and standards, and promote rapid industry adoption of these technologies.
2. Offer opportunities for the 3D development community to meet and cooperate in the evolution of Web3D technologies.
3. Educate the business and consumer communities on the value, benefits, and applications of 3D technologies on the Internet.
4. Support the creation of conformance tests to assure Web3D interoperability.
5. Liaise with educational institutions, government research institutes, technology consortia, and other organizations that support and contribute to the development of specifications and standards for Web3D.

Web3D Consortium members include a number of leading high-technology companies, government agencies, and organizations, including 3Dlabs, Naval Postgraduate School, US ARMY PEO-STRI, NASA, National Institute of Standards and Technology (NIST), Autodesk, Sun Microsystems, Hewlett-Packard (HP), NVIDIA, and others active in the 3D industry. Academic and research members such as The Mitre Corporation, Virginia Tech, Communications Research Center of Canada, and SRI International complement more than 100 professional (individual) and student members. In addition, the Web3D Consortium maintains formal liaison relationships with related standards groups such as the International Organization for Standardization (ISO), the World Wide Web Consortium (W3C), and the Moving Picture Experts Group (MPEG) to ensure interoperability and compatibility at the standards level.

By following a time-tested and open process that has evolved over the past decade, the Web3D Consortium and its members have designed and developed core 3D technologies such as VRML and Extensible 3D (X3D), along with numerous related technologies such as Universal Media, Humanoid Animation (H-Anim), geographical extensions to VRML (GeoVRML), the Virtual Reality Transport Protocol (VRTP), Distributed Interactive Simulation (DIS), and more. Source code for these and other open technologies developed by the Web3D Consortium is available directly from the Working Group sites related to each activity. VRML and X3D are the foundation technologies upon which all other Web3D Consortium technologies build or to which they relate.

Although VRML was never created with today's mobile devices in mind, some companies have managed to develop mobile players for the technology nonetheless. Cortona by ParallelGraphics, for example, is a popular VRML content player available for Windows, Macintosh, and Pocket PC platforms. Until recently, ParallelGraphics also provided a free cross-platform Java applet called Cortona Jet that enabled a small but functional subset of VRML content to be displayed on any device equipped with a Java-enabled Web browser.

Although VRML is a monolithic 3D standard that wasn't created in the era of mobile devices, its official successor is a different story altogether: X3D was designed to enable new opportunities in the creation and deployment of state-of-the-art 3D graphics on small, lightweight clients, and also to enable the integration of high-performance 3D into broadcast and embedded devices. To this end, X3D addresses a number of long-standing issues with VRML while pushing the envelope for 3D both on and off the Web. In particular, X3D was created to deliver "3D anywhere" by employing an advanced component-based architecture that can scale across a wide range of devices and platforms.

The Web3D Consortium started development of X3D in the late 1990s under the tentative name "VRML Next-generation (VRML NG)" specifically to advance the state of the VRML standard previously ratified by ISO in 1997 as International Standard ISO/IEC 14772-1:1997. VRML NG was later renamed "X3D" thanks to significant new extensibility capabilities and support for Extensible Markup Language (XML) encoding.

Earlier this year X3D was approved by ISO as International Standard ISO/IEC 19775. Not only is X3D the official successor to VRML, it is now a recognized global standard for 3D that companies can implement today. Mobile technology developers, in particular, will find X3D's Core, Interchange, Interactive, and MPEG-4 interactive profiles particularly valuable, since these profiles are designed specifically for implementation using a "low-footprint engine".

Like the VRML standard that it supersedes, X3D is an open, royalty-free standard with a corresponding Open Source implementation provided free of charge by the Web3D Consortium. The Moving Picture Experts Group (MPEG) has already accepted X3D for the baseline 3D capabilities of MPEG-4. As a result of joint development between the Web3D Consortium and MPEG through the Web3D-MPEG Working Group that I chair, X3D's MPEG-4 interactive profile is now part of the ISO/IEC 14496 MPEG-4 specification that formally became an International Standard in January 2005 (additional mobile 3D technologies that will complement X3D are also being defined for MPEG-4 by MPEG). In addition to collaborating with MPEG, the Web3D Consortium continues to work closely with the World Wide Web Consortium (W3C) and other standards organizations in anticipation of further adoption of X3D across the industry.

Whereas VRML is a monolithic, all-or-nothing standard, meaning a VRML software product must implement all of the features and capabilities defined by the VRML specification in order to be considered compliant, X3D supports the concept of components and profiles, which makes it far more flexible and customizable than VRML. Components and profiles, in turn, enable X3D to scale gracefully from low-powered mobile devices to extremely powerful scientific workstations and all stops in between.

X3D Components: An X3D component is a set of related functionality that consists of various X3D objects and services. Although a component is usually a collection of X3D nodes, it can also include encodings, API services, or other features. X3D revolves around a Core component that defines the base functionality (such as the abstract base node type, field types, event model, and routing) required for the X3D run-time system. In short, the Core component provides the minimum functionality required by a compliant X3D implementation. In addition, X3D defines standard components for a variety of capabilities, including Geometry, Appearance, Time, Lighting, Sound, Navigation, Scripting, Text, Texturing, NURBS, Humanoid animation (H-Anim), and more. Advanced components to support features such as subdivision surfaces, shaders and other sophisticated 3D capabilities are also under development. Vendors can implement any of the pre-defined standard components or define their own for private use. They can also publicly register their proprietary components with the Web3D Consortium.

X3D Profiles: An X3D profile is a named collection of functionality and associated requirements that must be supported in order for an implementation to conform to that profile. Profiles are further defined as a set of components and corresponding support levels, as well as the minimum support criteria for all of the objects contained within that set. The X3D specification defines five profiles of increasing capability; the Core, Interchange, Interactive, and MPEG-4 interactive profiles mentioned earlier are those best suited to low-footprint mobile and embedded implementations.

Java Mobile 3D Graphics APIs

Developers intent on creating 3D applications for mobile devices today face a basic problem: broad, industry-wide application programming interface (API) standards supporting 3D development for small-scale (low-end) embedded devices have been slow to emerge. For Java developers, this problem is being solved by the Mobile 3D Graphics API for Java 2 Micro Edition (J2ME) and associated standards emerging from the Java Community Process (JCP).

Industry-wide standards for 3D graphics rendering, such as OpenGL and DirectX, have been available on traditional computers for many years. These standards have paved the way for the development of higher-level frameworks and APIs such as the Java 3D API, which is available for high-powered computing devices such as desktop computers and scientific workstations. Adopting these standards for 3D application development on low-end consumer and embedded devices is problematic, however, due to the large footprint imposed by underlying implementations and their heavy reliance on the hardware acceleration commonly available in personal computers, dedicated game consoles, and high-end visualization workstations. Two emerging Java technology specifications are poised to solve this problem: the Mobile 3D Graphics API for J2ME (JSR 184) and the Java Bindings for OpenGL ES specification (JSR 239). The Mobile 3D Graphics API for J2ME specification development process was led by Nokia working in collaboration with Motorola, Sony Ericsson, Cingular Wireless, France Telecom, Intel, and Symbian, to name just a few of the companies involved. The Java Bindings for OpenGL ES specification, meanwhile, is led by Sun Microsystems working with a number of the same companies. Whereas the Mobile 3D Graphics API was finalized in December 2003, work on the Java Bindings for OpenGL ES is ongoing (as of this writing it was progressing through the JSR process, having been approved in a member ballot earlier this year).

Mobile 3D Graphics API for J2ME: The Mobile 3D Graphics API for J2ME is a lightweight, high-level API designed to augment J2ME and the Mobile Information Device Profile (MIDP) as an optional package. As a high-level API, the Mobile 3D Graphics API enables an object-oriented approach to 3D application development using the well-known scene graph data structure (a scene graph is a hierarchical arrangement of nodes specifying the objects and properties in a 3D scene, a structure popularized years ago by VRML). The Mobile 3D Graphics API scene graph representation is rooted in a World node and includes a number of node types common to other high-level 3D development systems, among them Group, Light, Mesh, Sprite, and Camera.
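To make that structure concrete, here is a minimal sketch using the javax.microedition.m3g package defined by JSR 184. It builds a World containing a perspective Camera and an omnidirectional Light; the class name and parameter values are illustrative only, and Mesh or Sprite children would be attached in the same way.

import javax.microedition.m3g.Camera;
import javax.microedition.m3g.Light;
import javax.microedition.m3g.World;

// Minimal retained-mode scene graph: a World node holding a camera and a light.
public class SceneSetup {
    public static World createWorld(int width, int height) {
        World world = new World();

        // Perspective camera; the World keeps track of which camera is active.
        Camera camera = new Camera();
        camera.setPerspective(60.0f,                  // vertical field of view, in degrees
                              (float) width / height, // aspect ratio of the display
                              0.1f, 100.0f);          // near and far clipping planes
        camera.setTranslation(0.0f, 0.0f, 5.0f);      // back the camera away from the origin
        world.addChild(camera);
        world.setActiveCamera(camera);

        // Simple omnidirectional light added as another child node.
        Light light = new Light();
        light.setMode(Light.OMNI);
        light.setIntensity(1.25f);
        world.addChild(light);

        // Group, Mesh, and Sprite nodes are attached with addChild() in the same way.
        return world;
    }
}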

At a lower level, the Mobile 3D Graphics API solves the problem of supporting application development across a wide range of devices by guaranteeing that 3D rendering is performed using native code that takes advantage of hardware acceleration whenever the underlying runtime environment provides it. Consequently, applications created with the Mobile 3D Graphics API are expected to run on a spectrum of devices with a range of capabilities, from low-end J2ME Connected Limited Device Configuration (CLDC) systems that have little memory and no hardware support for 3D graphics or floating-point math, to more capable mobile devices with larger amounts of memory and hardware acceleration. In addition, the Mobile 3D Graphics API defines its own binary file format to enable efficient downloads of 3D content over wireless networks.
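As a brief illustration of how that file format is consumed, the sketch below uses the JSR 184 Loader class to read a scene exported as a .m3g file; the resource name /scene.m3g is a placeholder, and the cast assumes the file's first root object is a World, as is typical of exported scenes.

import java.io.IOException;
import javax.microedition.m3g.Loader;
import javax.microedition.m3g.Object3D;
import javax.microedition.m3g.World;

// Load a scene stored in the M3G binary format from the application JAR.
public class SceneLoading {
    public static World loadScene() throws IOException {
        // Loader.load returns the root-level objects stored in the file.
        Object3D[] roots = Loader.load("/scene.m3g");  // placeholder resource name
        return (World) roots[0];                       // assumes the first root is the World
    }
}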

Java developers can use Sun's freely available Wireless Toolkit to create mobile applications that take advantage of the Mobile 3D Graphics API. Available through Sun's J2ME site, version 2.2 of the Wireless Toolkit was released last month and comes with the build tools, utilities, and a device emulator necessary to develop and test mobile Java applications. In addition to supporting the Mobile 3D Graphics API, the award-winning Wireless Toolkit also supports a range of important mobile Java technologies that have emerged from the open JSR process, including the Mobile Media API, Java APIs for Bluetooth, and J2ME Web Services Specification.

Java Bindings for OpenGL ES: To support a wide range of device configurations, the Mobile 3D Graphics API provides two rendering modes: retained mode and immediate mode. Retained mode addresses high-level scene graph representations, whereas immediate mode addresses low-level rendering that is aligned with a subset of the OpenGL standard known as OpenGL for Embedded Systems. Specifically, when using the high-level retained mode, Java developers interact with scene graphs while the 3D content described by these data structures renders itself automatically based on camera and light positions. In contrast, the low-level immediate mode lets developers take control of the rendering process so that they can draw 3D content to the screen when and how they desire. Developers can use either of these modes, and can even use them both simultaneously if desired.
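The sketch below shows how the two modes might be combined inside a MIDP Canvas, assuming a World and an extra Mesh prepared elsewhere (for example, with code like the earlier listings). Graphics3D is the JSR 184 rendering context that binds to the Canvas's 2D Graphics target.

import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;
import javax.microedition.m3g.Graphics3D;
import javax.microedition.m3g.Mesh;
import javax.microedition.m3g.Transform;
import javax.microedition.m3g.World;

// Render a retained-mode scene graph, then draw one extra mesh in immediate mode.
public class RenderCanvas extends Canvas {
    private final World world;   // retained-mode scene graph built elsewhere
    private final Mesh overlay;  // an additional mesh drawn in immediate mode

    public RenderCanvas(World world, Mesh overlay) {
        this.world = world;
        this.overlay = overlay;
    }

    protected void paint(Graphics g) {
        Graphics3D g3d = Graphics3D.getInstance();
        try {
            g3d.bindTarget(g);

            // Retained mode: the World renders itself using its active camera and lights.
            g3d.render(world);

            // Immediate mode needs a current camera; reuse the World's active camera here
            // (the null transform places it at the origin, which is fine for a sketch).
            g3d.setCamera(world.getActiveCamera(), null);
            g3d.render(overlay, new Transform());  // identity transform for the overlay mesh
        } finally {
            g3d.releaseTarget();
        }
    }
}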

OpenGL ES is developed by the Khronos Group, a member-funded industry consortium founded in January 2000 to create open, royalty-free APIs for authoring and playback of media on a range of platforms and devices. Khronos members include 3Dlabs, ATI, Discreet, Evans & Sutherland, Intel, NVIDIA, SGI, Sun Microsystems, Fujitsu, Mitsubishi Electric, PalmSource (the company that creates the Palm OS), Symbian, Panasonic, QUALCOMM, Samsung Electronics, Sony, Toshiba, and a number of other leading media and electronics companies.

OpenGL ES is a royalty-free, cross-platform API for 2D and 3D graphics on embedded systems (including handheld devices, appliances, and vehicles) that comprises a well-defined subset of desktop OpenGL. It is designed to handle the low-level implementation details of interfacing between software and graphics acceleration hardware, which is expected to become increasingly prevalent in embedded devices. Java Bindings for OpenGL ES, also known as JSR 239, addresses the need for a specification describing the way in which higher-level API components will bind to native 3D graphics libraries implemented using the OpenGL ES standard.

In this sense, OpenGL ES is a complementary technology to the Mobile 3D Graphics API for J2ME. OpenGL ES is available to developers requiring a low-level API that facilitates 3D graphics rendering using a subset of the OpenGL standard. The Mobile 3D Graphics API, in contrast, provides a higher-level, scene-graph-oriented API along with a lower-level mode that aligns with OpenGL ES to facilitate the development of efficient and portable embedded 3D applications using industry standards.
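For a flavor of the low-level drawing model such bindings expose, the sketch below assumes an interface along the lines of javax.microedition.khronos.opengles.GL10 that mirrors the OpenGL ES 1.x C functions; since the JSR 239 specification was not yet final as of this writing, treat the package and class names as a reasonable guess rather than a definitive API. Obtaining the GL10 instance requires EGL surface and context setup that is omitted here.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import javax.microedition.khronos.opengles.GL10;  // assumed JSR 239 binding

// Clear the frame and draw a single triangle through OpenGL ES-style calls.
public class TriangleRenderer {
    private final FloatBuffer vertices;

    public TriangleRenderer() {
        // Three vertices of a triangle in the XY plane (x, y, z per vertex).
        float[] coords = { -0.5f, -0.5f, 0.0f,   0.5f, -0.5f, 0.0f,   0.0f, 0.5f, 0.0f };
        ByteBuffer bytes = ByteBuffer.allocateDirect(coords.length * 4);
        bytes.order(ByteOrder.nativeOrder());
        vertices = bytes.asFloatBuffer();
        vertices.put(coords);
        vertices.position(0);
    }

    // The GL10 instance would be obtained from an EGL context created elsewhere.
    public void drawFrame(GL10 gl) {
        gl.glClearColor(0f, 0f, 0f, 1f);
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);

        gl.glMatrixMode(GL10.GL_MODELVIEW);
        gl.glLoadIdentity();

        // Fixed-function vertex arrays: point GL at the buffer and issue one draw call.
        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, vertices);
        gl.glColor4f(1f, 0f, 0f, 1f);
        gl.glDrawArrays(GL10.GL_TRIANGLES, 0, 3);
        gl.glDisableClientState(GL10.GL_VERTEX_ARRAY);
    }
}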

Summary

The world of mobile 3D is evolving rapidly thanks to new hardware and software technologies that promise to fundamentally change the type of content and applications that can be deployed to mobile devices. In many respects the state of the mobile 3D industry can be compared to the early days of the World Wide Web, and activity will only heat up as more powerful mobile computing devices find their way into customers' hands, growing the overall market for 3D content and applications in turn. As reported by Web3DNews.com, nearly every personal computer created today has the horsepower to handle rich, interactive three-dimensional content thanks to hardware accelerators and 3D software technologies similar to those expected to power the next generation of mobile devices. Much like the 3D revolution that began in the personal computer industry a decade ago, all of the enabling technologies are now in place for the mobile computing industry to experience a three-dimensional transformation of its own.

Aaron Walsh is Director of the Web3D Web (web3dweb.com) and Media Grid (mediagrid.org). He can be contacted at www.MediaGrid.org/people/aew/.

Nicholas Nagel is a Java instructor for Sun Microsystems (sun.com), and Web3D Web fellow (web3dweb.com). He can be contacted at [email protected].
