The term "HTML5" gets tossed around a lot lately, as if simply stating that you are using HTML5 means your website or Web app is automatically superior. The reality is quite a bit different. HTML5 is a great addition to the HTML spec and it can certainly be used to improve the Web experience when it's applied, but it has not led to an explosion of new Web apps. In fact, applications on the Web are still the exception rather than the rule. Why is that?
HTML5 contains a wide variety of welcome additions to the HTML spec. A few notable examples are LocalStorage, built-in support for audio and video, and many features that make data entry and data validation easier. This is just a taste of the goodness that is HTML5. The Canvas element enables higher-performance, client-side drawing and makes more interactive user interfaces possible.
LocalStorage allows data to be stored on the user's device rather than on some central server. And built-in support for audio and video means that these functions no longer require plugins. HTML5 is tremendously important for mobile Web content because you can use it to optimize for speed and reduce bandwidth, both of which matter when content is delivered over a mobile device's cellular connection.
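To make these features concrete, here is a minimal sketch of what they look like in practice; the file names, keys, and values are invented for illustration:

```html
<!-- Built-in video: no plugin required -->
<video src="intro.mp4" controls>
  Fallback text for browsers without built-in video support.
</video>

<!-- Canvas: client-side drawing surface -->
<canvas id="chart" width="300" height="150"></canvas>

<!-- Built-in data validation: the browser checks the format itself -->
<input type="email" required placeholder="you@example.com">

<script>
  // LocalStorage keeps data on the user's device, not a central server
  localStorage.setItem("lastVisit", new Date().toISOString());
  var lastVisit = localStorage.getItem("lastVisit");

  // Drawing on the Canvas element from script
  var ctx = document.getElementById("chart").getContext("2d");
  ctx.fillRect(10, 10, 50, 50);
</script>
```

Everything above runs natively in an HTML5-capable browser, with no Flash or other plugin involved.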
That's all good stuff, and it's just a part of HTML5. There are some great Web applications that take advantage of these features, Google Docs and Facebook being the best known. Still, why are there so few Web applications, and why do we still rely so much on our desktop applications? One reason is that Internet Explorer, which is still the dominant browser, only recently added support for HTML5, and that support is incomplete. However, I think if we look at the history of software development in general, we will find another important reason.
When my father first started writing software, he wrote machine code by flipping 16 physical switches (one per bit) and pressing a button to input them. Later on, machine code was replaced by assembly language, and then by what were called "higher-order" languages such as BASIC, Pascal, and later C. When the graphical user interface came along for desktop computers, development tools were created that took much of the busywork (such as coding the user interface) and made it visual. This allowed developers to focus on what really makes their applications unique. As processors have gotten faster and storage has become cheaper, it has become possible to abstract developers further and further from the "metal," as they say, making software development faster and more accessible to mere mortals. Almost no one programs in machine code these days, and there are few who could.
I witnessed the primitive nature of Web development firsthand. While watching a Web developer code the HTML for a page, I asked him why he wasn't using a visual, drag-and-drop tool to create the Web page. "No one uses those anymore," he told me. When I asked why not, he replied, "Because those editors add lots of meta tags to your HTML so they can load the page back into the editor. Eventually, you will have to edit the HTML yourself and those meta tags just mess up your HTML and get in the way." The same developer later explained to me how different the various Web rendering engines can be, requiring developers to test on all popular browsers and learn each browser's eccentricities, despite these technologies all being standards.
So why aren't there a lot more Web applications? Why haven't we seen all of our desktop apps moving to the Web? Because developing Web applications requires a developer to spend significant time learning the assembly language of the Web; these technologies require much more time to work with; there are few visual tools to use; and there are at least three different Web rendering engines developers need to understand well so they can deal with their eccentricities.
If we are going to see an explosion of Web applications, we need to take advantage of all the processing power we have to provide developers with the next layer of abstraction. This abstraction layer should remove the necessity of knowing the assembly language of the Web. It should provide visual development for creating the user interface. Finally, it should abstract the developer from the differences between various browsers. This Web abstraction layer, by removing the nitty-gritty details, also makes Web development accessible to mere mortals. This last bit is key because you won't have an explosion of Web applications without lots of developers. HTML5 alone is not going to be enough; not even close.
Before we will ever see Web apps really take off, development must be a great deal faster and far easier to learn. There are tools available on the market today that offer developers an abstraction layer for Web apps, but this approach still lacks the widespread acceptance needed to fulfill the promise of the Web.
Geoff Perlman is the founder and CEO of Real Software, makers of Real Studio.