It's no secret that the bulk of UI and UX (user experience) work today is being done on mobile devices, both phones and tablets. During the first three or four years after mobile emerged as an important development platform (that is, after the 2007 launch of the Apple iPhone), its UI design and development ran on a separate track from desktop metaphors. It is now clear, however, that the two styles are merging, and that the merger will look far more like mobile than desktop.
Perhaps the most dramatic step in that direction was the release of Microsoft's Windows 8 Metro UI last year, which was an obvious attempt to move the desktop to a mobile UX. Rather than relying entirely on desktop metaphors for its flagship OS, Redmond chose to embrace its mobile Metro UI, characterized by scrollable, blocky, colorful, tile-based elements. Metro works very well on Windows-based phones. Users of those devices are not presently numerous, but they are consistent in their appreciation of the UI. On tablets, such as the Win8 tablet I use (not a Microsoft Surface, but a Samsung Slate), the UI works well, although I occasionally pop into the Windows 7-style desktop metaphor due to its familiarity. On laptops and desktops, however, Windows 8 has encountered considerable resistance. I recently bought a laptop running Windows 8 as a test system and, I confess, using the new UI is truly frustrating and ultimately tiring. The poor reception of these devices makes sense: in the laptop version, the leap to the mobile metaphor is simply too great.
But Microsoft's error with Metro on the laptop and desktop is, in my view, simply a case of going too far too fast. It does not negate the fact that these platforms will eventually move to UIs driven by the mobile experience. A key part of that mobile experience is gesture recognition. Business-style laptops and desktops as yet lack gesture recognition, but that's beginning to change. In our lead feature article this week, Mike Riley looks at a new gestures SDK from Intel that relies on a webcam to map user actions to events and objects on the screen. I expect such tools will eventually be built into laptops and desktop displays. Already, some laptops have touch-sensitive screens, much like tablets.
Desktop apps, just like Web apps, are also showing the effects of mobile. This is clearly visible in the new designs of icons and dialog boxes. The latter, which were often complex, multi-paneled widgets that required lots of interaction, have now been greatly simplified, with far fewer options in a single pane. In addition, widgets that substitute for typing data values are becoming more widespread. Sliders, for example, are now a much more common way to enter values, and they will continue to gain popularity, as will spinner controls.
Drop-down menus are evolving, too. The former style of multiple cascading menus is being replaced. Drop-downs today have a smaller range of options (because mobile screens are so small and the entries must be big enough that a finger touch can select them), and they never use cascading menus. The cascading menu is instead being replaced by a dialog. The dialog might have several panes if there are many choices to make, but the choices will be presented in small numbers with easy selection options.
Icons, too, are changing. Because of their reduced size on small screens, they typically make little use of color. Hence, the decision in Visual Studio to use mostly black icons. (There is a certain confusion here, as Visual Studio is not likely to be run on devices other than laptops and desktops. Even though technically I can run it on my tablet, it's clearly not in its element there and its usability is greatly compromised on such a device. So, as I mentioned in my review of Visual Studio 2012, the choice of mobile UI elements in Visual Studio seems a poor one. Like Windows 8, it's premature, and it simply feels like Microsoft pushing Metro much too hard.)
In Web-based apps, the mobile metaphors are finding greater traction as well. One need only look at how the new Google Mail (GMail) interface has changed over the last year to see the effects of this new direction: All icons are monochrome, the number of buttons is very limited, and a More button keeps the additional options off the main screen. That button leads to a short drop-down menu of additional actions. The UI is comparatively clutter-free, and the most common actions can be performed with a single response (click, press, etc.) from the main menu.
Of all the constraints that mobile devices impose, the one I'd most like to see migrate to the laptop and desktop is the intolerance for delays. On a phone or a tablet, the allowed delay between a gesture and a device response is about half a second. Any more than that, and users will start repeating the action or shaking the device. On laptops and desktops, the tolerated delay is much longer: it takes several seconds, often more than five, before a user starts trying remedial actions. Software has always felt comfortable stretching this limit. Despite the benefits of faster processors and more cores, applications have not seemed to get any faster in their response times. Let us hope the new tide changes this.
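The half-second budget described above can be made concrete as a simple check in an app's instrumentation. This is only an illustrative sketch: the 0.5-second threshold is the figure cited in the column, while the function and names here are hypothetical, not part of any particular framework.

```python
import time

# Per the column: on phones and tablets, users tolerate roughly
# half a second between a gesture and the device's response.
MOBILE_BUDGET_SECONDS = 0.5

def within_budget(handler, budget_s=MOBILE_BUDGET_SECONDS):
    """Run a UI event handler and report whether it finished
    inside the responsiveness budget."""
    start = time.monotonic()
    handler()
    return (time.monotonic() - start) <= budget_s

# A fast handler passes; a slow one would trigger repeated taps.
print(within_budget(lambda: time.sleep(0.1)))  # True
print(within_budget(lambda: time.sleep(0.7)))  # False
```

Instrumenting handlers this way during development would flag the operations that desktop software has long gotten away with stretching past the limit mobile users will accept.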
Either way, it's clear that the center of the UI/UX universe has moved to mobile, and the concepts being pioneered there are being backported to laptops and desktops, as well as to Web apps. If this trend means removing my hand from the mouse or keyboard to get things done, I don't see myself embracing it soon. However, if it continues to emphasize simpler interactions, easier navigation, and more responsive applications, I'm all for it.