Earlier in life, I worked for Price Waterhouse (the merger that would add "Coopers" to the name was still a thing of the future). I was in charge of the firm's technology forecasts: massive books, written by external experts in conjunction with internal technologists, that covered every aspect of technology. While the analysis was frequently written by the external specialists, the predictions were written in house. As such, we had a professional responsibility to forecast trends, and the added responsibility of figuring out how to do it right. This is unlike the situation with current pundits who, at this time of year, love to post predictions that often just present the reality they hope for or a scenario controversial enough to draw page views. At Price Waterhouse, in contrast, we were charged with coming up with predictions that would be useful to IT managers in their decision making, which required a finer understanding of what can intelligently be said.
The first thing we did was remove the constraint of predictions for "this year." Prognostication is impossible if you have to both predict events correctly and state whether they will occur within the next 12 months. A full year is both a very long and a very short period in tech. So, our predictions were, at their shortest, aimed at the next three years.
In terms of their content, we found that predictions fall into two broad categories: those that are driven by quantitative analysis and those that derive more from pure analysis of trends and speculation on their direction. The two often work hand in hand. For example, a quantitative prediction might be: "At the current rate, the number of consumers using tablets or phones as their primary computing device will surpass those who use PCs and laptops in the next 12-18 months." From that position, it's then possible to make qualitative conjectures: "Mobile apps will become the dominant programming genre outside the data center."
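The quantitative half of that example can be made concrete. The sketch below estimates when one adoption trend overtakes another by simple linear extrapolation of current net adds; the device counts and growth rates are invented for illustration, not real market data, and the function name is my own.

```python
import math

def months_until_crossover(a_now, a_growth, b_now, b_growth):
    """Months until series A overtakes series B, assuming constant
    monthly net adds (linear extrapolation); None if it never does."""
    gap = b_now - a_now                 # how far A trails B today
    closing_rate = a_growth - b_growth  # how fast the gap shrinks each month
    if gap <= 0:
        return 0        # A is already ahead
    if closing_rate <= 0:
        return None     # at these rates, A never catches up
    return math.ceil(gap / closing_rate)

# Hypothetical figures: millions of primary-device users, monthly net adds
print(months_until_crossover(a_now=900, a_growth=25, b_now=1100, b_growth=10))
# prints 14, i.e., within the "12-18 months" window such a prediction claims
```

The point of the exercise is not the precise number; it's that the quantitative step produces a defensible window, from which the qualitative conjecture (mobile apps becoming the dominant genre) can then be argued.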
The qualitative predictions and insights fall into two broad categories:
- Interesting predictions. These make the reader think about unexpected possibilities that are plausible and edging toward probable. They paint a picture of a reality that is engaging and instructive. Such a prediction today might be: "In the next three years, all data storage will use encryption by default." Or: "In the next three years, all enterprises will rely on clouds, either internal or external, as their primary computing platform."
- Unlikely predictions. These typically represent significant changes, but with a more uncertain likelihood of coming true. Unlike the publicity-seeking predictions I alluded to earlier, these might well come true if a given series of events takes place. Keying off the earlier predictions, such a scenario might be: "As computing devices move quickly to mobile and the server moves to the cloud, Apple will return to the enterprise computing space and begin delivering cloud servers bundled with proprietary software that handles its end-point devices uniquely well." As I said, less likely to happen, but significant if it does.
These unlikely predictions represent changes with industry-transforming repercussions. While not the same as "Black Swans" in quantitative forecasting, they are changes that, if successfully brought to fruition, would reshape the technology landscape. At Price Waterhouse, we tended to avoid such predictions because they run a high risk of eroding credibility. (What did you think when I suggested that Apple would get into the enterprise game?) They also offer little benefit: What can you do with the information, even if it turns out to be right?
As an editor, I am often asked by vendors and readers for my take on where things are headed. What people most want is the interesting but unlikely prediction. But, as I mentioned, taking that bait is dangerous. Some people are insistent, however. So, here is the prediction I've been holding onto for the last few years; it derives from trends I've already touched on: I expect that within the next five years, computing will become a utility-provided service. All devices will become portals into our own personal PCs and data, all of which will live in the cloud. (This is a vision the Google Chromebook shares, I might add.) The devices will do only the locally necessary computing (graphics, compression, and so on); everything else, including much of the operating system, will remain in the cloud.
There are several important steps on the way there. The first is the increasing use of PaaS as a better platform for specific tasks, which I discussed in July 2013. The second is the advent of PaaS-oriented programming, where once again the PC serves only as a tool to host a dev environment that plugs into a cloud-based build and delivery system; Visual Studio 2013 has taken a major step in this direction. The final stage will be the point at which PaaS-hosted work is the norm.
In the future, I'll examine the benefits and difficulties of this kind of computing.