I was at home one day ranting about the iPad around the time it was newly announced–pointing out the obvious to my captive audience. You know the drill: It is nothing but an overgrown iPhone/iPod. . . . It is way too expensive. . . . It requires some sort of data plan, and so on. When I was finally done, my wife politely asked me, “Well, what is the next big thing?” I suspect my answer was flippant and insincere, but her question made me start thinking. Even though we live by Moore’s Law, there actually haven’t been any recent, significant breakthroughs in the world of information technology. Microcircuits have gotten smaller and more efficient, displays have evolved, hard-drive technology is improving, and intermachine communication is expanding; but a substantial leap in technology hasn’t happened since the invention of the transistor. Moore’s Law is about engineering refinements, not technology breakthroughs. I suspect the next big thing will probably have something to do with new ways of generating and transporting energy. But that may just be wishful thinking.
Then there are the applied technologies the computer press always buzzes about and consulting firms write outrageously priced white papers on. I mean, does anyone truly think the cloud or virtualization is the next big thing? Just because CIOs are finally starting to realize there may be a place for both in their organizations doesn’t make them new or cutting edge. Don’t get me wrong. I am an advocate of both–in fact, a heavy user of both–but both have been around for quite some time.