I was at home one day ranting about the iPad around the time it was newly announced–pointing out the obvious to my captive audience. You know the drill: It is nothing but an overgrown iPhone/iPod. . . . It is way too expensive. . . . It requires some sort of data plan, etc. When I was finally done, my wife politely asked me, "Well, what is the next big thing?" I suspect my answer was flippant and insincere, but her question made me start thinking. Even though we live by Moore's Law, there actually haven't been any recent, significant breakthroughs in the world of information technology. Microcircuits have gotten smaller and more efficient, displays have evolved, hard-drive technology is improving, and intermachine communication is expanding; but a substantial leap in technology hasn't happened since the invention of the transistor. Moore's Law is about engineering refinements, not technology breakthroughs. I suspect the next big thing will have something to do with new ways of generating and transporting energy. But that may just be wishful thinking.
Then there are the applied technologies about which the computer press always buzzes and for which consulting firms create outrageously priced white papers. I mean, does anyone truly think the cloud or virtualization is the next big thing? Just because CIOs are finally starting to realize there may be a place for both in their organizations doesn't make them new or cutting edge. Don't get me wrong. I am an advocate of both–in fact, a heavy user of both–but both have been around for quite some time.
My guess is business intelligence will be the hot topic in applied technologies for 2010. What is interesting about these three topics–the cloud, virtualization, and business intelligence–is how they are interconnected. The cloud and virtualization use technology to reduce the on-premises footprint of machines and the peripheral equipment associated with them (data centers, chillers, power sources, switches, etc.). The push for business intelligence is a desire to justify the data processing footprint we already have: Given that all corporate data resides in databases and other forms of electronic content, why can't we surface that data in ways that allow us to make meaningful business decisions? So, we have three hot topics, and they all revolve around efficient use of computing resources.
Too Many Devices
Which leads me to what I really am here to discuss: My iPad rant was just an extension of the frustration we all feel. How many individual devices do we need? Collectively, we spend a lot of time and energy trying to reduce the number of physical machines in our data centers, but we generally pay less attention to our individual and personal devices. In the course of an average day, I use the following:
• BlackBerry–phone, e-mail (three accounts), calendar, address book, GPS.
• iPhone–all of the above plus social networking, Kindle, Internet, and 30-plus apps.
• Netbook–all of the above plus productivity applications, tax program, online banking, remote desktop.
• Work laptop–most of the above plus programming software, code repository, CRM, PM tools, advanced editing tools.
• Various and sundry laptop and desktop machines used for a little bit of everything.
• Approximately 15 virtual machines that run the gamut from development servers to replicas of various client systems.
Nice Work, Apple
Living in a wired world is fun, but I am starting to get device frustration. I cannot possibly remember all the portable devices I have used over the past few years, ranging from Windows Mobile machines to devices with proprietary software. I find it amusing that Apple has set the standard for mobile devices with its iPhone/iPad gadgets. Encouraging software developers to create applications for the iPhone operating system not only has led to a cottage industry for them but also has enhanced the value of the product. I see businesses adopting the iPhone as the standard for corporate communications or, at least, welcoming it into the mix.
I recently had lunch with the CIO of a billion-dollar firm that had standardized on the iPhone for corporate communications after years of trying other devices. The primary business driver was that users were easily able to create applications for the device that displayed critical information about the operational status of various geographically dispersed plants and facilities. The firm had created similar applications for other devices, but those applications were not perceived as user-friendly. The iPhone application was. Furthermore, the firm is looking very seriously at using iPads to replace laptops currently in use, primarily for data entry. Suddenly, the "nothing but a big iPhone" rap makes the iPad attractive. And if it works for one organization, it will work for others.
I am not suggesting Apple has discovered the Holy Grail of individual computing devices–far from it. What I am suggesting is maybe it is time we start to rethink how we design portable computing devices based upon how we use them.
All We Really Need
The basic requirement for a portable computing device is that it can display information. The second requirement is that it has a means of connecting to the Internet or a network so it can fetch data. The third is that it can accept user input and use that input to change data locally or remotely. The fourth is that it can run various applications. That's it. Anything else is a refinement.
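The four requirements above can be expressed as a simple checklist. The sketch below is purely illustrative; the capability names are my own shorthand, not any real device specification.

```python
# A toy checklist encoding the column's four baseline requirements for a
# portable computing device. The capability names are invented shorthand.

REQUIREMENTS = (
    "display",        # 1. can display information
    "connectivity",   # 2. can reach the Internet/a network to fetch data
    "input",          # 3. accepts user input to change data locally or remotely
    "applications",   # 4. can run various applications
)

def is_viable_device(capabilities):
    """Return True only if every baseline requirement is met."""
    return all(req in capabilities for req in REQUIREMENTS)

# Anything beyond these four is a refinement, not a necessity:
print(is_viable_device({"display", "connectivity", "input", "applications"}))
print(is_viable_device({"display", "input"}))  # missing connectivity and apps
```

The point of the exercise: a device either clears this bar or it doesn't; RAM, storage, and processor speed are refinements layered on top.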
Consider the laptop I am using to compose this column. It is ridiculously expensive. It has two large, fast hard drives; 8 GB of fast RAM; a fast dual-core processor; as well as lots of bells and whistles such as built-in Web cam, Bluetooth, wireless capability, high-end graphics, etc. I like this machine, but do I really need a portable device with such high-end specifications? Not right now. Word processing is not very machine intensive. Never mind that I also have two e-mail clients open and am streaming satellite radio. The deal is, I do not require this much computer power to do what I am doing at this time. However, I do need that much power on those occasions when I am running two different virtual machines and debugging code on the host operating system.
Hence, the issue: We tend to specify user machines based on peak operating requirements. If a user ever needs to run two or three virtual machines (or even one VM with significant requirements) or multiple fat-client applications simultaneously, we provide a machine with high-end specifications. Maybe it is time to think about what we have done in the data center and apply it to the desktop/laptop/personal device.
I suggest a better approach: supply users and knowledge workers with baseline machines that meet those basic requirements, and enable them to connect to external systems that deliver the additional computing power they require–only when they truly require it.
A Modest Proposal
Most software developers I know have long since discovered the power or configuration of their base system has little to do with their productivity. I have known a number of .NET developers who use an Apple operating system for their base machine, which they then use to connect to virtual machines configured with the appropriate Microsoft server system where they accomplish their actual .NET development.
And that makes me think I (or most users) don't really need to have machines with crazy specs. This is where the cloud and virtualization come into play for user machines. I rarely require the advanced features of Microsoft Office. In fact, I use Microsoft Works on my Netbook. I have legal access to the Office bits, but I don't want to bog down my Netbook with all the overhead. I can remote onto my killer laptop when I need the full stack of Office applications. Or I could remote onto a shared "desktop" machine provided by my employer and consume those shared computing resources. That "desktop" could be a well-configured virtual machine with tons of RAM and lots of connected network storage.
Imagine virtual machines specifically configured for each job function at your organization. Each virtual slice could be shared by a number of workers who would access that slice using a relatively inexpensive personal machine. Do you notice how we are combining the cloud and virtualization? The bottom line is computing resources are pricey in terms of capital expense, carbon footprint, licensing cost, and maintenance. Providing each corporate user with an inexpensive baseline machine that then can connect to the higher-end resources makes perfect sense.
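To make the idea concrete, here is a hypothetical sketch of per-role virtual "slices." Every role name, VM size, and sharing ratio below is invented for illustration; the point is only the shape of the arrangement: one well-provisioned slice, shared by several workers on cheap baseline machines.

```python
# Hypothetical per-job-function VM profiles. A "slice" is one shared virtual
# machine sized for that role's peak needs; several workers share each slice.
# All numbers are illustrative assumptions, not recommendations.

ROLE_PROFILES = {
    "data_entry": {"vcpus": 2, "ram_gb": 4,  "workers_per_slice": 20},
    "analyst":    {"vcpus": 4, "ram_gb": 16, "workers_per_slice": 8},
    "developer":  {"vcpus": 8, "ram_gb": 32, "workers_per_slice": 4},
}

def slices_needed(role, headcount):
    """How many shared slices a role needs for a given headcount."""
    per_slice = ROLE_PROFILES[role]["workers_per_slice"]
    return -(-headcount // per_slice)  # ceiling division

# 45 data-entry workers share 3 slices instead of owning 45 fat laptops:
print(slices_needed("data_entry", 45))
```

The capital question then becomes how many slices to provision per role, rather than how powerful each individual's machine must be.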
In my world, most people I work with have portable machines (laptops, notebooks) that are relatively expensive. But when I walk around the office in the evening, I see cubicle after cubicle with a laptop in a docking station, powered up but doing absolutely nothing. Maybe those users really don't need a portable system. Maybe they would be better served with a lightweight desktop system that is essentially a thin-client machine that can connect to serious computing resources when the need arises. These machines and their associated peripherals should be powered down or hibernated when not in active use. I am not a known advocate of "green" systems, but I hate to see resources wasted. Once you commit to spend capital funds on hardware, you need to be able to show that hardware is, in fact, used in a manner that justifies the expense. If you are not joining all your resources to a grid to maximize the use of that hardware, then I suggest you scale down the capabilities of that hardware to meet average demands–not peak demands. Offload peak demands to more capable physical resources that can approach full utilization through sharing.
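The average-versus-peak argument is just arithmetic. A quick back-of-the-envelope sketch, with invented demand numbers, shows why sizing for peaks leaves hardware idle most of the time:

```python
# Illustrative capacity arithmetic for the sizing argument above.
# "Demand" units are arbitrary (think normalized compute load); all numbers
# are made-up examples, not measurements.

def utilization(avg_demand, provisioned_capacity):
    """Fraction of provisioned capacity that average demand actually uses."""
    return avg_demand / provisioned_capacity

# Sizing every personal machine for its rare peak wastes most of its capacity:
peak_sized = utilization(avg_demand=1.5, provisioned_capacity=8.0)  # ~0.19
avg_sized  = utilization(avg_demand=1.5, provisioned_capacity=2.0)  # 0.75

print(peak_sized, avg_sized)
```

Size the personal machine near the 75 percent case and push the occasional peak onto shared resources, which stay busy precisely because many users' peaks rarely coincide.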
What do we do with our employees who truly require a mobile device? We need to provide them with a machine that meets the four requirements for a mobile computing device. However, it must not merely satisfy those requirements; it must fulfill them well and in a manner that makes users want to use these machines. The devices can be small, but if they are very small, they must be able to dock into an environment with a larger display and a real keyboard. They must have multiple means of Internet connectivity. At a minimum, I would suggest a combination of a hard-wired network connector, 802.11 wireless capabilities, and an on-demand data service using 3G or similar technology. There is no need for extensive local storage, and since most heavy-duty computing will be offloaded to shared resources, we need only as much RAM as the operating system requires.
So, what does this device look like? I am not sure, but I would suggest something like a small touchscreen Netbook with a solid-state hard drive and a hybrid operating system that combines the best features of the iPhone OS and Windows. Make it fast and lean and mean. And that, my friend, will be the next big thing as far as portable devices go. Oh . . . and I really don't care whether I require two devices. I don't see a problem with carrying a super-portable phone device that can meet 50 percent of my needs and a hybrid portable machine that can handle the rest. Do you know what would be really cool? How about a dual-boot machine? Windows and iPhone OS? I doubt we will see that, but I just might buy one if it comes to be. TD
Please address comments, complaints, and suggestions to the author at
prolich@yahoo.com.
© Touchpoint Markets, All Rights Reserved.