I was speaking recently with the CIO of a multinational corporation about his plans to integrate technologies across some 150-plus global offices. One thing he said really struck home. He told me that in his more than 20 years in IT, he had never experienced a failure on an AS/400. Sure, there was hardware that had to be replaced and the occasional software glitch, but in his career, he never lost any data because of a catastrophic failure on a midrange system.

Based on that, he said he intended to keep his data on iSeries machines as long as he could. Furthermore, he was developing a Windows-based thin-client dashboard for the global enterprise built on Microsoft Office SharePoint Server 2007. That conscious decision to marry IBM and Microsoft technologies for global IT made me rethink some of my earlier opinions about the future of mainframes. Just last month in these pages I spoke somewhat disparagingly about the future of mainframe technology in an Internet world.

It also made me think about where we would be if the 1985 agreement between IBM and Microsoft had worked out a little better. The result of that joint venture, OS/2, was intended as a successor/replacement for DOS and Windows. The name came from IBM, which intended to launch the operating system on its new PS/2 line of personal computers. Both failed miserably, and Microsoft withdrew from the relationship in the early '90s. OS/2 gained some small success among anti-Redmond hobbyists and in voice-mail, telecom, and ATM systems. I actually have acquaintances who insist on using OS/2 Warp to operate their PCs. You notice I didn't say friends. I can tolerate only a certain amount of irrationality in my inner circle.

That CIO's decision to stick with "old" IBM technology and integrate it with Microsoft "new" technology made me think beyond my own prejudices, though. This column is named "Trends & Tech," and that implies we should be discussing the latest trends in technology. Just where is technology heading these days? Where can we expect to be a year from now? What are corporate IT departments concerned about? What should they be concerned about?

We seem to have reached a plateau in information technology. Silicon-based chips and processors have gotten us to our present state. And while hardware continues to improve in rough accordance with Moore's law, there really hasn't been a fundamental change in the underlying model in the last 20 years.

King Solomon complained about the state of man: "The thing that hath been, it is that which shall be; and that which is done is that which shall be done: and there is no new thing under the sun." He could have been talking about IT. Things we have discussed as trends in this column over the years have come into general use: virtual machines, blade servers, SANs, distributed computing, XML, SOA, etc. Those "hot topics" are old hat now, and they probably were when we first wrote about them.

Information technology is waiting for the next paradigm shift, the next quantum breakthrough that once again will revolutionize the world. The personal computer changed the way we do business forever, but the PC did not represent a technological breakthrough. It simply was the logical culmination of two things: Claude Shannon's seminal work on information theory in the late '40s and the creation of the first transistor in 1947. Shannon's two-part paper, "A Mathematical Theory of Communication," was published in 1948. Coincidentally, both events occurred at Bell Labs, which makes you wonder what we would be getting out of Bell Labs today had it been permitted to survive and been funded as it was in its heyday. Regardless, the personal computer was the inevitable result of those two occurrences.
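To put a formula to "information theory": the central quantity of Shannon's 1948 paper is the entropy of a source, which for the first time made "information" a measurable thing. This is the standard textbook statement of it, not a quotation from the paper:

```latex
% Entropy of a source that emits symbol i with probability p_i:
\[
  H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i \qquad \text{(bits per symbol)}
\]
% H is the average number of bits needed to encode each symbol --
% the limit that no lossless code can beat.
```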

Two circumstances must exist for truly new technologies or paradigm shifts to happen. First, there must be the theoretical idea that something could happen if certain conditions are met; information theory was such a concept. Second, it must be feasible to create machines that apply that theory; the transistor was the engineering technology that allowed Claude Shannon's theories to come to fruition. So, really significant changes in the way man interacts with his environment require the big idea, the new theory, and the implementation of that theory. The implementation often is just engineering or applied science.

Which leaves us wondering: What is the new idea, the new theory, that is going to change the way we process information? Is that even a valid question? It may just be that the current state of the art, in which we transform all information into a digital state and then perform operations on that digital state, is as far as the paradigm goes. If so, the current state already has reached its highest achievable plateau, and the only change we can expect is better hardware on which to run these digital systems.

Discussions of quantum computing provide an interesting diversion because they offer more than two possible states for a bit of data. The digital world has us stuck with 0s and 1s. Quantum bits, a/k/a qubits, possess an infinite number of possible states. Will quantum computers transport us away from the limitations of digital computing? I am not sure.
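For readers who want the distinction made concrete, here is the textbook formulation in standard notation, not tied to any particular machine:

```latex
% Classical bit: b \in \{0, 1\}.
% Qubit: a unit vector in a two-dimensional complex space,
\[
  \lvert\psi\rangle = \alpha\lvert 0\rangle + \beta\lvert 1\rangle,
  \qquad \alpha, \beta \in \mathbb{C},
  \qquad \lvert\alpha\rvert^{2} + \lvert\beta\rvert^{2} = 1 .
\]
% Measuring still yields 0 or 1 (with probabilities |alpha|^2 and
% |beta|^2), but before measurement the pair (alpha, beta) ranges
% over a continuum -- the "infinite number of states" above.
```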

I also am not sure we even need to be trying to break free from digital computing. For the purpose of running a business that requires certain necessary components–such as payroll, accounting, risk assessment, performance indicators, and so on–digital computing probably is the best solution.

We confuse the issue when we try to go beyond the obvious limitations of information theory and information technology. The current state of the art is what it is, and speculation about things such as artificial intelligence transcends the current paradigm. We must not assume that because we can observe synapses firing in the human and animal brain, we can duplicate thought or intelligence by building massively parallel machines that mimic what we have observed. The real world is arguably analog, not digital, and it is fallacious to assume we can duplicate things such as human intelligence with binary bits.

So, let's accept the limitations imposed on our art for the time being and see in what direction we should be heading. I get to talk to a lot of people in various industries who are making those kinds of decisions, and I keep hearing the same thing over and over again. Virtually every business has multiple applications running on different platforms, built on different technologies, acquired at different points in the corporate life cycle, all of which are absolutely critical to the success of the business. Every company I talk to wants to integrate its existing systems into common functional front-end systems that can be accessed by any worker, any time, anywhere. And that always boils down to a single common factor: everyone wants to be able to do business over TCP/IP using HTML. The Internet isn't going to go away; legacy applications aren't going to go away; the fact that Microsoft owns the desktop isn't going to go away. For the foreseeable future, more resources will be spent on integrating existing applications using Web clients than on any other trend in technology. I know you already are doing it, and I know you will continue to do so.
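As a toy illustration of that lowest common denominator, the sketch below turns rows pulled from some back-end system into plain HTML any browser can render. The function name, fields, and sample data are all invented for the example, not taken from any product mentioned here:

```python
from html import escape

def render_dashboard(title, records):
    """Render rows from a back-end query (fields invented for this
    example) as a plain HTML table any browser can display."""
    rows = "".join(
        "<tr>" + "".join(f"<td>{escape(str(v))}</td>" for v in rec) + "</tr>"
        for rec in records
    )
    return (
        f"<html><head><title>{escape(title)}</title></head>"
        f"<body><h1>{escape(title)}</h1>"
        f"<table>{rows}</table></body></html>"
    )

# Two rows as they might come back from a midrange query.
page = render_dashboard("Open Orders", [("A-1001", "ACME", 1250.00),
                                        ("A-1002", "Globex", 310.50)])
```

The point of the sketch is only that once the output is HTML over TCP/IP, it no longer matters which platform produced the rows.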

The upshot of all this integration is that vendors are being forced to work in collaboration with each other. No single vendor has a product or product line so overwhelmingly entrenched that it can afford to ignore the competition. As I said before, Microsoft owns the desktop. No one owns the data; Oracle, IBM, and Microsoft all have large islands of users. There are many line-of-business applications: multiple ERP, HRM, and CRM packages available and in use. All of these islands need to be integrated. I say "need to be" because the vendor that chooses not to integrate with the other players will be forced into other ventures, such as making MP3 players and telephones.

Look for vendors to start playing together nicely. Look for things such as Windows server systems running on iSeries machines. Look for Larry Ellison and Bill Gates to go sailing together. (Just kidding–wanted to see whether you were paying attention!) There are enough IT dollars floating around to support a lot of quality software. Right now, corporate end users are developing integrations between disparate systems. Many of those integrations are one-offs, which is a terribly inefficient way to go about things. It is time for vendors to step up and start offering workable cross-platform/cross-vendor integrations. And I am not talking about Web service layers that allow a peek inside the system.

So, if most organizations already are on board with the trend toward integration, what is the trend many are missing? Training. When you upgrade your internal systems to Vista or Office 2007, your internal help desk will be overwhelmed if you haven't properly trained the rank and file. It is my experience that most knowledge workers are barely computer literate. They have learned how to do their assigned tasks by watching and mimicking other workers. Take away their familiar environment, and they will be lost.

Vista represents the first real change in the Windows desktop environment since the delivery of Windows 95. It is different enough that major hardware vendors were forced to bring back Windows XP on new machines; consumer buyers didn't like the change. Office 2007 represents the first real change in user interface since Office debuted, and once again it is very different. To most users, different means bad.

I have had the privilege to listen behind the scenes to the help desk at Fortune 20 companies. The level of computer incompetence among corporate users is astonishing. I have heard highly compensated workers being walked through how to plug their new $3,000 laptop system into the electrical grid. One user actually said, "I will not crawl under my desk to plug this in. Please send a support person." That one ended in a stalemate. I have heard users talked through plugging a CAT5 cable into their laptop–that took about 20 minutes. I have heard tech support on a four-hour call talking someone through connecting to the Internet while at home. In case you're wondering, I never make up stories.

For some unknown reason, most organizations assume new hires have some basic computer literacy. Big mistake. All employees should be tested and, if they are found lacking, should be required to undergo a modicum of training. We all have dealt with senior executives who scorn the PC on their desk, saying they don't require a computer to manage their company. If you take the time to dig into the problem, you usually find they don't know how to use the basic productivity tools and are ashamed to admit it. Even CEOs can benefit from running what-if scenarios in a spreadsheet application.
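A spreadsheet what-if is nothing more than recomputing one formula under different assumptions. A minimal sketch, with every figure invented for illustration:

```python
def projected_profit(revenue, growth_rate, margin):
    """One cell's worth of spreadsheet logic: next year's profit
    given this year's revenue, a growth assumption, and a margin."""
    return revenue * (1 + growth_rate) * margin

# What-if: vary the growth assumption, hold margin at 12%.
baseline = projected_profit(10_000_000, 0.05, 0.12)  # modest growth
upside   = projected_profit(10_000_000, 0.15, 0.12)  # strong growth
```

Change one input, watch the output move: that is the whole exercise, and it takes an executive about a minute to learn.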

As we purchase and install new productivity software, it is absolutely essential that users be properly trained in how to use it productively. If they don't use it productively, it really isn't productivity software, is it?

So, there you go. The two major trends I see in IT today are integration and training. The rest is all spin and buzz, sound and fury. Your experience may vary. But I doubt it.

© Touchpoint Markets, All Rights Reserved.