Think Client
Dumb client, thick client, smart client: to make the most of computing power, thin may be in, but there are benefits to being smart.
BY PAUL ROLICH
Here we go again. Computer science is incredibly simple and straightforward, yet it seems we always are creating new methods of doing the same things. We currently are stuck in a digital world. The next great leap forward in computer science will be analog computing. For now, everything ultimately comes down to 0s and 1s, ons and offs: discrete quantum-like packages of information. This means at the machine level, we have been doing exactly the same thing for the last 60 years: moving, jumping, copying, adding, and subtracting. The rest is all fluff or hyperbole. And that means we collectively feel the need to keep reinventing the wheel, so to speak. I mean, if we were all still doing top-down programming with branching, we wouldn't be worth nearly as much as we are paid, would we?
This need to keep reinventing things inevitably leads to new paradigms for program design. One of Microsoft's current new paradigms is called smart client. Smart clients are easily deployed and managed client applications that provide an adaptive, responsive, and rich interactive experience by leveraging local resources and intelligently connecting to distributed data sources.
Back when I started in this business, all processing and data access were accomplished on the server (mainframe), and the only client we had was a card reader or a line printer. You know the rest: Dumb video terminals came next and then smart terminals that had a processor and were able to accomplish rudimentary processing on data before it was sent to the mainframe. When PCs came around in the '80s, we had to learn an entirely new way of doing business. Each knowledge worker was able to harness his or her own processing power, and we learned to create stand-alone productivity applications. First-generation PC applications were stand-alone programs that allowed users to manipulate data in documents, spreadsheets, and databases. Before Windows, each application had to handle all peripheral devices on the PC, from printing to writing to the screen. When Microsoft brought us Windows with a readily available API, PC programming became fun. For the first time, programmers easily could reuse already created code to control input and output devices on the PC. These stand-alone PC applications became known as thick clients, although the term would have been meaningless at the time.
Along with Windows came a whole new set of programming tools (Visual Basic, MFC, Borland C++, and Delphi), which brought a whole new generation of programmers into play. These guys didn't need to know assembly language or how to manipulate a pixel on a screen. Object-oriented programming created a level of abstraction that took us another step away from machine dependence.

While all this was going on, the business community was beginning to question the real value in the PC. With each knowledge worker in the enterprise operating as a stand-alone processing center, productivity was being lost. PCs were turning managers into secretaries. The programming community responded by creating a new PC paradigm: n-tier applications. Essentially, an n-tier application worked like this: A thick client program on individual workstations accessed corporate data on a network server. Two-tier applications performed all processing on the client side; the remote database server was treated just as if it were a local drive. It wasn't until DCOM (Distributed Component Object Model) came along that processing began to be distributed among the various levels of the architecture. Passing procedure calls from one computer to another was a tough nut to crack, but when it was accomplished, we finally had true enterprise-worthy applications. DCOM, Java, EJB (Enterprise JavaBeans), and other object models made it possible to create workstation applications that accessed middle-tier applications, which in turn accessed enterprise databases. Some of the processing was transferred to the middle tier, which was designed to handle all business logic. Theoretically, the client could be replaced by another client as long as the new client made the same procedure calls to the middle tier. This also had the effect of removing functionality from the PC or workstation, so the client application began to get thinner.
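To make that swap-the-client point concrete, here is a minimal C# sketch of the n-tier idea. The names (IPolicyService, DesktopClient) are hypothetical, and this is an illustration of the pattern rather than the actual DCOM or EJB plumbing: as long as every client makes the same calls against the middle-tier contract, any client will do.

```csharp
using System;

// A hypothetical middle-tier contract. Business rules run behind it,
// and only the middle tier talks to the enterprise database.
public interface IPolicyService
{
    decimal GetPremium(string policyNumber);
}

// One of many possible clients. Swap in a Web client that makes the
// same calls and the middle tier never notices the difference.
public class DesktopClient
{
    private IPolicyService service;

    public DesktopClient(IPolicyService service)
    {
        this.service = service;
    }

    public void ShowPremium(string policyNumber)
    {
        Console.WriteLine("Premium: " + service.GetPremium(policyNumber));
    }
}
```

A browser-based front end could consume the same IPolicyService without the middle tier or the database ever changing, which is precisely what made the architecture enterprise-worthy.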
Thin Client
The next step was inevitable. If we can move some of the processing off the client machine, why not move all of it? Thin client applications were born. Essentially, a thin client application is one that uses HTTP to connect a client machine to a remote server where all the important processing is done. The client machine uses a shell application (usually a Web browser) to access the remote application. Thin client means all I need is a compliant browser. Technologies that enable effective thin client applications include Microsoft .NET and Sun J2EE, along with standard protocols such as HTTP, SOAP, and XML.
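A thin client in code is almost nothing at all, which is exactly the point. The C# sketch below (the URL is made up) shows a client whose entire job is to issue an HTTP request and display whatever comes back; every line of business logic lives on the server.

```csharp
using System;
using System.IO;
using System.Net;

public class ThinClient
{
    public static void Main()
    {
        // All processing happens on the server; the client just asks
        // for a URL (hypothetical here) and renders the response.
        WebRequest request = WebRequest.Create("http://example.com/quote?policy=123");
        using (WebResponse response = request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd()); // the browser's whole job
        }
    }
}
```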
So What Now?
We have run the full gamut: from the mainframe world, where all processing is done on the server, to stand-alone PC applications, where all processing is done on the workstation, back to thin client applications, where all processing is done on the server. We have accomplished all of this at the same time we have created extremely inexpensive workstations with killer processors, tons of inexpensive RAM, and virtually unlimited hard-drive space. I sense a bit of a paradox here. And so does Microsoft. Its suggestion is to combine the best of thick clients with the best of thin clients into the smart client model.
Smart Clients
Consider the best features of thin client applications: easy access (any HTTP connection), ease in deployment, and easy change management. Combine these with the advantages of thick clients (rich user experience, unlimited access to local processing power, and the resultant quick response time) and we have smart clients. OK, that is the hype, and the hype is 100 percent valid. Given the current state of available hardware and network (Internet) connectivity, it makes no sense to limit ourselves to one design model. But we all knew that anyway. We didn't need Microsoft to tell us that. If we cut through all the marketing hype, we can arrive at the single best, killer reason to explore the world of smart clients: mobile data synchronization. Microsoft has created a series of classes or modules on its .NET architecture called the Smart Client Offline Application Block.
A substantial segment of the insurance industry's work force does its routine work away from the office. Whether we are talking about sales representatives, claims processors, or adjusters, the need is similar: access to the same business software while disconnected from the network used in the office. That means a substantial part of the functionality of that software must reside on the individual mobile computer. Then, when that computer is reconnected to the network, seamless synchronization with the real data must occur. Microsoft provides a set of tools to accomplish this with the Offline Application Block. In spite of my early complaints about fluff and hyperbole, this really is pretty cool stuff.
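The heart of that requirement is simple to sketch. The C# fragment below is not the Offline Application Block itself, just an illustration of the queue-locally-and-replay idea it packages (SubmitToServer is a hypothetical stand-in for a Web service call): transactions recorded while disconnected are held on the mobile machine and posted when the connection returns.

```csharp
using System;
using System.Collections;

public class OfflineQueue
{
    private Queue pending = new Queue(); // persisted locally in a real app
    private bool connected = false;

    public void Record(string transaction)
    {
        if (connected)
            SubmitToServer(transaction);  // normal online path
        else
            pending.Enqueue(transaction); // keep working while disconnected
    }

    // Called when the network comes back: drain the local queue so the
    // mobile copy of the data is reconciled with the corporate store.
    public void Synchronize()
    {
        connected = true;
        while (pending.Count > 0)
            SubmitToServer((string)pending.Dequeue());
    }

    // Hypothetical stand-in for a call to the real Web service.
    private void SubmitToServer(string transaction)
    {
        Console.WriteLine("Posted: " + transaction);
    }
}
```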
Offline Application Block
A smart client application using the Offline Application Block consists of a Windows Forms application (the smart client) connected to a Web service through a series of .NET components or classes that contain the hard-core plumbing to make it all work. A typical Offline Application Block will have a service agent manager, a connection state manager, a reference data manager, and a message data manager. Consider an application that requires high availability: a bank teller application. Modern teller applications depend on connectivity to the enterprise database. Transactions still may be posted in a batch run during the midnight hours, but they are queued up with direct access to data. To get to the point: If the teller application were a smart client application using the Offline Application Block, it would continue to work in a disconnected state. The connection state manager would recognize the loss of the HTTP connection and immediately trigger the data manager blocks to begin using local database access, perhaps through the Microsoft SQL Server Desktop Engine (MSDE). Sure, some functionality would be lost, such as the ability to query account balances, but a well-designed system will allow the teller to continue operating in a disconnected mode until the connection is restored. When that occurs, the Offline Application Block allows the locally collected data to be integrated correctly with the corporate data store. It doesn't matter whether we are talking about a bank teller application, a claims-processing program, or a client-management tool. They all would function much better in a world that allows (almost) full functionality in an offline state.
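Here is a rough C# sketch of the connection-state idea just described. The class and event names are hypothetical stand-ins echoing the block's components (the real Offline Application Block API differs): a probe reports whether the Web service is reachable, and the data managers subscribe to the resulting events to switch between the remote service and a local MSDE store.

```csharp
using System;

// Hypothetical names echoing the block's components; the real
// Offline Application Block API differs.
public class ConnectionStateManager
{
    public event EventHandler ConnectionLost;
    public event EventHandler ConnectionRestored;

    private bool online = true;

    // In practice, a background probe pings the Web service and
    // reports the result here.
    public void ReportProbe(bool reachable)
    {
        if (online && !reachable && ConnectionLost != null)
            ConnectionLost(this, EventArgs.Empty);
        if (!online && reachable && ConnectionRestored != null)
            ConnectionRestored(this, EventArgs.Empty);
        online = reachable;
    }
}

public class TellerDataManager
{
    public TellerDataManager(ConnectionStateManager csm)
    {
        // Fall back to the local store when the HTTP connection drops,
        // then merge queued work when connectivity returns.
        csm.ConnectionLost += delegate { Console.WriteLine("Switching to local MSDE store"); };
        csm.ConnectionRestored += delegate { Console.WriteLine("Synchronizing with corporate data"); };
    }
}
```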
That offline plumbing provided by Microsoft is the most compelling reason I currently see for exploring the smart client paradigm. The rest is all common sense. We need to design systems that make effective use of computing power wherever it may be. Those systems may use grid computing or smart clients or whatever. If your application is for a lightweight (in terms of processing power) computer such as a PDA or cell phone, then let the server do the heavy lifting. If your application is designed for a 3GHz PC with half a gigabyte of RAM, then you might as well use some of that power. At the end of the day, don't get too worked up about the latest techno-speak. It really is just 0s and 1s.