At one time, insurance companies were users of bleeding-edge computer technology. Shortly after the U.S. Army began to use the first mainframes to compute shell trajectories for heavy artillery (before anybody knew what a mainframe was), insurers began using those room-sized behemoths to calculate mortality rates and actuarial tables. Later they were the first to use second-generation mainframes to store customer data. Only the federal government had bigger databases than those pioneering insurance companies, who needed every bit of binary storage they could lay their hands on.
Banking and insurance provided willing customers for mainframe computer manufacturers in the 1950s. Many of those early mainframes, or at least their direct descendants, are still in existence, and our industry is no longer considered a front-runner when it comes to technology.
What happened? Is the insurance industry truly mired in the technology dark ages or are we simply using the resources we have wisely?
The three primary uses of computers in the business world are to calculate, to store data, and to interact. These three tasks not only define why we use computers, they also parallel the actual development of computers.
The first two items, calculation and data storage, were accomplished with the early hunks of big iron, which were characterized by punch cards and reel-to-reel tape storage.
True interactivity among computing systems required a critical mass of systems, and that in turn required the microchip revolution. The creation of microchip technology allowed us to build and use entirely different kinds of computing systems than we had previously envisioned.
The once radical idea of a computer on every desktop is now a reality, and it has changed the ways in which we think about computers. The question is, does that revolution make all our previous assumptions about information systems invalid? Does Moore's Law dictate that we will always be stuck with legacy systems that we need to divest ourselves of?
I don't think so. There is a fundamental difference in the way we make business decisions regarding information systems, and it is based on what we have in place and the investment we already have in our legacy systems.
Let's talk briefly about our three uses of computing systems.
Crunch, Crunch, Crunch
Before computers existed, most of the literature discussing them envisioned calculation engines. Various mechanical calculation machines had been described, and occasionally built, for centuries. The discovery (or invention) of the calculus created a nearly impossible task for human computation. Certain mathematical principles require calculations whose results can never be exactly known (the value of pi, for example). These calculations require that we approximate the correct values by iterative calculations that approach the real value but never achieve it.
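The kind of iteration described above can be illustrated with a short sketch. Here is a hypothetical example (in Python, for brevity) using the Leibniz series for pi, one of many such iterative approximations:

```python
# Approximate pi with the Leibniz series: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
# Each additional term moves the sum closer to pi, but it never arrives exactly.
def approximate_pi(iterations: int) -> float:
    total = 0.0
    for k in range(iterations):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total

print(approximate_pi(1_000_000))  # close to 3.14159..., never exact
```

A modern laptop runs a million iterations in a fraction of a second; the first mainframes ground through the same kind of loop for hours.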
Statistical analysis also requires repetitive calculations on existing data that are cumbersome and inefficient when done by hand but can be efficiently accomplished by a machine. These are exactly the types of problems that the first true computers were used to solve. Shell trajectories were approximated using the calculus, and actuarial tables were created using statistical analysis.
We all know that present-day spreadsheet software can perform incredible calculations that surpass first- (and second-) generation computers. What we don't always know is whether it makes good business sense to blindly upgrade our working 10-year-old system to the latest iteration of hardware and software just so we can be cutting edge.
A few years ago we (The National Underwriter Company) created and sold a small bit of software that we called the Tax Facts Calculator. This was a nifty little tax calculator application written in Visual Basic 3 that was out of date soon after it was released (in case you haven't noticed, tax laws are subject to frequent change). After an initial release we quit selling it, and a few years later we quit supporting it. Not only was the technology dated, but it was too costly to maintain accurate, up-to-date calculations.
The interesting fact is that certain calculations in that ancient VB3 application are still valid. Even more interesting is that it is still in use; at least one major insurance carrier has requested a license to use that discontinued piece of software simply because there is nothing better available at a reasonable cost for certain calculations. The point is simple: In some circumstances, the best solution to a particular data processing requirement is one that is already in place. Look long and hard at the real costs involved before upgrading to a new system.
Storehouses of Knowledge
The primary uses for mainframe computers quickly changed from crunching numbers to storing data. We need accurate data about our customers if we're going to adequately serve and retain them. We also need data on prospective customers if we intend to grow our business.
Early computers used reels of tape to store this data. Modern technology has replaced that tape with silicon, but the basic systems and their underlying principles remain the same.
Massive banks of flat data files manipulated by languages such as COBOL (Common Business Oriented Language) and RPG (Report Program Generator) have been the standard method of processing large amounts of data for more than 40 years (more than 35 in the case of RPG). COBOL allows us to perform simple mathematical functions and common business processes on our data using conditional logic. RPG is simply a method of displaying your existing data in ways that are useful for human consumption.
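The flat-file, batch style of processing those languages embody is easy to sketch. Here is a hypothetical illustration in Python rather than COBOL; the record layout, field names, and figures are invented, but the pattern of slicing fixed-width records and applying conditional logic to a field is the same:

```python
# Batch-process fixed-width customer records, COBOL-style.
# Invented layout: cols 0-19 name, 20-29 state, 30-39 premium (in cents).
records = [
    "DOE JOHN            OH        0000123450",
    "ROE JANE            NY        0000098700",
]

total_premium = 0
for rec in records:
    name = rec[0:20].rstrip()     # fields live at fixed character positions
    state = rec[20:30].rstrip()
    premium = int(rec[30:40])     # stored as a zero-padded integer, in cents
    if state == "OH":             # conditional logic on a data field
        total_premium += premium

print(total_premium)  # prints 123450
```

A report generator such as RPG is, in essence, the second half of this loop: reading the same records and formatting them for human eyes.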
Even modern mid-range computer systems are nothing more than 1960s mainframes with new bells and whistles. The heart of these systems is essentially the display and analysis of data using batch technologies (even though they may not look and feel like batch processing). We may not use keypunch machines to enter data, but by and large most of those data are still entered into the system by a human operator at a keyboard.
What I'm describing here are true legacy systems: the systems that have been the heart of our industry for half a century. What's at issue is whether we should be running away from these systems or embracing them. Once again, the answer lies within careful analysis of your own business and its requirements.
Suppose the primary information in your databases is customer information: names, addresses, dates, histories, contact information, etc., and your customer database consists of millions of customers and prospects. Then suppose that you are currently able to access that data in a way that supports your current business needs. At this point, your decision on what to do with this legacy system is largely a matter of calculating ROI: What difference in cash flow will you experience when that legacy system is replaced with a scalable relational database complete with up-to-date systems for getting at that data?
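That ROI comparison can be as simple as discounting the projected cash-flow difference and weighing it against the migration cost. A minimal sketch, with every dollar figure and the discount rate invented purely for illustration:

```python
# Compare keeping a legacy system vs. replacing it, via net present value.
# All figures below are hypothetical placeholders, not industry numbers.
def npv(cash_flows, rate):
    """Discount a list of yearly cash flows back to today's dollars."""
    return sum(cf / (1 + rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

migration_cost = 2_000_000                      # up-front cost of the new system
annual_benefit = [500_000] * 5                  # extra cash flow it generates yearly
benefit_today = npv(annual_benefit, rate=0.08)  # discounted at 8%

net = benefit_today - migration_cost
print(net)  # negative here: the working legacy system wins on ROI
```

If the net figure is negative, the "outdated" system is earning its keep; if your business goals have changed, of course, the next paragraph applies and the arithmetic is no longer the whole story.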
Have your business goals changed significantly over the past half century? Have they changed significantly enough to require a complete rethinking of the way you use your data processing systems? If those goals have changed, then perhaps you should dump your legacy systems, regardless of short-term ROI.
All Together Now
It didn't take too long to discover that an isolated computer has limited functionality, no matter how powerful it is. Connecting computers together, even for discrete moments in time, creates an entirely new world of possibilities.
EDI (electronic data interchange) provides standards whereby one computer can talk to another. My bank's computer can transfer funds from my savings account to my brokerage account. A company's mainframe in Connecticut can compute payroll for the entire enterprise and print checks for the Hong Kong branch.
We quickly learned that the power of two computers working together is more than the sum of their parts. Properly interacting systems multiply the benefits of computing power exponentially. The examples above only require limited connectivity between computers. But today's world allows continuous connectivity via the Internet or dedicated data links. Add to this the incredible increase in the sheer number of computers that are connected to those networks, and the possibilities become truly astronomical.
And therein lies the problem. In an effort to tap this vast resource of networked computers, we have tended to create false expectations of the short-term results attainable from that network. Many firms have rushed to have an Internet presence in the expectation that it would directly translate into increased revenues. One automaker recently scaled back its massive Internet initiative in response to decreased revenues. Does this mean it honestly expected its Web presence to increase revenues?
I certainly believe that virtually all major businesses require some sort of Web site and can make good use of modern network systems. What I don't believe is that we need to immediately and completely redesign and remake our existing systems to conform to Scott McNealy's "the network is the computer" model.
Insurance companies have long been enamored of the SEMCI (Single Entry Multiple Company Interface) model. The whole concept of SEMCI is based on the ability of disparate, disconnected computers to communicate. Straight-through processing is the ideal: a company system that allows customer information to be entered only once.
If I were designing such a system from scratch, I would undoubtedly embrace technologies such as Java, CORBA, or .NET. The real world dictates that we already have most parts of a straight-through processing system in place, and that all we really need to do is use the new technologies to integrate those systems. (I do not suggest that this is a trivial task.) In fact, such systems already exist, and they make extensive use of existing legacy systems.
I suggest that we not become so wrapped up in the hype surrounding networked distributed systems and the Internet that we ignore the value of the investment we already have in our legacy systems. Take a hard look at what you have and how well it's doing what it needs to do before abandoning a working system.
The Secret of Your Success
As we make these judgments on how to deal with our legacy systems, we must tread warily. The success of certain companies using new technologies should not lead us to assume that adopting similar technologies will be the magic bullet for every company. The success of Amazon (if you define churning through a lot of money as success) has turned many a head and convinced many CEOs that technology, Internet technology in particular, will replace flat revenues with exponential growth.
If pure technology is the answer, then all those well-funded dot-coms would still be around. Good business decisions, not technologies, ultimately define the success of any individual firm.
© Touchpoint Markets, All Rights Reserved.