We often have waved the banner for standards in this magazine and in this column. We've applauded the efforts of organizations such as ACORD, the W3C, and even software vendors for their endeavors in providing standards that will enable us to create efficient, interactive software systems. We've lauded the goals of SEMCI and universal XML standards for the insurance industry. Before XML, we praised EDI and fledgling data standards like OLifE. We always have come down on the side of standards. An early incarnation of this column was called Standards Bearer. Maybe we went too far. Maybe we have been asking for too much of a good thing. This month, I am going to look at standards and see whether maybe we haven't been trying a little too hard to standardize everything.

SEMCI: Who Cares?
Look at this: "SEMCI [single-entry, multiple-company interface] or what is evolving into STP [straight-through processing] means using one system, such as your agency management system, to enter data and communicate with multiple carriers and vendors for quoting, policy issue, and inquiry." That is straight from the horse's mouth. Now this might, in fact, be the holy of holies for an agency, but it does not bode well for the carrier. Do you really want to further enhance the perception that certain types of insurance coverage really are commodities by providing the same interface as every other carrier? Homeowners and personal auto coverage increasingly are being thought of as commodities. And what is a commodity?

A commodity generally is assumed to be a physical substance, such as food, grain, or metal, that is interchangeable with another product of the same type. The price of a commodity depends solely on the laws of supply and demand. Is this where you want to be? I don't think so. If we cared about the services we offer only enough to differentiate them from the competition by price, we probably wouldn't bother with advertising. We also have no interest in offering a product that is wholly and easily interchangeable with a similar product offered by another carrier.

SEMCI plays entirely into the hands of agents. It allows them to price-shop coverage. The proper response from a carrier's perspective shouldn't be to lower the level of services provided with that policy so it can offer the agency the lowest price. The proper response is to differentiate the product in question from the masses so the individual at the beginning of the money chain (the actual purchaser of the policy) perceives a real benefit in the product offered and thus desires that product. The drug companies (I mean pharmaceuticals, not the South American cocaine cartels) already have learned this lesson. We all are painfully aware of the plethora of television and print advertisements aimed at individuals, urging them to ask their physician for such-and-such drug that will enhance their sex life or lower their cholesterol or cause them to lose 50 pounds immediately. I personally find these ads reprehensible, yet they work. Prescription drugs have become commoditized, and the pharmaceutical manufacturers have responded not by reinforcing that perception but by differentiating their products. My point? Just this: We often have treated SEMCI as a technical hurdle to be overcome. In fact, it is a cultural hurdle, and your position on SEMCI probably is determined by where you stand in the food chain. (There is a finite amount of money involved in any insurance transaction. The money starts when the purchaser of the product writes a check and ends somewhere at the carrier or reinsurer level. The point at which you take your percent of that transaction will determine whether you perceive SEMCI as a good or bad thing.)

STP: Sticks to Pistons
You will notice the little blurb from the SEMCI folks also discusses straight-through processing. This is another conundrum. In terms of data-processing systems, it obviously is advantageous to create systems that flow from quotation to policy to inquiry. In fact, if you don't have internal STP systems, you are way behind the power curve. But does that mean we need to have universal standards for STP? Of course not. One carrier may have internally developed a system based on J2EE that ultimately connects to a decades-old mainframe. Another may have a .NET system that uses a multitude of totally disparate databases. The only ones that gain by having universal STP standards are software vendors. They need to create only a single product and sell it to everyone. But you know we are a suspicious lot, and I will wager there are a lot of senior managers out there who really don't want to use the same software system their competition across the river is using.
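To make the internal-versus-universal distinction concrete, here is a minimal sketch of a straight-through flow inside a single carrier. The class and field names are invented for illustration; they are not drawn from any actual vendor or carrier system. The pipeline only has to agree with itself, which is exactly why it needs no industry-wide standard.

```python
from dataclasses import dataclass

# Hypothetical internal record shapes for one carrier's straight-through
# flow: quotation -> policy issue -> inquiry. The fields are deliberately
# carrier-specific; an internal STP pipeline only has to be consistent
# with itself, not with any universal schema.

@dataclass
class Quote:
    quote_id: str
    applicant: str
    coverage: str
    annual_premium: float

@dataclass
class Policy:
    policy_number: str
    insured: str
    coverage: str
    annual_premium: float

def issue_policy(quote: Quote) -> Policy:
    # Turn an accepted quote into a policy without re-keying any data.
    return Policy(
        policy_number=f"POL-{quote.quote_id}",
        insured=quote.applicant,
        coverage=quote.coverage,
        annual_premium=quote.annual_premium,
    )

def inquiry(policy: Policy) -> str:
    # Answer a status inquiry straight from the issued policy record.
    return f"{policy.policy_number}: {policy.coverage} for {policy.insured}"

quote = Quote("Q1001", "Jane Doe", "PersonalAuto", 812.50)
print(inquiry(issue_policy(quote)))
```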

XML: The Panacea for Everything
The XML hype has reached the ridiculous. Take a standard ASCII or Unicode tagged meta-language, describe it with a DTD or a schema, and you have a formula for universal data exchange. All our interoperability problems are solved forever. End of story. Go sailing, collect your paycheck, and wait around for your gold watch. You don't even need to describe the data yourself. Organizations such as ACORD already have described all the data and transactions needed for the insurance industry. Right, and you believe there really were weapons of mass destruction.

Let's look at an area where adherence to standards is absolutely essential: air traffic control (ATC). I am fairly certain we all can agree standardizing the communications used in air traffic control is a good thing. There may be some room for a margin of error in controlling airplanes in the sky or on the ground, but the consequences of failure are profound and catastrophic. The standards we have evolved for air traffic control are multifaceted and diverse. At a high level, we universally have agreed English (Is that American English or the King's English?) is the global standard language used for communication among aircraft and those who control their flight space. Further, there is a very strict subset of English words and phrases that have specific meanings in the context of air traffic control. Words or phrases such as "abort" or "go around" must have a single, unambiguous meaning if we are to maintain sanity and safety in the skies. (The military has similar standards; it wouldn't do to have different folks yelling "cease fire" or "stop" or "hold fire" when launching artillery shells.)

There also exists a lower-level infrastructure of standards for air traffic control. Some of these standards are not published, yet they certainly apply. A pilot knows to contact a certain area controller on a specific frequency. He must have a radio that will transmit and receive on that frequency using the proper modulation scheme and with the appropriate power levels. (OK, I know there also are visual standards for air traffic control. In extremis, a plane can fly by the control tower and exchange various standard visual signals with the tower. I hope I am never in that situation.) It doesn't matter how the radio is built. It may use vacuum tubes or transistors; it may require 12 volts DC or 115 volts AC. What matters is it provides the ability to interact with other radio transceivers in an expected, predictable manner.
So what does this have to do with XML?

We can consider XML a high-level standard. We all can agree to communicate using XML that complies with World Wide Web Consortium (W3C) recommendations. This corresponds to the air traffic controller's use of the English language. We then can agree to use a defined DTD (Document Type Definition) or XML Schema as the subset of data types we will use. One such subset can be found in ACORD's XMLife standard. Now we have a standard methodology for communicating insurance-industry-specific data. Great. Do we really care? No matter what standard subset of XML we adopt, it always will require carrier-specific proprietary data. As soon as we start changing the standard to adapt it to our own needs, it is no longer a standard. Furthermore, if we adopt a universal standard put together by committee (we all have been, at one time or another, part of a committee whose goal was to define some standard way of doing something, and we all know the results), there is a very good chance adoption of that standard will require modifications or redesign of existing internal systems to use that standard effectively. Then we are tied to a standard over which we have no real control and which can be changed at any given time.
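To see where the universality breaks down, consider this minimal sketch (Python, standard-library XML parsing) of a quote request that starts from a standard-looking vocabulary and immediately sprouts a carrier-specific extension. The element names are illustrative only; they are not taken from the actual ACORD XMLife specification.

```python
import xml.etree.ElementTree as ET

# Hypothetical payload: the outer elements mimic a "standard" insurance
# vocabulary, while <CarrierExtension> carries proprietary data that no
# industry-wide DTD or schema could have anticipated.
payload = """
<PolicyQuoteRequest>
  <Insured>
    <Name>Jane Doe</Name>
    <BirthDate>1970-04-01</BirthDate>
  </Insured>
  <Coverage type="PersonalAuto" limit="100000"/>
  <CarrierExtension xmlns="urn:example-carrier:quote">
    <UnderwritingTier>PreferredPlus</UnderwritingTier>
    <AgentIncentiveCode>A17</AgentIncentiveCode>
  </CarrierExtension>
</PolicyQuoteRequest>
"""

root = ET.fromstring(payload)

# The "standard" portion parses the same way for every carrier...
name = root.findtext("Insured/Name")
coverage = root.find("Coverage").get("type")

# ...but the proprietary portion requires carrier-specific knowledge,
# which is exactly where a universal standard stops being universal.
tier = root.find(".//{urn:example-carrier:quote}UnderwritingTier")
print(name, coverage, tier.text if tier is not None else "no extension")
```

The standard half parses identically everywhere; the extension half is the proprietary data every carrier inevitably adds, and it is the half that actually differentiates the product.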

Real standards (like TCP/IP or ASCII) are not subject to change. These are the underlying infrastructure standards that allow us to build software systems able to interact with other software systems. (It is interesting to note TCP/IP, Transmission Control Protocol over Internet Protocol, wasn't created by committee.) If I start mucking around with packet length or authentication, I probably am going to seriously jeopardize my ability to communicate with other computers or systems. The TCP/IP standard is low-level enough and effective enough that it has evolved into the de facto standard for intersystem communications. It doesn't require additions or modifications to work with your system or others. The success of these low-level standards does not ensure we can create standards for everything.
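To illustrate how far down the stack such a standard sits, here is a sketch (Python, standard library) of opening a plain TCP connection. The code neither knows nor cares whether the machine on the other end is a decades-old mainframe, a J2EE application server, or a .NET box; the host name here is a placeholder, not a real carrier endpoint.

```python
import socket

# Open a plain TCP connection. Everything about the remote system's
# hardware, operating system, and application stack is invisible here;
# the only contract is the low-level protocol itself.
HOST = "example.com"  # placeholder host for illustration
PORT = 80

with socket.create_connection((HOST, PORT), timeout=5) as conn:
    # Send a minimal HTTP request just to show bytes flowing over TCP.
    conn.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = conn.recv(1024)

# Print only the first line of whatever came back.
print(reply.split(b"\r\n")[0].decode("ascii", errors="replace"))
```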

As technology became an integral part of all business transactions, we made the mistake of extrapolating from effective standardization at a physical system level to expecting standards to work at every level of data processing and integration. This is an unreasonable expectation. Let's return for a moment to our air traffic controller scenario. Pilots of commercial airliners are in contact not only with air traffic controllers along their intended route, they also are in communication with the company that owns and operates the aircraft. This communication is based on a low-level standard (a working radio as described above), but it is not based on a high-level standard such as that applied to ATC, i.e., specific English. In fact, I suspect each air carrier has unique and proprietary data it exchanges with its pilots.

We are in a similar situation in the insurance industry. We all have a need for computing systems that require low-level integration provided by various standards. We also have a need for certain high-level standards to communicate standard information effectively. EDI (Electronic Data Interchange) is an old and trusted methodology for communicating standard data. ACORD or ISO forms have become necessary paper standards, and it makes good sense to move those forms to standard XML electronic documents. But it may not make good business sense to push for universal industry standards for all data and all interoperability. This concept goes against my grain as a technologist, but that isn't the point. We all have a need for proprietary, company-specific data. A significant business strategy is to differentiate ourselves from the competition. Once we admit we need and require only the same data and methods of transmitting that data as everyone else, we tacitly are admitting we are no different from the other guy. This concept stands defiantly against most technology wisdom. But it is not about the technology; it's about grabbing that dollar from the food chain. Post Toasties and Corn Flakes taste the same to me, but back in their home offices, those companies are making every effort possible to differentiate their products. Don't let the standardization mantra drive business decisions. Technology is here to help the business process, not dictate how it functions.
