P-C Insurers Are Only As Good As The Quality Of Their Data

Since data is the lifeblood of the property-casualty business, more than a few insurers could find themselves in need of transfusions when poor data quality threatens their competitive advantage and bottom line.

The cost to U.S. p-c insurers of poor quality data is high. Measured in overpricing, underpricing, writing too many bad risks, regulatory fines or simply losing business from disgruntled customers, poor data can mean poor financial results.

A recent study by The Data Warehousing Institute, a Seattle-based research organization for the business intelligence and data warehousing industry, highlights the dimensions of the problem on a broader scale. Poor quality customer data costs U.S. businesses a stunning $611 billion a year, and this estimate doesn't include the cost of poor data used in financial, sales, decision support and other areas.

The importance of preventing errors in reporting, collecting and managing data for p-c insurance cannot be overstated. Consider this real-life example from the Data Warehousing Institute study of an insurance company that receives two million claims per month with 377 data elements per claim. Even with an accuracy rate of 99.9 percent, this carrier's claims data contains more than 754,000 errors per month and more than nine million errors per year.

If only 10 percent of the data elements are critical to the insurer's business decisions, that means the insurer must correct almost one million errors each year. At a very conservative $10 per error resulting from excessively high or low claims payouts (not to mention staff time to fix the errors downstream, as well as lost customer trust and loyalty), the company's exposure to poor quality data is $10 million a year.
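The arithmetic above is easy to verify. This short script simply reproduces the article's own figures (two million claims, 377 elements, 99.9 percent accuracy, 10 percent critical elements, $10 per error):

```python
# Reproduce the article's error-cost arithmetic. All figures come from
# the Data Warehousing Institute example cited in the text; $10/error
# is the article's stated conservative assumption.
claims_per_month = 2_000_000
elements_per_claim = 377
accuracy = 0.999            # 99.9 percent accurate
critical_share = 0.10       # 10 percent of elements drive decisions
cost_per_error = 10         # dollars per critical error

elements_per_month = claims_per_month * elements_per_claim
errors_per_month = elements_per_month * (1 - accuracy)
errors_per_year = errors_per_month * 12
critical_errors_per_year = errors_per_year * critical_share
annual_exposure = critical_errors_per_year * cost_per_error

print(f"{errors_per_month:,.0f} errors/month")        # 754,000
print(f"{errors_per_year:,.0f} errors/year")          # 9,048,000
print(f"{critical_errors_per_year:,.0f} critical/yr") # 904,800
print(f"${annual_exposure:,.0f} annual exposure")     # $9,048,000
```

The exact totals come out slightly above nine million errors and slightly below $10 million, matching the article's rounded figures.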

Because data is critical to an insurer's ability to execute the fundamentals of sound underwriting and cost-based pricing, insurers invest significant (if not always adequate) amounts of money to maintain and improve the quality of data, their essential corporate asset. Unfortunately, in today's uncertain and highly competitive business environment, carriers seek competitive advantage in expense control, as well as improved underwriting results and investment gains. For some, no doubt, it is easier to reduce spending in areas where return on investment is not immediately apparent than in areas where it is.

Data is a corporate asset that companies need to manage with the same rigor for accuracy, reliability and quality as they do their financial and material assets. As businesses go increasingly global, the use of data has grown exponentially. Bad or poor quality data used in decision-making has severe financial and regulatory consequences that go beyond just lost business and disgruntled customers.

Our industry lives and dies on the quality and credibility of its data, the raw ingredient for all p-c insurance products.

Consider how data is the very underpinning of our business. Data drives insurer decisions regarding underwriting and pricing risks. Data is at the heart of all customer service and support functions; it determines the level of support insurers must provide to retain and grow their customer bases. Claims data (from both internal and external sources) supports decisions and planning for the claims-settlement process. Past claims experience helps companies identify emerging claim trends, predict future claims-adjustment and settlement processes, develop contingencies for catastrophe claims, and identify problem areas for strict loss-control measures.

Data also helps insurers manage litigation, detect fraudulent claims and limit financial exposure to claims through reinsurance. Moreover, data provides the yardstick that investors, regulators and internal planners use to measure a company's financial health.

Like any major corporate asset, data needs to be properly managed and controlled. If data assets are neglected or poorly maintained, the reliability, availability or timeliness of an insurer's data will be in doubt and its value questioned, eventually hurting the company's financial stability and market reputation.


Managing Data

So what can insurers do about managing data and ensuring data quality?

From our perspective, it is critical that insurers adopt specific principles for data-management best practices. The Insurance Services Office Inc. operates one of the largest private databases in the world for p-c insurance information, more than 9.3 billion records at any given time, representing up to 75 percent of the entire industry's premium volume for commercial lines and about a third for personal lines.

Insurers entrust their data to ISO for use in developing a wide range of services, such as statistical analysis, actuarial services, underwriting, claims-fraud detection and other claims- and loss-related information. Data from participating insurers helps ISO develop advisory prospective loss costs, the essential tool for companies to underwrite and price risks accurately and ensure coverage is available to insurance buyers.

We define quality data as data fit for its intended use. (See the accompanying “Five Cornerstones of Quality Data.”)

From ISO's viewpoint, the following principles of data quality constitute best practices in data management.

Data stewardship. Maintain a corporate program with senior management-level oversight, if necessary, to understand the roles and responsibilities in data ownership, acquisition, quality assurance, storage and distribution. Make each functional area with data responsibility accountable for performance and good data management.

Data and data-quality standards. Develop internal standards and seek, where appropriate, useful external standards. Harmonize multiple standards and promote operations across multiple systems and platforms.

Organizational issues. Establish a cohesive data-management and data-quality function for creating data and assessing data acquisition across the organization. Consider support for assessing and improving data quality by tapping into the resources of outside organizations, where appropriate.

Operations and processes. Develop processes to maximize data quality and usefulness. Use new technologies for data-management and data-quality processes. Consider various internal and external data sources for improving data quality. Monitor changing regulatory requirements that may affect data and data quality.

Data-element development and specification. Design and maintain data, system and reporting mechanisms to promote good data management and quality, and to serve end-user needs. Ensure that the definition of new data is in sync with underlying business processes and its broadest possible use. Consider the level of detail in data and whether historical or retrospective data is necessary for developing system or reporting specifications. Design data and data-reporting requirements for easy modification and updating.

Data-management and data-quality tools. Develop tools to promote good data management and data quality, including a corporate data dictionary, edits and business rules, data-flow documentation, process model and mapping, and data-translation criteria by data source and recipient.

In addition, adopt new technology resources, such as the Internet, predictive technologies, data-visualization tools and data dictionaries, and new data-exchange standards, such as extensible markup language (XML), to improve data management and quality.

Make third-party data-management, data-reporting and data-quality tools, such as statistical edit packages and statistical plans, part of the workflow process at the corporate level.

Measurement. Develop a performance metric to measure poor data quality, such as the costs associated with correcting errors and reports, investigating and preventing errors, bad decisions, missed opportunities, fines, and increased regulatory scrutiny. Measure and benchmark results for each data source.
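As a rough sketch of the measurement practice described above, an insurer might tally cost-of-poor-quality buckets per data source and rank the sources. The source names, cost categories and dollar figures here are invented placeholders, not figures from any real benchmark:

```python
# Hypothetical sketch: benchmark a cost-of-poor-data metric per source.
# All names and amounts are illustrative placeholders.
costs = {
    "claims feed A": {"corrections": 42_000, "fines": 0,      "rework": 18_000},
    "claims feed B": {"corrections": 9_500,  "fines": 25_000, "rework": 4_000},
}

# Rank sources by total annual cost of poor quality, worst first.
for source, buckets in sorted(costs.items(),
                              key=lambda kv: -sum(kv[1].values())):
    print(f"{source}: ${sum(buckets.values()):,}")
# claims feed A: $60,000
# claims feed B: $38,500
```

Tracking a number like this per source is what makes benchmarking and year-over-year comparison possible.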

Individual support. Institute support for both data management and data quality on individual and organizational levels, and follow standards of professionalism.

Privacy issues. Educate users about privacy issues, policies and compliance with privacy regulations. Control access to, and use of, nonpublic data and adopt the best practices of organizations that promote data privacy.

The upside of implementing data-management best practices is that insurers can leverage their data for newer and more varied uses and unlock the full value of their data, especially with the availability of new technology tools to improve data quality.

Technology is playing an increasingly major role in data-management processes and workflow. ISO has gained a lot of experience in using technology for its statistical plans and databases for personal and commercial lines of insurance that insurers depend on for actuarial services, underwriting, ratemaking and pricing coverages.

Elements underlying ISO's sophisticated data-editing system include:

Statistical front-end edit packages that check data received from reporting insurers for validity.

Distributional edits and actuarial checks that verify the accuracy and reasonability of statistical records sent by insurers.
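To make the idea of a front-end validity edit concrete, here is a toy sketch of one: a function that flags records failing basic field-level rules before they enter a database. The field names and rules are invented for illustration; they are not ISO's actual edit package:

```python
# Illustrative only: a toy "front-end edit" that flags invalid claim
# records on receipt. Field names and rules are hypothetical.
from datetime import date

VALID_STATES = {"NJ", "NY", "PA"}  # hypothetical set of reporting states

def edit_record(record: dict) -> list[str]:
    """Return the list of edit failures for one claim record."""
    failures = []
    if record.get("state") not in VALID_STATES:
        failures.append("invalid state code")
    paid = record.get("paid_loss")
    if not isinstance(paid, (int, float)) or paid < 0:
        failures.append("paid loss must be a non-negative number")
    if record.get("loss_date", date.max) > date.today():
        failures.append("loss date missing or in the future")
    return failures

bad = {"state": "XX", "paid_loss": -50, "loss_date": date(2003, 5, 1)}
print(edit_record(bad))
# ['invalid state code', 'paid loss must be a non-negative number']
```

Real statistical edit packages apply hundreds of such rules, plus the distributional and actuarial reasonability checks described above, but the principle is the same: catch errors at the point of intake rather than downstream.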

ISO is also researching new data-exchange standards and technologies, as new software innovations and greater data-storage capability on more powerful computers at lower cost are making data-collection processes more efficient. With instantaneous access via the Internet and common frameworks to consolidate and transmit data, insurers are better positioned for their data-quality efforts.

Several new uniform data-storage and transmission standards are making it increasingly easy to meet data-exchange and reporting requirements. The growing trend toward electronic data interchange with batch, store-and-forward functionality and the popularity of XML both underscore the importance of data standards.

For example, ACORD has developed globally recognized common standards for insurers and industry associations to collaborate and exchange data in a uniform framework. ACORD has created XML standards for the p-c industry to meet real-time data-transaction requirements for personal, commercial and specialty lines, for premium as well as claims transactions. ISO plans to leverage ACORD's XML standards to structure data in a common, consistent format, making it potentially easier to collect and aggregate similar data accurately from multiple data sources.
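The benefit of a common XML structure is that records from different sources parse into the same fields. The tiny example below illustrates the idea with Python's standard-library XML parser; the element names are invented for illustration and are not ACORD's actual schema:

```python
# Minimal illustration of parsing a shared XML claim format.
# Element names are hypothetical, NOT the ACORD schema.
import xml.etree.ElementTree as ET

doc = """<Claim>
  <PolicyNumber>PC-1001</PolicyNumber>
  <LineOfBusiness>Personal Auto</LineOfBusiness>
  <PaidLoss currency="USD">2500.00</PaidLoss>
</Claim>"""

claim = ET.fromstring(doc)
paid = float(claim.findtext("PaidLoss"))
print(claim.findtext("PolicyNumber"), paid)  # PC-1001 2500.0
```

Because every reporting insurer would tag the same data element the same way, an aggregator can pull `PaidLoss` from thousands of such documents without source-by-source translation tables.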

In a fast-changing global business environment, insurers must be prepared to respond to challenges and identify opportunities for growth. Quality data is the foundation of sound decision-making and competitive advantage.

By managing data more efficiently through best practices in data quality, and in conjunction with new technologies and emerging data-exchange standards, insurers demonstrate that they recognize the value of data as a corporate intellectual asset, one requiring as much safeguarding as their traditional physical and business assets.

Carole J. Banfield is executive vice president of Insurance Services Office Inc. in Jersey City, N.J.


Reproduced from National Underwriter Edition, May 28, 2004. Copyright 2004 by The National Underwriter Company in the serial publication. All rights reserved. Copyright in this article as an independent work may be held by the author.

