We asked four leading insurance industry analysts the following question: What are the important steps insurers need to take to improve the quality of their data in preparation for the installation of analytical tools?


Craig Beattie
Senior analyst, insurance, Celent

Improving the quality of data after it has been captured has always been a challenge. The first step is to improve quality at the point of capture. This can mean making it easy for customers to enter the right data, applying appropriate validation rules, and incentivizing staff to capture data accurately.

There are possibilities for improving existing data as well. A simple example is to run address data through a geo-coding tool. This can highlight missing data and bad addresses, which could then be dealt with through exception processing. Tools that check for valid names of individuals are another good example, a side benefit of the rise of credit scoring in some countries.
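The exception-processing idea can be sketched without a full geo-coding service: flag records whose address fields are missing or malformed and route them for review. This is a minimal illustration, not any vendor's tool; the record layout, field names, and the US-style ZIP rule are all assumptions.

```python
import re

def flag_address(record):
    """Return a list of data quality issues found in an address record."""
    issues = []
    # Completeness checks: every address needs these fields populated.
    for field in ("street", "city", "postal_code"):
        if not record.get(field, "").strip():
            issues.append(f"missing {field}")
    # Illustrative format rule: a US-style ZIP code (5 digits, optional +4).
    zip_code = record.get("postal_code", "")
    if zip_code and not re.fullmatch(r"\d{5}(-\d{4})?", zip_code):
        issues.append("malformed postal_code")
    return issues

# Records with any issues go to an exception queue for manual correction.
records = [
    {"street": "12 Main St", "city": "Springfield", "postal_code": "0123"},
    {"street": "", "city": "Dover", "postal_code": "19901"},
]
exceptions = [(r, flag_address(r)) for r in records if flag_address(r)]
```

A real implementation would call out to a geo-coding API for the format rules, but the exception-queue pattern stays the same.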

The third step is to tap into the people looking after the system and capture the rules about the data that sit in their heads. This kind of corporate memory allows companies to operate well, but someone needs to let the analytics tool in on the secret.


Karen Pauli
Research director, insurance, CEB TowerGroup

Among our customers, the best outcomes started with appointing an executive to head up data governance. One person has to have ultimate authority for data quality and data governance at a corporate level or it becomes an endless team exercise in turf protection and perfectionism.

A second point of consideration is understanding how core systems modernization aligns with data goals. If legacy data inputs are broken, then governance will not help. Our customers that brought core systems modernization into the plan have been more successful.

The third success factor is to establish a partnership with a technology provider with strong, if not sole, focus on data quality. Partners with experience helping other insurers with data initiatives can shorten time-to-value. Technology partners that bring best practices are valuable. The "to do's" are as important as the "don't do's." Experience matters.

The final important point is to not boil the ocean. Having a short-term target for business value makes a difference. Knowing which business problem analytics adoption is meant to solve, then targeting the data initiative to support it, is a winning strategy.


Martina Conlon
Principal, insurance, Novarica

A critical step in avoiding the old "garbage in, garbage out" dilemma in a new analytics environment is to conduct a proactive source system data assessment. Export data from core systems into temporary files or databases of any structure and use a basic statistical tool, or even SQL, to assess the data values.

For numeric fields, calculate minimum values, maximum values, sums, and averages. For code values, calculate frequency distributions. For one-to-many relationships, calculate typical ratios, such as the number of locations per policy. Review the results with the business to validate that they make sense.
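The profiling described above can be sketched with the Python standard library alone. The rows below are illustrative stand-ins for a core system export; the field names, sample values, and the deliberately suspicious negative premium are assumptions made for the example.

```python
from collections import Counter
from statistics import mean

# Illustrative exported rows; a real profile would run over full extracts.
policies = [
    {"policy_id": "P1", "premium": 950.0, "line_code": "AUTO"},
    {"policy_id": "P2", "premium": 1200.0, "line_code": "HOME"},
    {"policy_id": "P3", "premium": -10.0, "line_code": "HOME"},  # suspicious
]
locations = [
    {"policy_id": "P2"}, {"policy_id": "P2"}, {"policy_id": "P3"},
]

premiums = [p["premium"] for p in policies]
profile = {
    # Numeric field statistics: min, max, sum, average.
    "premium_min": min(premiums),
    "premium_max": max(premiums),
    "premium_sum": sum(premiums),
    "premium_avg": mean(premiums),
    # Frequency distribution of a code field.
    "line_code_freq": Counter(p["line_code"] for p in policies),
    # One-to-many relationship ratio: locations per policy.
    "locations_per_policy": len(locations) / len(policies),
}
```

Reviewing such a profile with the business would immediately surface the negative premium as a value that needs explaining, either in the source system or in the transformation layer.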

If possible, use the data assessment features of modern data quality tools to validate the formats and contents of text fields. Use this information to build a data profile, and determine how you will address any specific data issues uncovered (in the source system, or during the transformation).
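One common text-field assessment in such tools is pattern profiling: each value is reduced to a shape mask and the masks are counted, so outlier formats stand out. A minimal sketch of the idea, with assumed sample values rather than any real identifier scheme:

```python
import re
from collections import Counter

def shape(value):
    """Reduce a string to a pattern mask: letters become A, digits become 9."""
    masked = re.sub(r"[A-Za-z]", "A", value)
    return re.sub(r"\d", "9", masked)

# Illustrative values for a reference-number field.
values = ["AB-1234", "CD-5678", "9999", "EF-90"]
pattern_profile = Counter(shape(v) for v in values)
# The dominant shape "AA-9999" suggests "9999" and "EF-90" need attention.
```

The dominant mask becomes the expected format, and the long tail of rare masks is the candidate list of data issues to resolve at the source or in the transformation.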


Bill Jenkins
Managing partner, Agile Insurance Analytics

Insurers have spent millions of dollars on data integration, master data management, and data warehousing initiatives, only to be disappointed in the returns on these investments. A main cause of the disappointment has been the organization's inability to properly manage the underlying data feeding these solutions. The lack of high-quality data has been the leading culprit behind these unfulfilled expectations.

As BI and analytics expand, and high-quality data becomes imperative across internal, external, structured, unstructured, big, and social media data, organizations need to address their data quality issues head-on.


© Touchpoint Markets, All Rights Reserved.