It’s generally accepted that the first written insurance agreement appears in the Code of Hammurabi. It took quite a while, more than 3,000 years, before humankind had built enough understanding of risk transfer and mathematics, and analyzed enough data, to develop the first actuarial tables in the late 1600s. Since then, the sophistication of every part of the risk-sharing process, from rate-making to underwriting to claims, has increased at a quickening pace, driven by greater precision and predictability and built on an ever-expanding amount of data.
Despite the industry’s history of cross-company information sharing (bureau rates, claims databases, and the like), there have long been competitive advantages to be gained from the amount of data an insurer could collect to fine-tune rates and underwriting, support new products, and carve out new market niches. That model favored companies with larger books of business, which generated more data, and with more staff available to collect information and build greater certainty around predictability.
ACUITY—a $2 billion regional property/casualty insurer doing business in 20 states—sees the data explosion as a force it can leverage to gain advantage against national insurers that have built much larger internal data stores.
“Today, we have access to a volume of data that we could never have compiled on our own,” says Ben Salzmann, ACUITY president and CEO.
Performing detailed, multivariate analysis in this information-rich environment requires a specialized combination of skills covering data management, statistical analysis, data mining and modeling, and strategy. Those skills need to be coupled with an understanding of both business processes and technology and with a natural curiosity to keep looking for relationships that can lead to predictability.
“Insurers may have the skill sets in analytics, but those skills are often localized in actuarial or departmentalized for underwriting or claims. The greatest challenge is extending those skills at an enterprise level,” says Masud.
In a data-rich and ultra-competitive market, the leaders will be the companies that not only best digest the available information but also deploy integrated analytic solutions at the front line, whether by providing decision support, as at FCCI, or by fully automating the decision process.
“At one point, we had only captured 32 data points in our first notice of loss system. Today, we are able to capture several times that amount. New data fields can be easily added as needed, versus the difficulty of modifying the previous mainframe system,” says Dibble.
Second, the company has added several new claims data sources and brought those to the front line. “On a real-time basis when a call is made, information that’s public record—phone numbers for an individual, for instance—is fed into our system. We identify other phone numbers and other occupants of that household, which is very valuable in fraud detection. Those instances are alerted and scored. An alert doesn’t mean the claim is fraudulent, it means we need to look at it more carefully,” says Dibble.
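The flow Dibble describes, enriching a first notice of loss with public-record data, matching phone numbers and household occupants, then scoring and alerting rather than declaring fraud outright, can be sketched as a simple rule-based scorer. This is an illustrative assumption, not ACUITY's actual system; every field name, weight, and threshold here is hypothetical.

```python
def score_claim(claim: dict, public_record: dict) -> dict:
    """Score a claim against public-record enrichment data.

    A high score raises an alert for closer review; per the article,
    an alert does not mean the claim is fraudulent. All rules and
    weights below are hypothetical placeholders.
    """
    score = 0
    reasons = []

    # Rule 1: the phone number given at first notice of loss does not
    # appear among the publicly recorded numbers for this individual.
    if claim.get("phone") not in public_record.get("known_phones", []):
        score += 2
        reasons.append("reported phone not found in public record")

    # Rule 2: another occupant of the claimant's household already
    # appears as a claimant on a prior claim.
    occupants = set(public_record.get("household_occupants", []))
    prior = set(claim.get("prior_claimants", []))
    if occupants & prior:
        score += 3
        reasons.append("household occupant appears on a prior claim")

    # Alert when the combined score crosses an (arbitrary) threshold.
    return {"score": score, "alert": score >= 3, "reasons": reasons}


result = score_claim(
    {"phone": "555-0100", "prior_claimants": ["J. Doe"]},
    {"known_phones": ["555-0199"], "household_occupants": ["J. Doe"]},
)
```

In this invented example both rules fire, so the claim is flagged for review; a production system would draw on many more signals and, as Dibble notes, run in real time as the call comes in.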