Virtual Viewpoint: Mining Mounds of Data

How effective are insurers in sorting through the mounds of information they possess to make insightful decisions?

Since insurers are intrinsically in an information business, it stands to reason that data collection and analysis are of paramount importance. Historically, insurers have struggled to trust decisions based on analysis of data from their legacy systems because of deep concerns about the quality of that data. And as carriers bring more modern software systems online, the data within them can accumulate at a mind-blowing, geometric rate.

An enormous pile of information can be intimidating, but carriers can find effective and efficient ways to take advantage of the valuable secrets lurking within. Here are two key areas where specific practices and technologies can markedly increase an insurer’s odds of data-analysis success.

Capture

Systems must support data capture at appropriate levels of granularity, with tunable mechanisms, and in a suitably structured form. Granularity is a bit of an art form: systems must capture information at the right level of detail to optimize specific operational business processes as well as to support desired analyses. Too much decomposition creates an unmanageable explosion of information (lots of heat, not enough light), but overgeneralized models allow important discriminators or decision drivers to hide below the surface.
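To make the trade-off concrete, here is a deliberately simplified sketch (the entity and field names are invented for illustration, not drawn from any particular carrier’s model) contrasting a lump-sum payment record with one decomposed by coverage and payment type:

```python
from dataclasses import dataclass

# Too coarse: a single lump-sum field hides the drivers an analyst needs.
@dataclass
class ClaimPaymentCoarse:
    claim_id: str
    total_paid: float          # indemnity, expense, and recovery all blended together

# A more useful level of detail: each payment carries the coverage and
# payment type it belongs to, so downstream analysis can slice by either.
@dataclass
class ClaimPaymentDetailed:
    claim_id: str
    coverage: str              # e.g. "collision", "bodily_injury"
    payment_type: str          # e.g. "indemnity", "expense"
    amount: float
```

The coarse record answers “how much did we pay?” but cannot answer “which coverages are driving severity?”; the detailed record supports both the operational process and the downstream analysis.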

Our experience is that carriers benefit from starting with models that are well-normalized and can act as an industry “best-practice baseline,” but the data within each line of business, region, and channel usually deserves some refinement. This means that a viable software system must provide a comprehensive ability to tune those models in order to provide capture flexibility and enable an insurer to iteratively differentiate its offerings, pricing, and services.
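One minimal way to picture that baseline-plus-refinement approach (purely illustrative; the names below are assumptions, not any vendor’s actual schema) is a shared, normalized entity with a structured extension slot that each line of business, region, or channel can tune:

```python
from dataclasses import dataclass, field
from typing import Any, Dict

# Well-normalized baseline shared across all lines of business.
@dataclass
class RiskItemBase:
    risk_id: str
    line_of_business: str
    insured_value: float
    # Tunable extension slot: each line, region, or channel can add its own
    # structured attributes without forking the baseline schema.
    extensions: Dict[str, Any] = field(default_factory=dict)

# A personal-auto deployment might refine the baseline like this:
auto_risk = RiskItemBase(
    risk_id="R-1001",
    line_of_business="personal_auto",
    insured_value=28000.0,
    extensions={"annual_mileage": 12000, "garaging_zip": "94402"},
)
```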

Finally, the data itself must be structured in a machine-interpretable way in order for software to do meaningful processing and to support downstream analysis. Systems with carefully elaborated models encourage insurer staff, producers, and, in today’s era of self-service, customers to enter source data in higher-quality, structured ways rather than as manual “free-form” content that is much harder to analyze later.
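As a hypothetical illustration of the difference, consider the same loss described as free text versus captured through an elaborated, typed model (the enumeration values and fields here are invented for the example):

```python
from dataclasses import dataclass
from enum import Enum
from typing import List

# Free-form capture: easy to type, nearly impossible to analyze reliably later.
free_form_loss = "water came in thru roof during storm, drywall + floor damage"

# Structured capture: the same facts, but machine-interpretable.
class CauseOfLoss(Enum):
    WIND = "wind"
    WATER = "water"
    FIRE = "fire"

@dataclass
class LossReport:
    cause: CauseOfLoss
    point_of_entry: str               # e.g. "roof"
    damaged_components: List[str]     # e.g. ["drywall", "flooring"]

structured_loss = LossReport(CauseOfLoss.WATER, "roof", ["drywall", "flooring"])
```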

Moreover, systems that provide for more advanced forms of structural encoding, like geospatial content and synthetic data types, create opportunities for carriers to stretch their representational thinking and achieve even higher capture fidelity. This emphasis on quality data going in provides the essential foundation for quality decision insights coming out.
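For instance, a geospatial capture type makes location a first-class, computable attribute rather than an address string. The sketch below (a generic great-circle distance calculation, not any specific product feature) shows the kind of downstream computation such encoding enables, such as measuring how far a risk location sits from a reference point:

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class GeoPoint:
    """A geospatial capture type: latitude/longitude in decimal degrees."""
    lat: float
    lon: float

def distance_km(a: GeoPoint, b: GeoPoint) -> float:
    """Great-circle (haversine) distance between two captured locations."""
    dlat, dlon = radians(b.lat - a.lat), radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))  # mean Earth radius in km
```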

Leverage

The data within core transactional systems is the lifeblood of an insurance carrier, but the laws of physics impose performance limits on the kinds of calculations and analysis that can be done in situ in operational data stores.  Well-designed systems do provide a variety of in-line analysis and aggregated views into transactional data, but the most data-insightful carriers have grown entire ecosystems around their core systems, linking and feeding downstream analysis tools, frameworks, and repositories. 
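A toy example of such an aggregated view (with made-up numbers and a deliberately simplified record shape) is rolling transactional premium and loss records up into a loss ratio by line of business:

```python
from collections import defaultdict

# Transactional records (hypothetical shape): premium and incurred loss per policy.
transactions = [
    {"line": "homeowners", "premium": 1200.0, "incurred_loss": 300.0},
    {"line": "homeowners", "premium": 950.0,  "incurred_loss": 1100.0},
    {"line": "personal_auto", "premium": 800.0, "incurred_loss": 450.0},
]

# An aggregated, analysis-friendly view: loss ratio by line of business.
totals = defaultdict(lambda: {"premium": 0.0, "incurred_loss": 0.0})
for t in transactions:
    totals[t["line"]]["premium"] += t["premium"]
    totals[t["line"]]["incurred_loss"] += t["incurred_loss"]

loss_ratio_by_line = {
    line: v["incurred_loss"] / v["premium"] for line, v in totals.items()
}
print(loss_ratio_by_line)  # {'homeowners': 0.6511..., 'personal_auto': 0.5625}
```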

For example, carriers often choose to feed a traditional data warehouse, but some will go further and use a transport architecture that also feeds certain content to a NoSQL store or a faceted text search engine for real-time queries.  The field is evolving so quickly that it is at least as important for an agile insurer to design for a dynamic set of data analysis components as it is to get any particular analytics path perfectly right.  And for that even to be possible, of course, it is incumbent upon carrier core systems to be flexible, configurable enablers of this brave new data world.
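One hedged way to picture such a transport layer is a simple fan-out feed where downstream consumers subscribe independently, so new analysis components can be added without modifying the core system. This is a conceptual sketch, not any particular vendor’s integration framework:

```python
from typing import Callable, Dict, List

Event = Dict[str, object]
Handler = Callable[[Event], None]

class TransactionFeed:
    """Publishes core-system events once; each downstream consumer subscribes independently."""

    def __init__(self) -> None:
        self._handlers: List[Handler] = []

    def subscribe(self, handler: Handler) -> None:
        self._handlers.append(handler)

    def publish(self, event: Event) -> None:
        for handler in self._handlers:
            handler(event)

feed = TransactionFeed()
feed.subscribe(lambda e: print("-> warehouse staging:", e))  # e.g. nightly ETL batch
feed.subscribe(lambda e: print("-> document store:", e))     # e.g. NoSQL repository
feed.subscribe(lambda e: print("-> search index:", e))       # e.g. faceted text search
feed.publish({"type": "policy_bound", "policy_id": "P-42", "premium": 1200.0})
```

The point of the design is that the core system publishes each transaction once, while the set of downstream analysis components can keep evolving.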

About the Author
Ben Brantley

Ben Brantley is Chief Technology Officer at Guidewire Software, Inc., a provider of core system software to Property/Casualty insurers. He can be reached at bbrantley@guidewire.com.
