Consider this: Gartner predicted that in 2017, 60% of big data projects would fail. That is staggering in light of the level of investment being made in technology to take advantage of the data organizations are generating.


Related: Fully leveraging data analytics in insurance


The concept of big data is not new to the insurance industry. If leveraged correctly, big data can transform the way insurers do business by allowing them to continually enrich the customer experience, streamline operational costs and increase profitability. Given these intuitively obvious benefits, why is momentum behind big data still lagging?


Here are three of the main challenges.


Related: How carriers can leverage the power of big data


No. 3: Lack of clear technology direction

Often, in a large organization, there are multiple technology initiatives and, in some cases, multiple big data projects. Sometimes these efforts continue in parallel without sufficient communication between the separate teams. This can cause unnecessary overlaps and conflicts that minimize or, in the worst case, neutralize the overall potential benefits of such efforts to the organization. In other cases, the organization attempts to unify the different initiatives into one strategic effort; that approach consumes time in building consensus and can move too slowly to reap timely benefits.


Given the flexibility, scalability and adaptability of technologies today, it may be acceptable, and even necessary, in several instances to continue separate efforts. In a conceptual sense, this is no different from separate core systems (policy, CRM, claims) existing within an organization. However, there must be a focus on clearly defining and designing the interfaces between the disparate efforts so that data can be exchanged and understood easily, reliably and quickly between any two environments.


This is often easier to address than attempting to unify multiple technology efforts into one, and it should be a key consideration in the early stages of each effort. However, it is commonly overlooked during the design and deployment of individual efforts and becomes too expensive to address later. There are three key aspects of ensuring communication and understanding across multiple efforts:

  • The technology framework to accomplish this;
  • Establishing a consistent approach to ensuring data quality and integrity as data flows between systems; and
  • Establishing a common, unified understanding of the data such that technical data is mapped to business terms and functions in a way that is consistent and comprehensible across the overall organization (see the sketch below).
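
To make the third aspect concrete, here is a minimal sketch, in Python, of a shared business glossary that maps one business term to the technical fields that carry it in each system. The system names, field names and structure are hypothetical, for illustration only.

```python
# A minimal sketch of a shared business glossary, assuming hypothetical
# system and field names. Each business term records its definition, its
# owning function, and the technical field that carries it in each system.

BUSINESS_GLOSSARY = {
    "policy_activation_date": {
        "definition": "Date the policy was approved and became active.",
        "owner": "Policy Administration",
        "sources": {
            "policy_system": "POL_ACTV_DT",    # hypothetical column name
            "crm_system": "activation_date",   # hypothetical field name
        },
    },
}

def technical_field(business_term: str, system: str) -> str:
    """Resolve a business term to the technical field name used by a system."""
    return BUSINESS_GLOSSARY[business_term]["sources"][system]

# Any team integrating with the CRM system can look up the field it needs.
print(technical_field("policy_activation_date", "crm_system"))  # activation_date
```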

Related: 9 ways insurers can drive more value from technology investments


No. 2: Unclear mapping of business objectives to specific criteria

Another common challenge is that a big data effort is frequently initiated with a limited understanding or assessment of how to map the overall objectives to the specific business requirements and metrics that would clearly define the success of such an effort.


For example, a business case objective may be to reduce churn by a certain percentage. The focus then usually falls on the technology and tools to put in place, the data to ingest and the analytics to create in order to predict, identify and analyze churn patterns. Sounds right? Yes, but a couple of elements are overlooked.


What level of data quality is required? Even if the technology is spot on, the accuracy of the analytics will be questionable if the data quality and integrity requirements are not thought through. Here is a simple example: for any churn analysis, the approval/activation date of a policy is an essential attribute. What if, due to circumstances outside the control of the big data effort, that attribute is not reliably populated in a source system? That puts the effectiveness of the analysis into doubt.
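
As a simple illustration, a completeness check on that attribute, run before the analysis, would surface the problem early. The Python sketch below assumes hypothetical column names and a 98% threshold; neither is part of the original example.

```python
# Minimal sketch of a completeness check on a critical attribute before a
# churn analysis. Column names and the 98% threshold are illustrative
# assumptions only.
import pandas as pd

REQUIRED_COMPLETENESS = 0.98  # assumed minimum share of populated values

def activation_date_completeness(policies: pd.DataFrame) -> float:
    """Return the share of policies with a usable activation date."""
    dates = pd.to_datetime(policies["activation_date"], errors="coerce")
    return dates.notna().mean()

policies = pd.DataFrame({
    "policy_id": [101, 102, 103, 104],
    "activation_date": ["2017-01-05", None, "2017-02-11", "not captured"],
})

score = activation_date_completeness(policies)
if score < REQUIRED_COMPLETENESS:
    print(f"Activation date is only {score:.0%} populated; churn results may be unreliable.")
```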


Is there a common framework for understanding what the data from the different sources means from a business standpoint, and who owns which parts of the data? For example, if the approval/activation date of a policy is available from both the policy system and the CRM system, which of them is the correct owner? Is the definition and understanding of this attribute common across the organization?


Related: Why artificial intelligence could become a P&C game-changer


The typical scenario in a big data effort is that the team consists of technologists and data scientists who end up having to address the data quality, integrity and governance issues before any useful results can be produced. This unplanned effort not only derails big data initiatives but also demotivates the team, because members have to dive deep into areas that are not their core expertise or interest.


A common challenge is that a big data effort is frequently initiated with a limited understanding or assessment strategy. (Photo: iStock)


No. 1: Lack of easy, timely access to the results

Typically, more attention is given to the technology that enables ingestion and storage of big data than to the specific requirements of access by business users. The thinking is that the bigger challenge is to source the data from different systems and then store it in a suitable big data platform; once that is done, the platform will enable access by business users.


This is true. However, will the data be available in a timely manner to each group of business users? Will business users be able to "slice and dice" the data without having to rely on technologists? Will they have a set of relevant "out-of-the-box" analytics they can leverage, or will they have to build them? Will the analytics be operationalized to work within current processes and business practices, or will users have to change the way they work to leverage the analytics?


Many of these questions are left unanswered until late in the game. This unfortunately means that business users are left with a platform that has tremendous potential they are not able to exploit.


Related: From FNOL to settlement — data matters


So, what should be done?

In the challenges described above, there is a common theme: beyond the technical aspects of a big data initiative, a unified understanding of the data, its quality and integrity, and its accessibility for analysis are essential to the success of such efforts.


Data quality: It is crucial to include in the planning for such efforts a clear definition of the data quality and integrity requirements.


The data quality assessment should incorporate determining the quality of the data at the source as well as the quality of the data exchanged between systems, including an approach to validate data quality on an ongoing, automated basis. While data within a source may be of good quality at the moment, that may change as the source environment changes over time and as data is moved from that source to other systems. With a standardized, auditable and automated end-to-end data quality framework, insurers can remain vigilant and catch errors before they impact the business.
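
For illustration, an automated validation step of this kind might look like the following Python sketch, which applies a small set of rules to every batch exchanged between systems. The rules and field names are assumptions, not a reference to any specific product.

```python
# Minimal sketch of automated data quality rules that can run on every data
# load between systems. Field names and rules are illustrative assumptions.
import pandas as pd

def check_not_null(df: pd.DataFrame, column: str):
    return df[column].notna().all(), f"{column} contains nulls"

def check_unique(df: pd.DataFrame, column: str):
    return df[column].is_unique, f"{column} contains duplicates"

RULES = [
    lambda df: check_not_null(df, "policy_id"),
    lambda df: check_unique(df, "policy_id"),
    lambda df: check_not_null(df, "activation_date"),
]

def run_quality_checks(df: pd.DataFrame) -> list:
    """Apply every rule and return a message for each failure."""
    failures = []
    for rule in RULES:
        passed, message = rule(df)
        if not passed:
            failures.append(message)
    return failures

# Example batch exchanged between the policy system and the analytics platform.
batch = pd.DataFrame({
    "policy_id": [1, 2, 2],
    "activation_date": ["2017-03-01", None, "2017-03-04"],
})
print(run_quality_checks(batch))
# ['policy_id contains duplicates', 'activation_date contains nulls']
```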


Data governance: It is crucial that a data governance approach is part of any big data initiative.


Data governance capabilities can bridge the business and technical divide by delivering transparency into all aspects of an insurer's data assets, from the data available, its owner/steward, lineage and usage, to its associated definitions, synonyms and business attributes. Full transparency allows business decision makers to gain valuable insight not only into the details of their data assets, but also into the risks associated with their use across business applications.
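
As a hypothetical illustration, even a lightweight catalog record, such as the Python sketch below, captures the steward, lineage and business definition that give decision makers that transparency. The structure and names are assumptions, not a particular governance product.

```python
# Minimal sketch of a governance catalog record for one data asset,
# capturing steward, lineage and business definition. All names and the
# structure are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DataAsset:
    name: str
    definition: str
    steward: str
    lineage: List[str] = field(default_factory=list)   # upstream systems, in order
    synonyms: List[str] = field(default_factory=list)

activation_date = DataAsset(
    name="policy_activation_date",
    definition="Date the policy was approved and became active.",
    steward="Policy Administration",
    lineage=["policy_system", "enterprise_data_lake", "churn_analytics_mart"],
    synonyms=["approval_date", "effective_date"],
)

# A business user can trace where the attribute comes from before relying on it.
print(" -> ".join(activation_date.lineage))
```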


Accessibility to data and results: The data and any deployed analytics must be accessible to business users in a relevant, flexible and timely manner.


Finding an analytics "solution" is far more important than finding an analytics "tool." While tools generally come with a great amount of flexibility and capability, it is up to the business user either to rely on technologists to produce useful information or to spend considerable effort and time learning to exploit all the capabilities of the tool. Neither approach is scalable. It is important to select a solution that:

  • Seamlessly operationalizes or integrates the analytics into current processes or workflows;
  • Comes with a set of "out-of-the-box" analytics and results that business users can leverage and distribute in an intuitive manner from day one, in addition to other powerful capabilities that business users can learn over time; and
  • Provides the data to business users in a timely manner rather than leaving it to the users to figure out how best to extract the data from the platform.

With the right combination of analytics, data quality and data governance, insurers can deliver successful big data initiatives.


Ravi Rao is a senior vice president of pre-sales at Infogix, a pioneer of automated data quality, data governance and advanced analytics solutions. He can be reached at (630) 505-1800.


See also:


Machine learning for UBI: An optimal path to insurance ratemaking


Execs look to artificial intelligence (AI) for insurance revolution


The AI paradox: What does it mean for insurers?
