Becoming analytics-enabled is one of the most important evolutionary steps an organization can take. The rewards of data-driven decision-making can be a powerful boost to the bottom line. For insurance companies, this may include using underwriting predictive models to increase profitability through more granular pricing, driving a six- to eight-point reduction in loss ratios. On the claims side, predictive models have helped insurers better segment and triage high-severity workers’ compensation and bodily injury claims, driving a four- to ten-point reduction in claims spend.
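To illustrate how a few points of loss ratio flow through to the bottom line, consider the following sketch. The premium volume and starting loss ratio are hypothetical assumptions, not figures from this article; only the six-point reduction echoes the range cited above.

```python
# Illustrative only: how a six-point loss ratio reduction translates to dollars.
# Premium volume and starting loss ratio are hypothetical assumptions.

earned_premium = 100_000_000      # assumed $100M book of business
loss_ratio_before = 0.65          # losses / earned premium before modeling (assumed)
loss_ratio_after = 0.59           # a six-point reduction, per the range cited

losses_before = earned_premium * loss_ratio_before
losses_after = earned_premium * loss_ratio_after
underwriting_gain = losses_before - losses_after

print(f"Annual loss savings: ${underwriting_gain:,.0f}")  # prints: Annual loss savings: $6,000,000
```

On a hypothetical $100M book, each point of loss ratio is worth $1M a year, which is why even modest model-driven improvements can justify the implementation effort described below.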
An important part of the analytics journey is overcoming the numerous challenges an organization encounters when navigating the end-to-end development and deployment of predictive models. Model development (e.g., data assessment, data acquisition, data cleansing), scoring engine development (e.g., scoring engine and database design, development, testing, deployment), and business implementation (e.g., strategy formation, change management, tools for measuring business results) are among the common challenges organizations should consider.
Some of the concerns the authors have observed in the development and implementation of advanced analytics include:
1. Executive Ownership
Without buy-in from senior leadership and a clear corporate strategy for integrating predictive models, advanced analytics efforts can end up stalled at model development. In order to be effective, analytics efforts should involve the key executives who can help drive acceptance and change throughout the organization. Senior leaders should insist there be a clear correlation between the actions to be taken through model implementation and the expected business benefits to be realized. Without accountability for a targeted return on investment, organizations risk spending a lot of time “doing” versus “getting things done.”
2. IT Involvement
Failure to involve IT from the very beginning of the analytics journey can lead to significant issues down the road if technology gaps and limitations aren't understood up front. Modelers may find a way to get access to internal and external data, but without the help and involvement of IT, it is almost impossible to bring the models to life in the day-to-day operation of the organization.
3. Available Production Data vs. Cleansed Modeling Data
Access to historical data for model development is very different from access to real-time data in production, and a strong model is only as good as its ability to be practically implemented within the technology infrastructure. Real-life limitations may restrict the data that's available for historical modeling. Sometimes a proxy variable can be used for modeling until the data is available. Analytics initiatives often risk being stymied by the belief that data for modeling must be perfectly clean and organized. Predictive model development is not an accounting exercise, but rather a statistical process where numerous techniques allow the “dirt in data to be washed away.”
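As a minimal sketch of what "washing away" dirt can look like in practice, the snippet below applies two standard techniques: median imputation plus a missing-value indicator flag, so the model can still learn from the fact that a value was missing. The field names and values are hypothetical; a real pipeline would handle many more cases.

```python
from statistics import median

# Hypothetical claim records with gaps, as often seen in production source systems.
raw_claims = [
    {"claimant_age": 34, "reported_lag_days": 2},
    {"claimant_age": None, "reported_lag_days": 45},   # age missing at extract time
    {"claimant_age": 51, "reported_lag_days": None},   # lag not yet captured
    {"claimant_age": 47, "reported_lag_days": 7},
]

def impute_with_flag(records, field):
    """Fill missing values with the field median and record an indicator flag,
    so missingness itself remains available as a predictive signal."""
    observed = [r[field] for r in records if r[field] is not None]
    fill = median(observed)
    for r in records:
        r[f"{field}_missing"] = int(r[field] is None)
        if r[field] is None:
            r[field] = fill
    return records

for col in ("claimant_age", "reported_lag_days"):
    impute_with_flag(raw_claims, col)

print(raw_claims[1])  # age imputed to the median of observed ages, flag set to 1
```

The point is not the specific technique but the principle: statistical treatments like these let modeling proceed on imperfect data rather than waiting for a perfectly clean warehouse.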
4. Project Management Office (PMO)
Lack of clear ownership of the end-to-end journey is a common stumbling block for organizations that have struggled (and failed) in implementing their predictive models. Without the right project management structure in place, a clear cadence of project milestones, and the ownership of deliverables by pre-identified business owners, the project could be doomed before it starts. Most importantly, the PMO must be able to connect with all interested parties and adopt an agile approach.
5. End User Involvement and Buy-In
Lack of end user involvement in the planning, design and ultimate rollout of the predictive models can be detrimental to the effort. For underwriting or claims models, involving underwriters, marketing, actuaries, claims adjusters, nurse case managers and special investigative unit (SIU) resources early in the process is critical. End users also have more insight into the business process and may be able to better identify potential gaps or roadblocks to successfully incorporating models in day-to-day operations. If the end users feel as if they have a stake in the predictive model rollout, then the company may be more likely to realize the potential financial benefits. If done correctly, some of the early doubters can eventually become analytics advocates.
6. Change Management
Organizations often fail to understand how predictive models change the current business and technology operations — policies, procedures, standards, management metrics, compliance guidelines and the like. Without the proper design, development and rollout of training materials to address the impacted audiences in the field and home office, the analytics journey can come to an abrupt end. Educating end users and other related stakeholders on how the model will be used on a day-to-day basis, and how their life may change, is important. A communication plan should be developed to answer frequently asked questions (FAQs), address common concerns, and help end users appreciate the strategic vision of the organization. Change management doesn't start and end with training; it begins on day one and lasts well beyond the rollout of the models.
7. Explainability vs. the “Perfect Lift”
It is important to balance building a precise statistical model with the ability to explain the model and how it produces results. What good is a non-linear model or complicated machine learning method if the end user has no way to translate the drivers of the score and reason codes into actionable business results? Experience shows that less complex statistical modeling methods often yield results similar to those from more complex approaches, and a small sacrifice of predictive power can result in marked improvement in the explainability of technical model recommendations for end users.
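One reason simpler models translate well into reason codes can be sketched as follows: with a linear scoring formula, each feature's contribution is just its coefficient times its value, so the largest contributors become the reason codes directly. The feature names and coefficients below are hypothetical, purely to show the mechanics.

```python
# Hypothetical coefficients from a simple linear claims-severity scoring model.
coefficients = {
    "prior_claims_count": 0.8,
    "attorney_involved": 1.5,
    "days_to_report": 0.05,
    "claimant_age": -0.01,
}

def score_with_reasons(features, coefs, top_n=2):
    """Score a record and return the top positive contributors as reason codes."""
    contributions = {name: coefs[name] * value for name, value in features.items()}
    score = sum(contributions.values())
    # Reason codes: the features pushing the score up the most.
    reasons = sorted(contributions, key=contributions.get, reverse=True)[:top_n]
    return score, reasons

claim = {"prior_claims_count": 3, "attorney_involved": 1,
         "days_to_report": 10, "claimant_age": 45}
score, reasons = score_with_reasons(claim, coefficients)
print(round(score, 2), reasons)
```

With an opaque ensemble or deep learning model, producing an equally faithful decomposition requires additional machinery, which is the explainability cost the trade-off above weighs against a few points of lift.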
Insurance Company Size Matters
Every company is different, and designing a successful approach to implementing predictive models within the business process can vary widely. What works for a large national insurance carrier may not work for a regional mutual insurance company and vice versa. Some differences include:
1. Communications and Training
When it comes to change management, larger companies may struggle with executing communications and training plans across an often siloed, functionally diverse and geographically spread organization. A major challenge involves providing tailored communication and training to address business processes for multiple stakeholders with different expectations and office cultures. Larger companies must also navigate corporate communications protocols, a constraint smaller companies typically do not face.
Small companies may underestimate the effort involved in change management and may discount the importance of multiple communications, gaining stakeholder buy-in, and conducting effective training. Small-company staff also risk being stretched too thin to provide adequate dedication and focus, since they are often involved with other aspects of the project on top of their day-to-day responsibilities.
2. Competing Initiatives and System Challenges
Large companies may also face additional challenges when working across new and legacy systems. Involving technical and business subject matter experts (SMEs) in the model-build and implementation phases is extremely important to get the full view of the current and target state across each impacted business process. Larger companies may also find themselves juggling multiple initiatives at once (e.g., rolling out a new claims management or policy administration system along with predictive models), which can be a challenge for resource allocation. Another consideration is how predictive analytics and other initiatives will affect one another. When related projects are underway, a common approach is a multi-phased future state that starts with a semi-integrated predictive analytics solution and works around the affected initiatives.
3. Resource Constraints
Large companies may also face challenges getting the right resources, and especially the decision-makers, involved in the process. During the model-build phase, it's typically easy to identify who needs to be involved (e.g., analytics team, actuaries, data SMEs). Making decisions on how to implement can be more complicated. These decisions need to involve business unit leads and, in some cases, upper management — busy individuals who may not have the dedicated amount of time needed for implementation discussions and decisions.
While it may be easier for smaller companies to get the business unit leads and even executives involved in a predictive analytics project, they may suffer from having too many people involved, since project leaders may not want to leave anyone out of the decisions.
4. End User Buy-In and Model Use
Getting the most benefit from predictive analytics is a challenge generally faced by all companies, but it may be more complicated in smaller companies due to sensitivity around expectations for using the model. While some companies tend to think of a predictive model as an additional tool to help make better decisions, the degree to which employees are expected to follow the model can vary significantly.
At smaller companies, employees may worry about their roles being replaced by predictive analytics. Models are not typically intended as replacements for people, but rather as helpful tools for making more informed, objective, metrics-based decisions. This balanced approach and intent should be continuously communicated to all project participants as an essential foundational philosophy for organizational buy-in and effective change management.
As Larry Winget said in It's Called Work for a Reason, “Knowledge is not power; the implementation of knowledge is power.” Successfully building and deploying predictive analytics goes far beyond the development of a highly predictive model and a robust scoring engine technology. It is a holistic endeavor that requires a focused effort on practical implementation within the business operation as well as organizational, people and process considerations for what is most effective for end users and the broader corporate culture.