If there is a horizontal technology application other than predictive analytics being adopted by more members of the financial services sector, then Ellen Carney doesn't know what it possibly could be. "This is it," says Carney, a senior analyst for Forrester.
The level of awareness of analytics in the insurance industry is surging, agrees Deb Smallwood, founder of Strategy Meets Action (SMA), but its full capabilities have yet to be realized. "Companies are experimenting with certain pieces, but the wave of usage and true benefits hasn't been seen yet," according to Smallwood.
From an operational standpoint, carriers can look at how a customer or product is performing and prioritize that particular segment, indicates Carney. "The possibilities are available to insurers, but many of them remain untapped," she says. "It's restricted only by innovation and imagination."
In terms of assessing the value Pinnacol has received from the tool, Isakson reports his company's goal was to refine its business model.
"We'll be able to analyze the success of our strategy over the long term," he says.
But because how a carrier deploys the model and what its business users' specific objectives are will vary, each carrier must define its goals clearly upfront in order to be able to measure success, he adds.
The good news for insurers, points out Smallwood, is that in virtually all the case studies she's seen involving predictive analytics, carriers actually can pinpoint how the models have been successful.
"I just spoke with an insurance CIO, and [the company's] increase in terms of [policy] applications, its growth, its profitability, and its change in combined ratio all can be tied back to the pricing precision and straight-through processing [it achieved through analytics]," she says.
For carriers using analytics in the underwriting process, Smallwood suggests there is a process that needs to be followed. The first step involves taking the available data, cleaning it, validating it, and deploying it. "Once they deploy it, if they have purchased the right tools and put in the right processes, they can monitor [the models] and then refresh them," she says.
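The monitor-and-refresh loop Smallwood describes can be sketched in a few lines of code. Everything below — the field names, the loss-ratio drift check, and the 0.05 tolerance — is an illustrative assumption, not any carrier's actual process:

```python
# A minimal sketch of a clean/validate -> deploy -> monitor -> refresh
# loop for an underwriting model. Data and thresholds are invented.

def clean(records):
    """Drop rows that fail basic validation (missing or invalid fields)."""
    return [r for r in records if r.get("premium", 0) > 0 and "losses" in r]

def loss_ratio(records):
    """Incurred losses over earned premium for a batch of policies."""
    losses = sum(r["losses"] for r in records)
    premium = sum(r["premium"] for r in records)
    return losses / premium

def needs_refresh(predicted_lr, actual_lr, tolerance=0.05):
    """Flag the model for recalibration when actual experience drifts
    beyond tolerance from what the deployed model predicted."""
    return abs(actual_lr - predicted_lr) > tolerance

batch = clean([
    {"premium": 1000, "losses": 600},
    {"premium": 2000, "losses": 1500},
    {"premium": 0, "losses": 50},      # invalid row removed by clean()
])
actual = loss_ratio(batch)             # 2100 / 3000 = 0.70
print(needs_refresh(predicted_lr=0.62, actual_lr=actual))  # drift of 0.08
```

The point of the sketch is only the shape of the process: without an in-house monitoring step like `needs_refresh`, a carrier has no trigger telling it when the model is due for a refresh.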
Many companies start small--with, say, an insurance credit score. "When they don't have the models in-house or the knowledge and experience [to use the models] in-house, they don't know how to monitor it," says Smallwood. "They have to hire consultants to come back and relook at the models, and they end up changing them on a yearly basis."
In some cases, the insurer starts to panic if it doesn't see immediate results, notes Smallwood. "A lot of times when predictive analytics is first used, your book actually may drop because you are screening business differently, and if you don't change the behavior of your agents sending you business, your book will decrease," she says.
The important point insurers have to understand, though, is the quality of their book of business will increase, which means profits ultimately will grow.
"When the book starts to drop, if [carriers] really don't understand the models, they start to freeze and use them less or they don't use straight-through processing and become paralyzed," Smallwood says. "It means [carriers] need to have some knowledge of the models in-house and the proper tools to monitor, recalibrate, and refresh."
Depending on the model, carriers are going to look at their loss ratio and how they segment their business and their strategy. "For us it was a book-wide application," says Isakson. "It affects the way we book our business."
Other carriers might target a specific segment of their book and over time see improvements in retention or profitability or both. "For us the aggregate picture doesn't look much different, and that wasn't our intention," Isakson says. "What we wanted to see was, underneath everything, are we doing a better job of recognizing our exposure to risks and where we are placing our accounts? We now are at the period when we should be able to see whether we have been successful in that."
It takes time to measure success on the underwriting side, though, concedes Isakson.
"For us it takes time to see results because you have to analyze your business problem and then put a model in place," he says. "And it takes time--particularly with workers' compensation--for claims to develop and mature. That's what we are looking for: What is the impact from a loss-ratio standpoint? That takes a bit of time to see some development and whether you are successful."
Neeraj Arora, director of analytics at Farmers Insurance, believes there are varying levels of "aha moments" that come with analytics.
Many times there is a gap between conventional wisdom and what actually goes on, he points out. "For example, an organization might be losing customers faster as it increases prices. The gut reaction is to blame the price increase, but analytics could show even the customers who did not get a price increase have high attrition levels similar to those who did.
"The real reason for increased attrition, therefore, would be something else. That's the first 'aha' analytics is able to provide--differentiating between what is perceived vs. what is really happening in the business," says Arora.
The second level of understanding, continues Arora, is the ability to differentiate between causal and correlational impact.
"Often, decision-makers confuse the two," he says. "For example, customers who pay in full have higher retention compared with those on monthly payments. However, will a customer's retention improve if you encourage a monthly payer to move to paid-in-full status? The question is, does paying in full cause retention to improve, or do people who tend to stay longer also tend to pay in full? The latter implies that this situation is co-relational--they exist together because of certain other characteristics of the customers. When one is able to separate the impact of one thing over the other using analytics, what is left is sometimes very counter-intuitive and that's the 'aha' moment."
Smallwood indicates one of the primary areas where predictive analytics would benefit the insurance industry is market intelligence--aligning products and distribution channels with the right customers.
In underwriting, that encompasses everything from risk appetite to risk analysis and pricing precision; on the claims side, at a minimum, it covers fraud detection. "What it does for underwriting and claims can allow for more automation," says Smallwood. "Not only can it help reduce expenses, but what it ultimately does is improve the overall quality of your book of business."
Arora believes that as business users become more attuned to the success of analytics, they question its value less, but early in the process people might harbor healthy skepticism.
"When analytical teams use a lot of detailed statistical measures and graphs in their communications, it can distance the business group even further," he says. "A good way to get the business on board is to predict the results for the next few months, share them now with the business users, and then validate them once they materialize. That litmus test is far more powerful than any statistical proof. Of course, you cannot use this for long-term predictions or for things that take many months to realize."
KEYS TO SUCCESS
The temptation with any modeling is to show it off, but Arora cautions against being overly ambitious. "In any kind of predictive modeling you can try to get more lift by over-fitting using smaller and smaller segments," he says. "However, if the end goal can be reached by moderate segmentation, the models will be more stable and accurate."
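Arora's over-fitting warning can be demonstrated with a small simulation. Here every segment actually shares one underlying claim rate, so slicing the book into ever-smaller segments only fits noise; the segment counts and rate below are invented:

```python
# Toy illustration: estimating a rate per tiny segment (100 segments of
# 5 policies) is far less stable out of sample than moderate
# segmentation (5 segments of 100 policies). Data are simulated.
import random

random.seed(1)
TRUE_RATE = 0.3   # every segment shares this underlying claim rate

def simulate(n_segments, per_segment):
    """Bernoulli claim outcomes for each segment."""
    return [[1 if random.random() < TRUE_RATE else 0
             for _ in range(per_segment)]
            for _ in range(n_segments)]

def mse_of_estimates(train, test):
    """Fit a per-segment rate on train, score squared error on test."""
    err = 0.0
    for tr, te in zip(train, test):
        est = sum(tr) / len(tr)
        actual = sum(te) / len(te)
        err += (est - actual) ** 2
    return err / len(train)

fine_train, fine_test = simulate(100, 5), simulate(100, 5)
mod_train, mod_test = simulate(5, 100), simulate(5, 100)

# Fine segmentation chases noise; its out-of-sample error is worse.
print(mse_of_estimates(fine_train, fine_test) >
      mse_of_estimates(mod_train, mod_test))
```

The extra "lift" the tiny segments appear to offer in training is exactly the instability Arora cautions against.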
The second key is to concentrate on goals. "You should be able to narrow your modeling world into what you truly need to target," says Arora. "It is much more difficult to have a model that predicts accurately throughout the population than a model that is more tailored to your objective. The analytics team needs to understand what will drive the decision and concentrate on improving the model at that tipping point."
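Arora's "tipping point" can be illustrated by scoring a model twice: once across the whole book, and once only on the risks near the decision cutoff, where the model actually changes the outcome. The scores, outcomes, and 0.5 cutoff below are invented:

```python
# A model can look fine on overall accuracy yet perform poorly exactly
# where the underwriting decision flips. All numbers are illustrative.

def accuracy(pairs, cutoff):
    """Fraction of risks the rule (decline if score >= cutoff) gets
    right; outcome 1 means the policy went on to have a loss."""
    correct = sum(1 for score, loss in pairs
                  if (score >= cutoff) == (loss == 1))
    return correct / len(pairs)

# (risk score, had a loss) for a small hypothetical book.
book = [(0.1, 0), (0.2, 0), (0.3, 0), (0.45, 1), (0.5, 0),
        (0.55, 1), (0.6, 0), (0.8, 1), (0.9, 1), (0.95, 1)]

CUTOFF = 0.5
overall = accuracy(book, CUTOFF)

# Restrict to the "tipping point": risks scored near the cutoff, where
# the model is actually making the call.
band = [(s, l) for s, l in book if abs(s - CUTOFF) <= 0.1]
near_cutoff = accuracy(band, CUTOFF)

print(overall, near_cutoff)   # 0.7 overall, but only 0.25 at the cutoff
```

The easy risks at the extremes flatter the global metric; concentrating improvement on the band around the cutoff is what actually drives better decisions.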
The third key is to understand the data. "You need to be careful about things such as what source to use, what time period, and what kind of sample," says Arora. "For example, sometimes there are one-time events that occurred in the time period you selected and have a big impact on the outcome. If you fail to identify and normalize for these, the model predictions will be skewed.
"It's one thing to make a good model and another to make a good model that can be easily implemented," he continues. "A lot of times people make good models, but once they get to the implementation stage, they have to compromise, and [the models] lose a lot of power." Arora maintains the 80/20 rule applies to analytics, as well. "You could need 100 variables in a predictive model to get great results," he says. "You also could get 80 percent of the results by using just 10 variables. Though it finally depends on what you are trying to go after, more often the simpler model is the better choice. It is going to be a lot more stable, a lot less prone to variances, a lot easier to implement and validate, and it still is going to drive a whole lot of value."
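Arora's 80/20 point can be put in back-of-the-envelope terms: if a handful of variables carry most of the predictive signal, a model that keeps only those keeps most of the value. The variable counts and importance weights below are invented to match his 100-versus-10 example:

```python
# Hypothetical squared "importance" (signal variance contributed) of
# 100 candidate variables: 10 strong ones and 90 weak ones, assumed
# independent. All weights are invented for illustration.
strong = [4.0] * 10
weak = [0.1] * 90
total_signal = sum(strong) + sum(weak)   # 40 + 9 = 49

# Share of the signal a 10-variable model retains.
share = sum(strong) / total_signal
print(round(share, 2))   # 0.82
```

Under these assumed weights, dropping 90 of the 100 variables gives up less than a fifth of the signal while making the model far easier to implement, validate, and keep stable.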
Because analytics presents a different way of looking at data, a new level of expertise is needed within the enterprise. "Companies have data analysts, but the old way of reporting is always looking at past data," says Smallwood. "What predictive analytics does with the data is try to predict outcomes. It's the next level of sophistication, and it really requires experience. Companies don't necessarily have the modeling tools to do what-if scenarios, tweak the analytics, or model them. They are looking at things through their traditional lenses. It really takes a skill set."
Many companies actually hire people from outside the insurance industry, Smallwood points out, and bring them into the department to plant the seed and grow that expertise in-house.
"What I've seen is insurers are going to the retail market, where marketing campaigns with predictive analytics have been done, and they bring someone from there into their own marketing department to help with that distribution channel," she says.
"The retail industry has been doing [analytics] with catalogs, online, stores, and advertisements. There are not a lot of people who know [predictive analytics] in insurance unfortunately," she says.
Arora is an example of the kind of analytics professional Smallwood describes. He began his work in analytics at Capital One before joining Farmers. "Insurance seems to be lagging behind in the use of advanced analytics. If you think about it, insurers were the ones that started using the power of data earlier than anybody else. But other industries moved to the next level while the insurance industry stayed with its 100-year-old methodologies," he says.
Smallwood adds that carriers can buy that knowledge, but they still have to transition it in-house and support it with a training program.
However, Carney predicts, as carriers hone their skills and their ability to apply the data to their business, analytics is going to become even more valuable in terms of the insights it generates.
"I absolutely could see an analytics czar within the organization," says Carney. "It could be an interesting role. As people get more comfortable with the interpretation and application of [analytics], you could see specialties emerge that could be a wonderful career track for someone within the organization. I would suspect in terms of the paucity of skills around fraud, this could involve a high skill set for people."
When asked about which is easier to develop--good data or a sharp staff of analytics professionals--Arora replies he doesn't believe one can come to an easy conclusion.
"They are both challenging in their own right," he says. "Neither is an easy task. Most often you will see both need to be developed simultaneously. You need great analytical people to help acquire and maintain good data, and you need good data to engage the analytic team to do what it does best."
WAITING AND WATCHING
One roadblock on the underwriting side is models generally have a filing attached that has to be approved by regulatory agencies. "That's not something you are going to put in place and a month later see results and tweak again," says Isakson. "You have to have something in place and watch it mature over time with finite adjustments every couple of years."
But the long wait makes it imperative insurers get it right from the beginning. "For us, knowing it takes longer for things to develop and see results, we can't afford to miss and immediately turn around and refile," says Isakson.
"We spend a lot of time on the validation upfront before we deploy the model through our book of business and [rely on] historical results to make sure, for what we are deploying, we can predict how it will behave," he adds.
At Pinnacol, the models on the underwriting side are studied annually. "Unless your market is dynamic and changing dramatically, you probably are building something you want to monitor; but give it some time first to see the impact before you start tweaking it," recommends Isakson.
"You really need to allow it to flow through your book of business, so a window of two to three years probably would be ideal before making substantive changes unless you can't avoid making changes," he continues. "You may have to make refinements here and there along the way but probably not much. We haven't had to make any refinements yet because [the model] has behaved like we expected during the evaluation. On the results side, we are starting to look at that because we are two years in and should start to quantify the impact from an outcome standpoint. If you are constantly tweaking it, you don't have a predictive model."
Tweaking the models is not a constant process, but Carney believes it is a regular occurrence. "As your business changes and you identify new behavior or characteristics or the tool reveals something you don't expect, you have to react," she says.
GIVING DATA ITS DUE
The amount of data available is not an issue for Pinnacol, according to Isakson, nor is it an issue for most insurance companies. "Overall in the industry I believe we have a sufficient amount of data," he says. "I think that's especially true in workers' compensation, where it's a heavily regulated line of coverage and doesn't dramatically move year to year. So, the data is fairly well established."
A lot of insurers have addressed data issues through what they've done with policy administration systems over the last few years, remarks Carney. Now, they can fix other areas because they are more confident in the data quality throughout the enterprise. However, "if they've not gone through that exercise, the data has issues, and they are not going to be very confident of the outcome," she warns.
Size matters, too, Carney continues. "Smaller carriers are definitely looking at [analytics] and following the tier-one carriers," she says. "When you are competing with a State Farm or an Allstate, you have to use technology to help you overcome their size and skill. A lot [of smaller carriers] are looking at predictive analytics, and if they can have it delivered like SaaS, that can overcome some of the cost hurdles for smaller carriers or those that are focused by region or line."
The issue always seems to come back to the data, though, and Smallwood maintains that until the business side really understands that insurance is data-centric, it's going to remain an uphill battle.
"I went to a company and interviewed 15 executives--top line and then to the next level down," Smallwood says. "Everyone on the business side whined about lack of responsiveness and lack of quality data, and when I looked at the list of projects in the queue, there were no data initiatives, no governance, and no stewardship. Sometimes people think if they put in a new policy administration system or a new quote system, it automatically will fix the data. You have to take a two-prong approach. You have to look at your company through the data and look at it through the process." TD