Recent weather-related events in the United States have shown how unready we generally are for crisis situations. When a crisis does occur, people turn to insurers, so it remains imperative for insurers to follow that old Boy Scout motto: Be prepared. The focus in recent years for many property/casualty carriers has been to examine risk exposure–particularly in storm- and earthquake-sensitive states. But those truly prepared for crises are focused on all levels of catastrophe-related risk within the enterprise.
“A lot of insurers are doing the catastrophe risk management, and several companies probably have had consulting firms do an evaluation study, but the difference is it takes a lot longer to build the [risk management] capability internally,” says Don Mango, director of research and development for GE Insurance Solutions (GEIS). “It's not just a software challenge. It's a training challenge; it's educating your management; it's getting people in the product-line areas to believe in the input; it's getting connections between planning, reserving, underwriting, and pricing, and they all have to be connected, and all that information ends up getting stored in these internal risk models.”
The organizational challenges are more difficult than the technology issues. “The software is well past what we need,” asserts Mango. “It's actually more sophisticated than we are capable of handling right now, honestly. The problems we're having all are related to organization. How do you get a company ready to use this in all of its decision-making?”
TowerGroup is seeing an emerging focus on the broader utilization of predictive analytics in insurance, according to Mark Gorman, strategic research adviser. “What these predictive analytics tend to do is identify key characteristics that contribute to the risk where those characteristics may not be priced adequately in the current model,” he says. This is used to segment the individual risks better, create more segments in the marketplace, and apply more pricing levels to those risks.
“[Carriers] are looking for differences in the data that will lead them to make either a more informed decision or a slightly different decision on the data than they would under normal circumstances just by applying rules,” Gorman indicates. “The advantage of the predictive analytics is it allows organizations to evaluate not just individual characteristics but the interplay of the characteristics with each other to give a much more informed decision about the relative risk of any individual risks.”
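To make Gorman's point concrete, here is a minimal sketch, in Python, of a model that scores risks on the interplay of characteristics rather than each one in isolation. The rating characteristics, synthetic data, and scikit-learn model are all illustrative assumptions, not any carrier's actual tooling.

```python
# Minimal sketch of scoring risks on characteristic interactions; field
# names and data are hypothetical, not any carrier's actual model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Two hypothetical rating characteristics per policy: building age (years)
# and distance to the nearest fire station (miles).
X = np.column_stack([rng.uniform(0, 80, 1000), rng.uniform(0, 10, 1000)])

# Synthetic losses in which the *combination* of old construction and a
# remote location drives risk more than either characteristic alone.
logit = -4.0 + 0.02 * X[:, 0] + 0.1 * X[:, 1] + 0.03 * X[:, 0] * X[:, 1]
y = rng.random(1000) < 1 / (1 + np.exp(-logit))

# interaction_only=True adds the age*distance cross term, letting the model
# capture the interplay Gorman describes rather than each column in isolation.
features = PolynomialFeatures(degree=2, interaction_only=True, include_bias=False)
model = LogisticRegression(max_iter=1000).fit(features.fit_transform(X), y)

# Score a new application: a 60-year-old building 8 miles out.
print(model.predict_proba(features.transform([[60.0, 8.0]]))[0, 1])
```

The cross term is what lets the fitted model treat an old building in a remote location differently than the two characteristics would suggest separately, which is the segmentation advantage Gorman describes.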
Risk Modeling
Greater New York Insurance is using its risk-modeling software to examine its cumulative exposure in certain areas. Kathleen Hurley, assistant vice president of IT, explains that the MapInfo software the carrier uses can be employed in a variety of ways. One is to enter a specific address to determine how many policies the carrier has written in the surrounding area. “For example, we can put in the Empire State Building [address], and we want to see how much risk we have within 1,000 feet or 2,000 feet of the building,” she says. The carrier's analysts study the results in two ways. One is by product, such as determining how much workers' comp exposure the carrier has in that area. They also can measure exposure levels, such as the number of employees on a workers' comp policy within that same radius. “We can do it by product and by exposure ranges,” notes Hurley. If someone requests a policy quote, Greater New York can enter the address and determine whether there already is a concentrated risk in that area.
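Here is a minimal sketch of the kind of radius query Hurley describes, assuming the policies have already been geocoded to latitude and longitude. The policy records are invented, and the haversine distance calculation is an illustrative shortcut, not the MapInfo product itself.

```python
# A sketch of a radius-based exposure query over geocoded policies.
from math import asin, cos, radians, sin, sqrt

def feet_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in feet (haversine, Earth radius ~20.9M ft)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 20_902_000 * 2 * asin(sqrt(a))

# Hypothetical in-force policies: (product line, latitude, longitude, exposure).
policies = [
    ("workers_comp", 40.7490, -73.9860, 150),        # employees on the policy
    ("property",     40.7484, -73.9857, 2_000_000),  # insured value
    ("property",     40.7130, -74.0060, 750_000),
]

# The Empire State Building, the example address in the article.
site = (40.7484, -73.9857)

# Accumulate exposure by product within a 2,000-foot radius.
concentration = {}
for product, lat, lon, exposure in policies:
    if feet_between(site[0], site[1], lat, lon) <= 2_000:
        concentration[product] = concentration.get(product, 0) + exposure
print(concentration)
```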
The carrier updates its data for the mapping system weekly. “We use a back-end policy and claims system and download all of our in-force policies and all the location information and upload it into MapInfo and do a refresh of the data,” Hurley says, adding it is simple to integrate the data. “[MapInfo] developed this mapping front end for us, and it basically uses an Access back end. We take our data from our AS/400, pull out the information we need, upload it, and take that file and run it through a program MapInfo gave us called a geocoder. What it does is take every location and give it latitude and longitude, and that way [MapInfo] can plot it on the map correctly.”
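The weekly refresh Hurley outlines amounts to a small extract-geocode-load pipeline. The sketch below assumes a simple CSV record layout and uses a stub in place of the MapInfo-supplied geocoder; file names and fields are assumptions.

```python
# A sketch of the weekly refresh: extract in-force policy locations, assign
# latitude/longitude, and emit a file for the mapping front end. The
# geocode() stub stands in for the vendor-supplied geocoder.
import csv

def geocode(address: str) -> tuple[float, float]:
    """Stand-in for the vendor geocoder that returns (lat, lon) per address."""
    lookup = {"350 5th Ave, New York, NY": (40.7484, -73.9857)}
    return lookup.get(address, (0.0, 0.0))  # a real geocoder resolves any address

def refresh(extract_path: str, mapping_path: str) -> None:
    # Read the extract of in-force policies (policy_no, product, address)
    # and write a plot-ready file with coordinates appended to each row.
    with open(extract_path, newline="") as src, open(mapping_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["policy_no", "product", "address", "lat", "lon"])
        for policy_no, product, address in csv.reader(src):
            lat, lon = geocode(address)
            writer.writerow([policy_no, product, address, lat, lon])

# refresh("inforce_extract.csv", "mapinfo_load.csv")
```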
CAT modeling gives companies a robust view of their potential exposures, according to Pat Teufel, a principal with the consultancy KPMG. High windstorm and earthquake areas are identified, and the insurance carrier's exposure in those areas is tracked effectively. “There typically are underwriting rules for what kinds of construction and what kinds of areas are most prone [to problems] and a recognition of what your existing exposure is in the area,” she says.
Carriers are finding new data that will prove to be instructive for risk evaluation, as well, Gorman believes. “Once you capture the data, we're seeing an increased utilization of data mining and business intelligence tools,” he says. Carriers also are running the models in real time as they receive the data. “As a stream of application data comes in, can they take this scorecard they've built, run that data through, and get the score on that risk at that point in time to make a risk decision?” he asks. “Clearly the technology is there now in the marketplace to provide that ability.”
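As a rough illustration of scoring a risk “at that point in time,” the sketch below applies a prebuilt points-based scorecard to an application as it arrives. The attributes, point values, and acceptance threshold are all illustrative assumptions, not any carrier's actual scorecard.

```python
# A minimal points-based scorecard applied to an incoming application,
# in the spirit of the real-time scoring Gorman describes.
SCORECARD = {
    "prior_losses_3yr": {0: 40, 1: 20, 2: 5},          # fewer losses, more points
    "years_in_business": {"<3": 5, "3-10": 20, ">10": 35},
    "sprinklered": {True: 25, False: 0},
}

def score(application: dict) -> int:
    """Sum the points each captured characteristic earns on the scorecard."""
    return sum(table.get(application.get(attr), 0)
               for attr, table in SCORECARD.items())

def decide(application: dict, accept_at: int = 70) -> str:
    """Score the risk the moment the data arrives and route it."""
    s = score(application)
    return f"accept (score {s})" if s >= accept_at else f"refer to underwriter (score {s})"

# One application from the incoming stream.
print(decide({"prior_losses_3yr": 0, "years_in_business": ">10", "sprinklered": True}))
```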
TowerGroup contends upward of 85 percent of all the risks in the personal lines area actually can be decided using scorecards and models, with the remaining 15 percent still referred to human beings. “That's a significant straight-through-processing increase over what we saw in a manual or rules-based environment,” says Gorman. “On the commercial side, it's not quite that high, but it's still better than a rules-based approach alone.”
History Lessons
Carriers can score a series of risks and rank them, from riskiest to least risky, and then use the scoring methodology to set cutoffs for pricing. “We do see on the underwriting side for P&C insurance the risk scores are being used in conjunction with the business rules in order to apply the segmentation,” says Gorman. “We're clearly seeing an increased utilization of external data sources in risk evaluation. We're also seeing an increased utilization of derived characteristics–characteristics where the inherent data either is combined or acted upon to create a derived characteristic that can be used for analysis.”
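A short sketch of the rank-and-cutoff step Gorman describes: scored risks are sorted from riskiest to least risky, and cutoffs map score ranges to pricing tiers. The scores, cutoffs, and rate factors are invented for illustration.

```python
# Rank scored risks and map score cutoffs to hypothetical pricing tiers.
risks = {"policy_a": 0.82, "policy_b": 0.15, "policy_c": 0.47, "policy_d": 0.91}

# Rank from riskiest to least risky.
ranked = sorted(risks.items(), key=lambda kv: kv[1], reverse=True)

# Cutoffs segment the book into pricing levels; each tier carries a rate factor.
TIERS = [(0.75, "surcharged", 1.40), (0.40, "standard", 1.00), (0.0, "preferred", 0.85)]

def tier_for(score: float) -> tuple[str, float]:
    for cutoff, name, factor in TIERS:
        if score >= cutoff:
            return name, factor
    return TIERS[-1][1], TIERS[-1][2]

for policy, score in ranked:
    name, factor = tier_for(score)
    print(f"{policy}: score {score:.2f} -> {name} tier, rate factor {factor}")
```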
Insurers have used outside data sources for years, including loss history, motor vehicle records, and credit information. But new data sources are emerging. “In commercial lines, we're seeing an increased utilization of external data available in the marketplace people are accessing through the Freedom of Information Act,” Gorman points out. “In personal lines, there are increased data sources, some of them around motor vehicle record information, some of them around geo-spatial positioning information. There also are some nontraditional forms.”
Regulatory Challenges
GE Insurance Solutions grew uneasy that the required-capital formulas mandated by the rating agencies did not reflect the risk of the carrier's own business, Mango says. The carrier also was receiving external pressure from some of the more advanced regulators, such as those in the United Kingdom. “There were pending regulatory changes in the UK that were going to require us to have an internal risk model,” he says. Those changes did come to pass in 2004, he adds. The combination of those two factors led GEIS to improve its modeling capability.
The regulatory changes in the UK called for an explicit risk management function within the company, reports Mango. The mandates, he says, included risk committees; documented evidence that the company's decision-makers were participating actively in risk management and that it was a driving force behind company decisions; and the development and use of an internal risk model to make capital decisions.
Fitting Pieces
GE Insurance Solutions used a combination of several technology pieces to achieve its model. “The first is vendor software,” says Mango, typically a graphically driven simulation engine. “It's sort of like a mapping tool because the configuration of the model is going to be specific to the company, so you need a flexible user interface,” he says. “There are a lot of complex relationships that need to be modeled.”
The second piece is fairly standard Microsoft Excel templates for storing a large amount of input data. “The input data has to be validated and linked to the planning systems,” explains Mango. “It has to balance.”
The third piece involves the post-processing work. “The vendor models can produce a lot of output–hundreds of megabytes if you want the full-blown details–but that's difficult to work with because the files get so big,” Mango continues. “We use MATLAB [software] to do a lot of our post-processing, and our application templates that use the output are Excel for the front end to MATLAB for the heavy computation.”
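In the same spirit, the sketch below shows one way such post-processing can work, written in Python rather than the MATLAB and Excel combination GEIS uses: stream the large detail file once, collapse it to one aggregate loss per simulated trial, and hand back only small summary statistics. The file layout and the metrics chosen are assumptions.

```python
# A sketch of post-processing large simulation output: reduce hundreds of
# megabytes of detail rows to per-trial totals, then summarize. Assumed row
# layout: (trial_id, line_of_business, loss).
import csv
import numpy as np

def aggregate_by_trial(path: str) -> np.ndarray:
    """Collapse the detail file to one total loss per simulated trial."""
    totals: dict[str, float] = {}
    with open(path, newline="") as f:
        for trial_id, _line_of_business, loss in csv.reader(f):
            totals[trial_id] = totals.get(trial_id, 0.0) + float(loss)
    return np.array(sorted(totals.values()))

def summarize(losses: np.ndarray) -> dict[str, float]:
    # Small summary statistics of the kind fed back to front-end templates.
    var_99 = float(np.percentile(losses, 99))
    return {"mean": float(losses.mean()),
            "VaR_99": var_99,
            "TVaR_99": float(losses[losses >= var_99].mean())}

# summarize(aggregate_by_trial("simulation_output.csv"))
```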
One of the biggest challenges involved learning the vendor software models, Mango believes. “It's complicated, and if you don't do it right, it will waste a lot of computational resources for no benefit,” he says. “[Solutions] are difficult to validate and test because they are so complex. They typically have a set of preconstructed functional blocks, and you assemble these blocks together in the configuration you want. It's application software, not a development environment. There's not a lot of custom coding, but the problem is you are assembling very complicated models, and it is difficult at times to make sure you have assembled them correctly.”
Another challenge, Mango adds, was the data size used for validation testing. “You are talking about really big files, so big they almost grind Excel to a halt,” he says. “The files are like 20 to 30 megabytes. We don't have to have an integrated system now because it's primarily a small group of expert users within R&D, so we don't have to worry about having this ironclad start-to-finish integrated system yet, but we still have major technical challenges along the way.”
As for managing the size problem, Mango indicates his unit will insulate the users from the vendor software portion of the solution, using a combination of Cognos cubes and custom-built templates to let users browse and work with the output data and get a feel for the quality of the input. “Our approach is to filter the data as much as we can and put it in a self-contained, easy-to-use template and let the larger group that is interested in our work play with it that way,” he says.
On the Lookout
Terrorism models also are available today, but there have been far fewer events against which to compare model estimates with actual results than there have been for storms and earthquakes, where hundreds of years of scientific data are available. “The terrorism models are built in much the same way as the catastrophe models–using historical experience,” says Teufel. “The terrorism models have been built, at least initially, on panel discussions of what a potential event might look like. Some of the models don't assign a probability to an event of that kind. They concentrate instead on if an event of this type occurs, what is the damage that could result? We have more of a track record [with storms and earthquakes].”
Greater New York's use of modeling came as a result of 9/11. “We're located in midtown [Manhattan], and we saw what happened south of us with some of the other insurance companies,” says Hurley. “We wanted to keep an eye on what we were doing. There were some people who wrote only in [one specific] area, and that could be tragic, so we really wanted to take a look at where our exposures were. If the Empire State Building, God forbid, was the scene of a terrorist attack, what kind of exposure would we have?”
Mango states the internal risk-modeling products all come with a pre-defined capability to handle the output from the earthquake and hurricane models. “Every insurer and reinsurer does hurricane and earthquake modeling now, so that means the internal risk-modeling software people have must be able to include that output with almost no effort required by the user,” he says. “The real challenge now is to do the same level of quality in modeling we do for hurricane and earthquake for all the other insurance lines. They're starting on terrorism and workers' comp exposure to earthquake, but we still have a long way to go.”
Adds Teufel: “In much the same way as with weather-related catastrophe models, any time you have an event, the post-event analysis is particularly relevant in refining the model.”
Success or Failure
Teufel does not believe the models used for the Gulf Coast prior to Hurricane Katrina predicted the level of damages that actually occurred. “Frankly, that's as much a scope-of-the-models issue as it is model performance,” she says. “The models for the most part were intended to look at wind exposures and earthquake exposures. A major deficiency relative to Katrina is the whole notion of flood damage.” Katrina brought extensive flooding, and questions have arisen as to how that will impact claims, particularly how much of a claim is flood damage vs. wind damage. “Certainly, in any CAT modeling, the post-event analysis provides an opportunity to refine the models and possibly to extend the scope of what the models do,” she comments. “Flood is a prime example. Frankly, the models were intended to evaluate property risks only. They didn't contemplate the extensive damage that occurred.”
There are other ways to use the software, suggests Mango. “We think it's for planning more outbound reinsurance purchasing, and we think we can use it to benchmark pricing,” he says. “The real dream is to use this for required-capital setting, and that's where the UK is heading. The UK regulations basically say if you have a real good risk management function in your company and you have an internal risk model, you can use the results from that model to set your required capital.” This could mean either a lower required-capital number for a certain portfolio a carrier wants to write or room to write more business with a given amount of capital, Mango points out. “That's where the banking world is, but the insurance world is probably 10 years behind,” he notes. “However, that's where we're heading.”