Like many parts of the insurance industry, reserving is an evolving science. To maintain a clear picture of what a carrier must set aside to keep regulators happy–especially in today's volatile financial and regulatory climate–actuaries rely on tools such as enterprise risk management to assist them in mastering all relevant variables.
The role of an actuary is to protect an insurer's financial solvency and establish appropriate reserve and capital levels. Finding the right numbers never has been easier for actuaries–or harder. Technology has offered computing power actuaries have not seen before, but as the data piles up and new models are explored, actuaries are being asked to provide more than one answer to a single problem.
To determine the best reserving levels for carriers, actuaries have to know what everyone in the enterprise is doing. The collaboration needed today is best found in an enterprise risk management view, which many carriers have practiced for years without benefit of the three-letter acronym ERM.
David Brentlinger, chief actuary with OneAmerica Financial Partners, considers ERM one of the focal points for rating agencies and for a company's board of directors. “It allows us to get a better handle around things,” he says.
Actuaries spend most of their time now on interest rate risks and equity risks, points out Brentlinger, and have added areas such as credit risk, liquidity risk, operational risk, and compliance risk, particularly after the events of the last 12 months.
Mike McLaughlin, principal and global leader of actuarial and insurance solutions for Deloitte Consulting, believes reserving tools that increasingly will come into play for companies are centered around ERM.
“The biggest [carriers] tend to have more resources and more powerful models,” he says. “Some years ago, dynamic solvency testing and capital adequacy were big trends among property/casualty companies but never fully took hold because of the limited predictive capabilities. Nonetheless, we see those approaches coming back into more common use–a sophisticated model giving a range of outcomes to help [actuaries] arrive at how much capital is needed to support various lines of business.”
McLaughlin also sees stress testing–taking a look at the kinds of scenarios that might arise and what they do to operational risk–as an additional technique for actuaries. “They look at a combination of business circumstances, such as if there was a pandemic or a tornado hitting the offices or what if the employees can't come to work for whatever cause,” McLaughlin says. “How does a company continue to operate? What is the impact on revenues, expenses, and investments? I wouldn't say all companies do this, but the leading-thinking companies are looking at stress tests as a way to evaluate the riskiness of their business and how much capital to allocate.”
BOOTSTRAPPING
A popular technique Dave Otto, managing director for North America for the actuarial and consulting firm EMB, sees being used by insurers is known as bootstrapping. “What's nice about [bootstrapping] is it produces a range of outcomes,” he says. “It tells you not only what your point estimate should be but also the likelihood of other types of outcomes occurring.”
Bootstrapping allows for the variability in cash flows, notes Otto. “When we talk about reserves and say a company needs $100 million in reserves to pay out all the remaining claims,” he explains, “that's going to take a lot of time for that to occur. Is $100 million right, or is $50 million right? What's the range around that $100 million? So, in 30 years, after all the claims are closed, that is the exact amount of reserves that was needed. What's also important is how those reserves are going to be paid out over the next 30 years. If I say $100 million is needed for the remaining claims, how much am I going to need next year and the year after?”
Bootstrapping not only gives insurers the point estimate of what their cash flows will be every single year but also gives them variability around that figure. “A lot of methodologies actuaries are using right now give a range around the point estimate–where things ultimately will be,” says Otto. “Bootstrapping gives you the range around all the cash flows that make up that estimate over time.”
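The point-estimate-plus-range idea Otto describes can be illustrated with a toy simulation. The sketch below is a simplified bootstrap of a hypothetical paid-loss development triangle: it resamples the observed age-to-age link ratios with replacement, whereas production bootstraps typically resample the residuals of an over-dispersed Poisson chain-ladder model. All figures are invented for illustration.

```python
import random
import statistics

def complete_to_ultimate(triangle, factors):
    """Project each partially developed accident year to ultimate losses."""
    ultimates = []
    for row in triangle:
        value = row[-1]
        for f in factors[len(row) - 1:]:
            value *= f
        ultimates.append(value)
    return ultimates

def bootstrap_reserves(triangle, n_sims=1000, seed=42):
    """Resample observed link ratios to produce a distribution of total reserves."""
    rng = random.Random(seed)
    n_dev = len(triangle[0])
    # observed individual age-to-age link ratios for each development period
    links = [[row[j + 1] / row[j] for row in triangle if len(row) > j + 1]
             for j in range(n_dev - 1)]
    paid_to_date = sum(row[-1] for row in triangle)
    reserves = []
    for _ in range(n_sims):
        # one pseudo-world: resample each period's link ratios with replacement
        factors = [statistics.mean(rng.choice(l) for _ in l) for l in links]
        reserves.append(sum(complete_to_ultimate(triangle, factors)) - paid_to_date)
    return reserves

# hypothetical cumulative paid-loss triangle ($M); rows are accident years
triangle = [
    [100.0, 150.0, 170.0, 175.0],  # oldest year, treated as fully developed
    [110.0, 168.0, 188.0],
    [120.0, 175.0],
    [130.0],
]
sims = sorted(bootstrap_reserves(triangle))
print(f"point estimate ~{statistics.mean(sims):.1f}, "
      f"5th-95th percentile {sims[50]:.1f} to {sims[950]:.1f}")
```

Because each simulated world carries a full set of projected development factors, the same resampling also yields variability around each future year's payments, not just around the ultimate reserve.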
MODELING TOOLS
While the IT department at OneAmerica provides the infrastructure, the tool set, and the expertise around the product selection and implications of particular directions, the actual running of the predictive models and their configuration is done by the actuarial community, according to Kevin Weston, vice president of IT planning for OneAmerica.
“As the expectations have been raised from a SOX or MAR [Model Audit Rule] perspective about discipline and controls, particularly related to change management, we have been engaged in conversations with the actuarial community about how we can leverage and adapt some of the processes for IT,” says Weston.
The models developed for OneAmerica are used mostly in the carrier's GAAP accounting where the carrier has to use its own estimates for future policy behavior, future investment returns, and future equity returns, according to Brentlinger.
Statutory accounting also is evolving toward more predictive models with principle-based reserves, adds Brentlinger. On the capital side, the OneAmerica actuaries calculate capital for annuity products, both fixed and variable, based on the carrier's own experience.
“The whole move to principle-based approaches is coming up with models that are based on our products, our assumptions, our underwriting standards, our investment policies and strategies–and not just using an average that is good for the industry but isn't tailored for our use,” says Brentlinger. “Using those models, particularly on statutory, capital, and even some of our GAAP reserves, is where we are running [analysis] based on a thousand different scenarios. When we run those models, we get a distribution of future surplus needs or future income amounts, and the distribution lets senior management assess the risk and the return we are getting from that product.”
In the old days, Brentlinger adds, a carrier would price a product and come up with an average return. “Today, actuaries and management are looking more at the tail of the distribution–the worst two percent of the scenarios–and looking to see what kind of losses can come from those tail events,” he says.
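The tail measure Brentlinger describes is commonly computed as a conditional tail expectation (CTE): the average outcome across the worst slice of the simulated distribution. A minimal sketch, with hypothetical surplus numbers standing in for real scenario output:

```python
import random
import statistics

def cte(outcomes, tail=0.02):
    """Conditional tail expectation: mean of the worst `tail` fraction of outcomes."""
    k = max(1, int(len(outcomes) * tail))
    return statistics.mean(sorted(outcomes)[:k])

# hypothetical: 1,000 simulated end-of-horizon surplus outcomes ($M)
rng = random.Random(0)
surplus = [rng.gauss(10.0, 25.0) for _ in range(1000)]
print(f"average surplus {statistics.mean(surplus):.1f}, "
      f"CTE at 2% (worst 20 scenarios) {cte(surplus):.1f}")
```

The contrast mirrors Brentlinger's point: the average tells you the expected return on a product, while the CTE tells you what the worst two percent of worlds cost.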
Most of the mathematical theories insurance carriers are using today with predictive modeling existed back in the 1970s, explains Brentlinger. “Technology finally has caught up, so we can put those theories into practice,” he says.
SEE THE FUTURE?
Predictive modeling is becoming an increasingly powerful and popular tool, McLaughlin believes, but there are other sophisticated techniques used directly in the reserving process to make the best use of limited data. “Your data is not able to predict the future,” he says. “You can make forecasts or estimates, but you can't be certain of the future.”
Actuaries have a large number of sophisticated mathematical models that will look at past patterns and attempt to estimate future losses, but McLaughlin hesitates to call that predictive modeling. “With predictive modeling, there is a lot of work being done to analyze claims when they have been incurred and look at the more sophisticated ways of managing those claims,” he says. “You would think that would have an effect on the reserving process, but I would say there is not a direct connection at this point in time.”
Otto contends analytics allows insurers to get a better point estimate of where the carrier thinks its reserves are going to land. “Predictive modeling is a good tool for dealing with a type of business that is going through a lot of changes. [But with] most actuarial methodologies–when the traditional type of reserving methodologies are applied–the one thing those methodologies have difficulty dealing with is change,” says Otto.
Most traditional methodologies assume consistency with the past, he explains. When an insurer goes through change, it is often because new claims-handling people have come in or a new tort-reform measure affects the way cases are adjudicated through the court system. “When there is any kind of change like that, the traditional methodologies will have difficulties,” says Otto. “You still can use them, but companies have gone to predictive modeling to try to deal with those particular lines of business or the things within their company that are causing a problem and try to get a better estimate.”
Stochastic reserving methodologies, which are used to come up with a range of estimates as opposed to a certain point, also are receiving a great deal of attention from insurers on the property/casualty side, indicates Otto. “Most actuarial methodologies come up with a single number, and what companies are moving to are methodologies that not only come up with a point estimate but a range about that point,” he says.
REGULATORY COMPLIANCE
Regulators are struggling with how to deal with the models for establishing reserves, according to Brentlinger. “In the old system for a reserve, you might have a base amount times a factor, and that equals the reserve,” he says. “Now, you are running a model over a thousand different scenarios to calculate a reserve. It's just a more robust system.”
The problem for regulators is they have to assess how good a company's assumptions are and how well backed [the assumptions] are by experience. “In modeling, though, there is an engine that does calculations, assumptions go into that model, the business goes in, and out comes your projections,” says Brentlinger. “It's just a different way they have to regulate.”
Rating agencies also have to make an assessment of how strong and robust the models are, adds Brentlinger. “Having the controls around the model is a big industry issue right now and one we are addressing,” he says. “Even as chief actuary, when I get the results of a stochastically generated model based on a thousand scenarios and try to compare that with last year's results, I need to see how I can do an attribution analysis so I can walk through the change in those values. It's much more complex when you are dealing with models based on a thousand scenarios vs. an old factor-based model.”
Brentlinger relates he has spoken with regulators from one state with a large insurance department who have admitted they are struggling with the changes. “How can they sign off and say an actuarial model that's run over a thousand scenarios is a good model?” he asks. “You can't totally rely on the company because there are a lot of different assumptions going into the models. It's an issue with the regulators and also an issue with our internal audit department. We are trying to train [auditors] so they can do their job and sign off on our models. There is going to be more of a focus on better documentation, both on the assumption side and the process side, to make the whole process SOX-like.”
COLLABORATIVE EFFORT
Many of the actuarial models Brentlinger's company uses are for pricing and risk management, he notes. “We'll take these models and project out the income statement and the balance sheet over the next 20 years with a thousand to ten thousand different equity scenarios,” he says. “That's what adds to the computation time. Those [projections] can take days or even weeks.”
The path to success, advises Brentlinger, is for actuarial departments to work with IT departments early in the process. “We've learned to partner with them as soon as possible and get them as part of our working team,” he says.
Weston maintains collaboration is a high point of his department's working relationship with Brentlinger's actuarial staff. “You have bright and technologically adept people in actuarial, and in the past, it's been reasonable they could do the configuration running the models, but as the demands have increased, it has exceeded the ability of any particular group to come up with the right solution for the organization,” he says. “Perhaps through necessity, we've developed a good relationship to where we sit down at the table early on. Situations exist where we need to react and respond as quickly as we can, but I think we've made the most progress in having a relationship where we talk about the upcoming needs and come up with a solution that best maps to it.”
It also should be the objective of every actuary to work closely with the underwriting and claims people, points out Ron Swanstrom, senior consultant with EMB. “We can look at the numbers all we want, but if we don't understand what's going on with the business units, [the numbers] are really not worth anything,” he says.
Collaboration within the business units of insurance companies was not good in the past, asserts McLaughlin. “The actuaries or the underwriting department or the finance function in many cases would work separately,” he says. “The actuaries felt they had enough data on losses to come up with the loss reserves, so why would they need to talk with anyone else about that? Likewise the underwriters would say they can look at experience as it emerges and price new business adequately. The finance people would say they know what the assets were worth, so why would they talk to anyone else about that? The silos were strong in the past.”
It wasn't until the science of enterprise risk management became more widely applied that the silos began to disappear, continues McLaughlin. “It became clear risks in one part of the operation of an insurance company or any enterprise can affect other parts of the operation and the whole organization in unpredictable ways,” he says. “You can be affected by losses in different ways. With investments, some of the big names that failed [in 2008] really affected all companies. The greater realization is collaboration is vitally necessary.”
Collaboration is best accomplished where there is a culture of risk awareness, emphasizes McLaughlin. “I would say that is becoming very common across the insurance industry where senior leadership–and this goes even to the board level–is aware of that need for collaboration and a holistic look at operations,” he says. “The practice varies. Smaller companies may not be as far along. I spoke with one company recently where it responded to my questions that each silo was strong and it did not see the immediate need for these groups to be talking to each other.”
COMPUTER RESOURCES
Many insurers are moving toward more statistically based approaches, Swanstrom maintains. In the past, insurers may not have incorporated as much statistical analysis as they do now, but he feels increased computing power has helped enable that change.
“In the past, if we wanted to deal with a large number of observations, there was only so much you could do in the time it took,” says Swanstrom. “With the improvements in technology, we now can take a huge number of claims in a database and get further insight.”
Some of the techniques being used by actuaries today are pushing computer resources to their limit, Otto observes. The mathematics behind predictive modeling techniques is not particularly new, but until recently, no one had the computing power to pull it off.
“Now, with the speed of computers, people can generate programs to handle this,” he says.
Bootstrapping requires thousands of simulations to be run. Ten years ago, that was difficult to do. Today, with a laptop computer, insurers can run these thousands of simulations in a matter of seconds to generate the results, remarks Otto.
“The mathematics is not really any kind of secret sauce one company knows and other companies don't, but there is a challenge, even with the speed of today's computers, to code something to handle it,” says Otto. “A lot of actuaries still are doing their reserving estimates in Microsoft Excel. You just aren't going to be able to do bootstrapping in Excel because it's not the right program to run 100,000 simulations and combine that with another line of business that has 100,000 simulations. It's difficult to manage if you try to do it with spreadsheet-type software.”
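Otto's point about combining lines of business is worth making concrete: percentiles cannot simply be added across lines, so the individual simulations have to be paired up, which is exactly the kind of bookkeeping a spreadsheet handles poorly at 100,000 rows per line. A small illustration, with hypothetical lognormal reserve outcomes for two independent lines:

```python
import random

rng = random.Random(1)
N = 100_000
# hypothetical simulated reserve outcomes ($M) for two independent lines
line_a = [rng.lognormvariate(4.0, 0.3) for _ in range(N)]
line_b = [rng.lognormvariate(3.5, 0.5) for _ in range(N)]

# combine simulation by simulation so the full joint distribution is kept
total = sorted(a + b for a, b in zip(line_a, line_b))

def p95(xs):
    """Empirical 95th percentile."""
    return sorted(xs)[int(0.95 * len(xs))]

print(f"95th pct, line A: {p95(line_a):.1f}  line B: {p95(line_b):.1f}")
print(f"95th pct, combined: {total[int(0.95 * N)]:.1f}")
```

Because the two lines rarely go bad in the same simulated world, the combined 95th percentile comes out below the sum of the individual ones, which is the diversification effect a percentile-addition shortcut would miss.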
One area where Otto believes insurance carriers are more advanced than other industries is in maintaining data. “You find most businesses don't have nearly the amount of data insurance companies capture,” he says.
The problem carriers face is even though the speed of computers has gone up tremendously every year, insurers are throwing more and more data at the programs. “Is there a point in time when speed is no longer an issue?” Otto asks. “Every time I think that, either a new technique or additional data comes along and brings the machines to their knees.”
There have been some projects Otto has worked on recently where one calculation took 20 or 30 minutes and there were hundreds of calculations to do a complete analysis, he says. “Even today, the industry is struggling with speed. I suspect it probably will for a while. As speeds get greater, people are throwing more things at it.”