Like many actuaries, I’ve spent a large part of my career designing and building—or using the results of—complicated models. Current developments in the regulatory and rating agency realms, as well as advances in technology, ensure that models will continue to play an important role in the P&C insurance industry. This could and should be very positive for the industry, but if we are not careful, it could perpetuate counterproductive behaviors. It is entirely possible that, as the use of models increases, so too will uncritical acceptance of the results they produce. This would be unfortunate, as models are most useful when users understand their proper role and limitations, maintain an appropriate level of skepticism toward them, and view their results as only one of many inputs to final decisions.
Models are general tools, not surgical instruments. The most useful ones do not provide answers or solve problems; they provide information. No model can contemplate every possibility, nor should anyone expect it to. Many unlikely events could happen and should be prepared for, but building them into a model requires estimates of their likelihood, their potential severity and the probability that they occur in tandem. Some unlikely events simply cannot be modeled. Before coming to any conclusions, users of model results should understand what the model in question does (and does not) include, recognize that it may well be missing something, attempt to identify the missing pieces and devise a way to incorporate them into the decision-making process.
It also is important to understand the underlying assumptions and how sensitive the model is to them. All models contain assumptions, which may or may not be correct, or which are based on sparse data; perhaps more importantly, the underlying assumptions may be reasonable in some circumstances but not in others, particularly in the extremities of the tail. To put it bluntly, one must understand a model before using its results as a basis for decision-making. This maxim seems obvious but is not always followed, given the complexities of model capabilities, workings and limitations, which only a small handful of executives may fully understand.
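To make the point concrete, here is a minimal sketch of a sensitivity test. The model and its parameters are purely illustrative (a lognormal severity assumption with hypothetical values): holding everything else fixed, nudge one assumption and watch how much a tail percentile moves.

```python
import numpy as np

# Hypothetical sensitivity check: how does a tail percentile of a simple
# lognormal loss model respond to modest changes in one assumption (sigma)?
rng = np.random.default_rng(42)

def tail_percentile(sigma, mu=10.0, n=200_000, q=99.5):
    """Simulate lognormal losses and return the q-th percentile."""
    losses = rng.lognormal(mean=mu, sigma=sigma, size=n)
    return np.percentile(losses, q)

for sigma in (0.8, 1.0, 1.2):  # roughly +/-20% around a baseline assumption
    p = tail_percentile(sigma)
    print(f"sigma={sigma:.1f}  99.5th percentile ~ {p:,.0f}")
```

In this toy setup, the 99.5th percentile moves far more than proportionally with the volatility assumption, which is exactly why assumptions that seem innocuous in the body of a distribution deserve extra scrutiny in the tail.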
For example, one very important modeling issue of which decision-makers may be unaware is the distinction between “classical” probabilities and “subjective” probabilities. Classical probabilities are those that can be derived with some level of confidence through observation and experimentation, such as flipping a coin or administering a treatment protocol to a control group. However, property-catastrophe models, economic-capital models and many other widely used P&C models concern themselves with things such as claims propensities, hurricane landfalls, loss-reserve development and future interest-rate movements—all of which are subjective. These are areas in which we cannot perform experiments and can base estimates of the probabilities in question only on a combination of observations about the past and informed assumptions about the future. When reviewing model results, one needs to be aware of this important distinction and understand just what a model means when it says there is an X percent chance of a certain event happening within a particular time period. Unfortunately, many model users (and even builders) do not understand this distinction; they interpret the results of models that rely on subjective probability assumptions as facts instead of indicators.
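The distinction can be illustrated with a short, hypothetical sketch: a classical probability, such as a fair coin's chance of heads, can be pinned down by repeating the experiment, because the empirical frequency converges as trials accumulate. No analogous experiment exists for a hurricane landfall rate or future interest-rate path, so those probabilities rest on judgment.

```python
import numpy as np

# A "classical" probability can be pinned down by repeating an experiment:
# the empirical frequency of heads converges toward the true value (0.5).
# A "subjective" probability -- e.g. a hurricane landfall rate -- has no
# repeatable experiment, so no analogous convergence is available.
rng = np.random.default_rng(0)

for n in (100, 10_000, 1_000_000):
    flips = rng.integers(0, 2, size=n)  # simulate n fair-coin flips
    print(f"n={n:>9,}  empirical P(heads) = {flips.mean():.4f}")
```

The convergence on display here is precisely what is unavailable for subjective probabilities, which is why an "X percent chance" from a catastrophe model carries a very different epistemic weight than the same figure for a coin flip.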
Last, but certainly not least, one must be highly skeptical of models whose results claim high degrees of precision (what is often called “delusional exactitude”).
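As an illustration of why such precision is delusional, the sketch below (using hypothetical lognormal data) bootstraps a 99.5th-percentile estimate. Even when the model form is exactly right, the sampling spread alone shows that quoting the figure to many significant digits overstates what the model can know.

```python
import numpy as np

# Illustrative only: even with a *correct* model, a tail estimate carries wide
# sampling error, so quoting it to many significant figures is false precision.
rng = np.random.default_rng(1)
observed = rng.lognormal(mean=10.0, sigma=1.0, size=5_000)  # one "observed" dataset

# Bootstrap the 99.5th-percentile estimate to see its spread.
estimates = [
    np.percentile(rng.choice(observed, size=observed.size, replace=True), 99.5)
    for _ in range(500)
]
lo, hi = np.percentile(estimates, [5, 95])
print(f"99.5th percentile estimate ranges roughly {lo:,.0f} to {hi:,.0f}")
```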
In summary, users must resist the temptation to accept model output as facts—especially if they don’t know the model’s key assumptions and/or the motivation for its construction. Models can be extremely useful tools, but they can never replace an educated, thoughtful human brain. In Warren Buffett’s words, “beware of geeks bearing formulas.”
Paul Delbridge contributed to this article.