Reminiscent of the scene in Alfred Hitchcock's “The Birds,” when the sky turns black with attacking avians, I often feel as if I'm being attacked by throngs of bad statistics when I read today's insurance and business press. “The Birds” was just a movie, but bad statistics can draw real blood. Bad statistics can lead to bad strategies and bad decisions. For proof, look no further than our economic malaise. While some people rely on statistics, others avoid statistics at all costs. Some people believe Benjamin Disraeli's famous saying, “There are three kinds of lies: lies, damned lies and statistics.” I would qualify that to say bad statistics are damned lies, while good statistics are key to intelligent management and a huge competitive advantage. This makes it essential to discern between the two. Sometimes advanced education is required to separate the good from the bad, but usually common sense will suffice. Here are some examples.

Check the data source
I recently examined a PowerPoint from an agency consultant. I was jealous of his beautiful graphs, which were so artistically crafted. After looking more closely, though, I realized they were just art. There was no real data behind them. The consultant had started with how he wanted the charts to look, then created the data that would generate his desired charts. He had not gathered any real data.

The lesson: Always check the data source. The best data is random. If it's not random, suspect it. Be particularly cautious if the organization that created the study is also selling something, because its study is more likely to be biased.

Beware of averages
Studies presenting only averages are of little use because the information regarding what creates better or worse performance is absent, as is variance. An average, unlike the median, can be greatly skewed by just a few extreme scores. And even if you see averages showing the numbers for the top performers or worst performers, that data often is meaningless because the numbers are still just averages. Many factors affect averages; without more information, how do you know what to do about it?
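The skew point is easy to see with a quick calculation. The producer book sizes below are invented purely for illustration; the pattern, not the numbers, is what matters. One outsized book pulls the average far above what the typical producer actually writes, while the median barely moves.

```python
from statistics import mean, median

# Hypothetical producer book sizes (annual commissions, in dollars).
books = [180_000, 210_000, 240_000, 260_000, 300_000]

# The same group after one star producer with an outsized book is added.
books_with_outlier = books + [2_000_000]

print(mean(books))                # 238000 -- typical of the group
print(mean(books_with_outlier))   # ~531667 -- dragged up by one book
print(median(books))              # 240000
print(median(books_with_outlier)) # 250000 -- barely moves
```

The "average producer" in the second sample writes over half a million dollars, yet five of the six producers write $300,000 or less. That is why an unaccompanied average tells you almost nothing about cause and effect.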

For example, an agency was comparing its producers with a study showing the average producer's book was $300,000. Was its $250,000 producer a poor performer? Possibly, except she made $250,000 in half the time. How about another producer with $350,000? Is he great? Perhaps, except his book was given to him. Possibly the biggest problem with averages is the general assumption that every statistic can be plotted along a normal curve. Some sophisticated users exacerbate this mistake by assuming normal levels of confidence and variation at either tail. The truth is not all curves are normal, and in reality, especially in financial markets, the extreme tails do not always behave the way a normal curve predicts.

The lesson: When a study shows only averages, very little weight should be placed on the results if one is trying to determine cause and effect.

Context is critical
Changing the context enables people to purposely mislead others. For example, if I wanted to pump up an agency owner, I could expound on how great his agency's 88 percent retention rate was, without mentioning that most of his competitors are doing even better. At 88 percent retention, the reality is the agency likely is doing something wrong. Another great example is, “Our producers are awesome! They each wrote more than $200,000 in new commissions last year!” Are the producers new or established? Is this a small or large agency? Does this include program business? And most importantly, what difference does it make how much new business the agency writes if all of it goes out the back door? I've met awesome new-business producers who could write the equivalent of 30 to 40 percent of their books in new business each and every year. Of course, their retention rates were around 65 percent. New business only counts if it is retained. These are important issues that greatly affect the terrific-sounding “$200,000 in new business” statistic.

The lesson: Check the context of the statistic. Do factors exist that might mitigate its usefulness?

Mixing & matching
Be cautious of studies that compare apples to oranges. A common and very misleading mismatch is to use EBITDA (earnings before interest, taxes, depreciation and amortization) to compare companies growing organically with those growing by acquisition. The problem is that while EBITDA excludes almost the entire cost of acquisition growth, it does not exclude the cost of organic growth. I do not have space to go into all the details here, but suffice it to say that when comparing the two types of growth using EBITDA as a measure, the acquiring firm's EBITDA is virtually guaranteed to look hugely better. The key is to look at cash flow. When firms grow organically, their cash flow and profitability usually are very close to the same. When a firm grows by acquisition, cash flow often is much less than profits. But because too many people do not understand the implications of EBITDA versus cash flow, a lot of bad acquisitions, growth decisions and even loans have been made.
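The EBITDA mismatch can be sketched with a deliberately simplified, entirely hypothetical comparison. Assume two agencies each add $1,000,000 of revenue: the organic grower pays for its growth through producer compensation (an operating expense, which reduces EBITDA), while the acquirer pays for its growth through interest and principal on acquisition debt plus amortization of the purchased book, costs that EBITDA ignores or adds back.

```python
# All figures are invented for illustration only.

def ebitda(revenue, operating_expense):
    # EBITDA ignores interest, taxes, depreciation and amortization.
    return revenue - operating_expense

def cash_flow(revenue, operating_expense, interest, principal_paid):
    # Cash actually left over after servicing the acquisition debt.
    return revenue - operating_expense - interest - principal_paid

# Organic grower: $400,000 of producer pay and service staff; no debt.
org_ebitda = ebitda(1_000_000, 400_000)
org_cash = cash_flow(1_000_000, 400_000, 0, 0)

# Acquirer: only $250,000 of operating expense, but $150,000 of annual
# interest and $300,000 of principal repayment on the purchase debt.
acq_ebitda = ebitda(1_000_000, 250_000)
acq_cash = cash_flow(1_000_000, 250_000, 150_000, 300_000)

print(org_ebitda, org_cash)  # 600000 600000 -- EBITDA and cash match
print(acq_ebitda, acq_cash)  # 750000 300000 -- EBITDA flatters, cash lags
```

Measured by EBITDA, the acquirer looks 25 percent more profitable; measured by cash flow, it generates half as much. That gap is exactly why comparing the two growth strategies on EBITDA alone is apples to oranges.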
