Don't Believe All the Insurtech Studies
Susanne Sclafane, the editor of Carrier Management, wrote a great article many months ago about an interesting insurance study that turned out to be fake (https://www.carriermanagement.com/features/2021/08/27/225570.htm). This is a story well worth reading.
A follow-up on the same subject appeared in other, non-insurance publications. A particular carrier seems to have supplied the data, and the data was not just wrong; it was proven to be fake. Someone made it up out of thin air. Of course, no one remembers who provided the data, what happened to it, who was involved in collecting it, or even whether this was the original data.
The researchers were Gold Medal level analysts, and they completely missed the deception. One telling clue was that the fake data appeared in one font and the (possibly) real data in another. That is how crude the fakery was. Regardless, it was missed, and that oversight is important.
So much is being made of new insights into how to build carriers and distributors because "We now know how to make people drive more safely" or "work more safely" or whatever else the industry is promising to change. If the industry is changing, and spending millions of dollars, on the basis of fake data, then not only are time and money being wasted, so is opportunity. This particular piece of work reportedly involved a major insurance company, top researchers at major universities, and reportedly just under 7,000 actual drivers (before the fabricators began creating imaginary ones).
Fortunately, three other researchers at Data Colada discovered and then reported the fakery.
Besides the opportunity to share a gossipy piece of news, why report this here? First, the story needs to be spread as far and as wide as possible, both to make readers more aware and, with enough distribution, to deter others from cheating by the knowledge that Data Colada may catch them too.
Second, these kinds of studies happen all the time in the insurance industry, and sometimes the data is faked. Maybe somewhere someone has an actual study showing that when people buy multiple policies, retention increases. I argue this is correlation, not causation: I cannot find anyone who can provide proof of causation, and I know of two studies in which only correlation, not causation, was proved. The difference is critical when someone orders, "Go sell another policy so our retention increases," if selling more policies does not actually cause retention to increase.
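The correlation-versus-causation trap can be illustrated with simulated data. In the sketch below (plain Python; the numbers and the "engagement" confounder are invented for illustration, not taken from any actual study), engaged customers are more likely both to buy a second policy and to renew. Multi-policy customers then show markedly higher retention even though the extra policy causes nothing:

```python
import random

random.seed(42)

# Simulate 10,000 hypothetical customers. "Engagement" is a made-up,
# unobserved confounder: engaged customers are both more likely to buy a
# second policy AND more likely to renew, with no causal link between the two.
N = 10_000
customers = []
for _ in range(N):
    engaged = random.random() < 0.5
    multi_policy = random.random() < (0.7 if engaged else 0.2)
    retained = random.random() < (0.9 if engaged else 0.5)
    customers.append((engaged, multi_policy, retained))

def retention_rate(rows):
    return sum(retained for _, _, retained in rows) / len(rows)

multi = [c for c in customers if c[1]]
mono = [c for c in customers if not c[1]]

# Raw comparison: multi-policy customers look far "stickier" ...
print(retention_rate(multi), retention_rate(mono))

# ... but within each engagement group the retention gap disappears,
# because the extra policy never caused anything; engagement drove both.
for engaged_flag in (True, False):
    group = [c for c in customers if c[0] == engaged_flag]
    m = [c for c in group if c[1]]
    s = [c for c in group if not c[1]]
    print(engaged_flag, retention_rate(m), retention_rate(s))
```

Ordering "go sell another policy" in this world would raise policy counts without moving retention at all, which is exactly the risk of acting on correlation alone.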
Another example lasted for years because the authors had no idea their data made no sense; they were doing the math incorrectly. Their intentions were great, but their statistical knowledge was poor. One clue was that the totals in the study exceeded the actual totals by about 20%.
Another statistical mistake is not knowing when a normal (bell) curve should apply versus, for example, a Pareto curve. In the faked study, one of the clues was that the distribution was shaped like a box. Given the type of data being analyzed, it should have been bell-shaped. Even if the data had not been faked, a box-shaped curve should have been a great big clue.
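This kind of clue can be checked with simple arithmetic: in a bell curve, roughly 68% of observations fall within one standard deviation of the mean, while in a flat, box-shaped (uniform) distribution only about 58% do. A minimal sketch in plain Python, with mileage-like figures invented purely for illustration:

```python
import random
import statistics

random.seed(0)

def share_within_one_sd(values):
    """Fraction of observations within one standard deviation of the mean."""
    mu = statistics.fmean(values)
    sd = statistics.stdev(values)
    return sum(mu - sd <= v <= mu + sd for v in values) / len(values)

# Invented figures, purely for illustration.
bell_shaped = [random.gauss(12_000, 3_000) for _ in range(10_000)]  # plausible
box_shaped = [random.uniform(0, 50_000) for _ in range(10_000)]     # suspicious

print(share_within_one_sd(bell_shaped))  # roughly 0.68 for a bell curve
print(share_within_one_sd(box_shaped))   # roughly 0.58 for a "box"
```

A reviewer who ran even this crude check on data that should cluster around a typical value would have seen the box shape immediately.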
Another clue to questionable data is the simple application of averages and quartiles without context. In this industry, averages and quartiles are meaningful only in quite specific instances. Executives who do not know what those instances are make very bad decisions, and huge opportunities are missed for not knowing the right applications. The way carriers design contingencies is an excellent example of this poor understanding.
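One common version of this mistake is quoting an average for heavily skewed data, where a handful of large accounts drag the mean far above anything typical. A hypothetical sketch (the book of business is simulated with a lognormal shape as a stand-in for skewed account sizes, not drawn from real figures):

```python
import random
import statistics

random.seed(7)

# Hypothetical skewed book of business: many small accounts plus a few
# very large ones. Lognormal is used only to get that shape.
revenues = [random.lognormvariate(8, 1) for _ in range(10_000)]

mean = statistics.fmean(revenues)
median = statistics.median(revenues)

# The "average account" substantially overstates the typical account,
# so any decision keyed to the mean misreads most of the book.
print(round(mean), round(median))
```

For data shaped like this, a mean without the accompanying distribution tells an executive almost nothing about the typical account.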
Firms like Data Colada can't check all these reports. I analyze and check what I can for my clients, and we have used some of those findings to our clients' great benefit. That said, before you begin spending money, or before you change your strategy because you saw some great presentation, verify that what you're seeing is reality and not fakery.
NOTE: The information provided herein is intended for educational and informational purposes only and it represents only the views of the authors. It is not a recommendation that a particular course of action be followed. Burand & Associates, LLC and Chris Burand assume, and will have, no responsibility for liability or damage which may result from the use of any of this information.
None of the materials in this article should be construed as offering legal advice, and the specific advice of legal counsel is recommended before acting on any matter discussed in this article. Regulated individuals/entities should also ensure that they comply with all applicable laws, rules, and regulations.