Mark Twain’s cautionary quote on the misuse of statistics seems particularly appropriate at a time of year when loss prevention professionals approach the “inventory season.” Twain warned against the use of numbers to convince, to bolster arguments, to support a position, and to provide “evidence” of the “rightness” of an opinion. Twain viewed such use as a “deception” of the worst kind. At the time, Twain’s comments were directed at politicians and their misuse of statistically supported, but questionable, “facts.” Regardless of the original intent, the warning remains relevant to retail loss prevention, and caution should be applied to the use of statistics to support our goals and objectives.
This article does not suggest that loss prevention professionals misuse statistics or misrepresent facts to serve their own purposes. Every industry relies on information gathered from research studies, statistical trends, and published reports. This information is used to evaluate, develop, and analyze programs and performance.
In loss prevention, statistics are valuable in the evaluation of investigative tools, audit development, exception analysis of POS data, and the other components of an effective loss prevention program. Statistics provide us with benchmarks to make better decisions on behalf of a retailer. The cautionary tale, however, is that we must remember that an apple is not always an apple. In short, statistics are used to evaluate our results, and thus, in an age of information, we must refrain from searching for the “numbers” that best support the results we have or have not achieved.
My experience with the dangers of statistics was gained firsthand.
Several years ago I worked as the vice president of loss prevention for a national men’s apparel retailer. We operated over 400 specialty stores nationwide, and the team had worked hard to reduce shrink. We benchmarked our results and highlighted our success through comparison to a single national study. The annual study reported year-end shrink statistics and broke out results by segment of the retail industry. It was, and still is, the primary benchmarking tool for retailers.
At the time, we had results worthy of our pride. Our 1.2% shrink was significantly better than that of our peers in men’s apparel, was better than the industry average, and certainly warranted sharing with the CFO (my boss). The CFO was pleased with the program, pleased with the results, and confident that the department had advanced in the correct direction. And then one year a new report was published.
That particular year, the study reported that the average for men’s apparel had improved to 1.2%. In a single number, the department’s results had slipped from outstanding to “average.” The CFO was no longer as confident and was disappointed that the industry numbers suggested our advancement had stalled. In his opinion, our shrink results demonstrated that program improvement was required. He questioned the focus of the loss prevention program, the capital budget, and the program’s contribution to earnings (or, more accurately, the missed opportunities).
The future of the department and its programs seemed less than certain. Loss prevention and finance began a painful, detailed review of the strategic approach, the program components, and the team’s performance. Although such a process is actually a good thing, at the time it did not feel that way. During the review, however, the assistant director of loss prevention uncovered a small but critically important and overlooked “fact”: in the year that we had slipped from “outstanding” to “average,” there was only a single participant in our retail category. We had been beaten by our own results!
And of course we triumphantly attempted to share this revelation with the CFO. It was too late; the damage was done, and our program’s credibility had suffered. It was very disappointing to have lost so much, so quickly, over one small oversight. In hindsight, it was not just the oversight that led to the trouble. True, greater care should have been taken in reviewing the information. The real problem was that we had relied so heavily on a single study to determine our success that we lost sight of what the results actually meant to our program. As long as that study placed us in the lead, we ignored our responsibility to evaluate not only our program, but also the very facts and statistics we used in the evaluation.
The study’s results and the organization responsible were as credible and reliable then as they are today. Our responsibility was to understand what those “facts” meant and to use caution in applying them directly to our internal evaluations. So it was not a “data” or an “information” problem; it was an end-user error. We mistakenly used a single study to validate our claim of effectiveness. We interpreted the information narrowly (as did our CFO when the number was no longer outstanding). We considered the significance of the sample size (the number of retailers in a category) only when that “fact” worked against us, and we applied the results to the larger industry without careful consideration of whether such claims could be made.
In the upcoming months as you review your own results, please keep this cautionary tale in mind as you validate what works and what does not. Statistics are great when they work for us, but overreliance can make changes in those statistics a bitter pill. Here are a few tips:
Use more than one source of research when benchmarking performance.
Rely more on research that is subject to peer review and is reported in academic journals.
Ensure that the process used to collect the data is consistent with rigorous scientific methods.
Ensure that the sample size is statistically meaningful.
These are just a few thoughts based on personal experience and lessons learned over a career in retail loss prevention.
Please share your comments, suggestions, and insights into this discussion.
What sources and studies do you use to research loss prevention related statistics?
How do you use statistics to best present your success to your executives?
Steven May, President