In previous blogs, you’ve heard us talk about GIGO, which stands for Garbage In, Garbage Out. Bad data input results in bad data output—plain and simple.
How important is it to turn bad data into good data? It’s getting more significant every year. So much so that we’ve been working with insurance brokers across Canada to see what types of data are valuable to them when it comes to forecasting the future of their commercial book.
So, where does one begin to turn bad data into good data? The first step is to look closely at your data entry processes. Here are a few things we’ve pinpointed to get you started.
1. Identify areas of data redundancy
Data entry is time-consuming. Reducing the amount of pointless data commercial lines staff are collecting benefits everyone. One of the best ways to do this is by regularly revising existing applications and questionnaires to ensure the requested data is relevant.
For example, are some of your applications doctored-up versions of personal lines forms, containing fields with little or no relevance to collecting quality commercial lines data?
Not only is that a time-waster for your staff, it’s also wasting the time of underwriters who need to sift through garbage data they don’t want or need. That is, if they haven’t already shuffled your submission to the bottom of the pile because they didn’t have time to sift through it at all.
2. Identify common data entry errors
A common data entry error we see when it comes to commercial lines involves IBC codes. IBC codes can be tricky, especially when the insured or prospect works in multiple lines of business or their line of business is not familiar to the data entry staff at your brokerage.
If the person doing data entry is pressed for time or reluctant to ask for help, the IBC code will be left blank or they’ll guess. Blanks and guesses are two of the biggest contributors to garbage in, garbage out data. No data skews the reports you generate, and wrong data may lead you to a wrong decision.
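To make this concrete, a brokerage could catch blanks and guesses at entry time with a simple validation pass. The sketch below is a minimal illustration, not a real integration: the record fields and the set of valid codes are hypothetical placeholders (actual IBC code lists come from the Insurance Bureau of Canada and your management system).

```python
# Hypothetical subset of valid IBC codes -- placeholder values, not real codes.
VALID_IBC_CODES = {"5213", "6411", "7379"}

def flag_bad_records(records):
    """Return records whose 'ibc_code' is blank or not in the known set."""
    flagged = []
    for record in records:
        code = (record.get("ibc_code") or "").strip()
        if not code or code not in VALID_IBC_CODES:
            flagged.append(record)
    return flagged

# Sample submissions with made-up insureds and codes.
submissions = [
    {"insured": "Acme Contracting", "ibc_code": "5213"},  # valid
    {"insured": "Beta Logistics", "ibc_code": ""},        # blank: skews reports
    {"insured": "Gamma Retail", "ibc_code": "9999"},      # guess: wrong data
]

for bad in flag_bad_records(submissions):
    print(f"Review needed: {bad['insured']}")
```

A check like this turns a silent blank or guess into a visible review task before the bad data ever reaches a report.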
3. Haste makes waste
While speed is an important part of any workflow or process, accuracy trumps speed every time when it comes to data entry.
Data entry is done by humans, and humans make mistakes. Moreover, humans who are whipping through a detail-oriented task like data entry are bound to make more mistakes than ones who have adequate time to get it right the first time.
Of course, the speed at which someone does data entry should be monitored, because dawdling is a separate issue, but the quality of your data, not the quantity, should always be your top priority.
Become more aware of any processes that seem to be slowing down your brokerage and find ways to improve them. No matter what you discover, any obstacle that prevents your data from being processed cleanly and accurately can cause major slowdowns, with significant impact to your bottom line.
Finally, just because a process or workflow has been done a certain way for years, maybe even decades, doesn’t mean it’s foolproof or free of redundancy. In fact, it may be one way that GIGO data trickles in without anyone noticing until the dam of bad data has burst.