Your Print MIS or ERP system holds the keys to your data empire. Reports, queries, BI tools, graphs and charts bring that data empire to life. There are so many high-value inputs being captured by the MIS/ERP, and your company can use those inputs to make strategic, proactive day-to-day business decisions. This corporate intelligence is often the competitive advantage that lets you act swiftly and move efficiently when it comes to securing deals, producing products, and ensuring customer satisfaction. One of the most common answers I get when I ask "why are you switching Print MIS systems?" is "we need better data". The reporting and query tools in today's Print MIS systems are better than they have ever been. If you're not taking advantage of them, you are missing a massive opportunity to have better business intelligence at your fingertips. The other big mistake? Not validating data after initial builds.
I have seen many examples of bad output from good input, all because of missed data validation: job jackets with incorrect stock, invoices with inaccurate subtotals, P&L statements with totals associated with the wrong account, and BI widgets displaying different datasets than intended. This is not meant to scare anyone away from using these tools. Quite the contrary, I am a massive advocate for building the custom datasets your business requires. But just as we want quality control on our printed products, you need quality control on your data products. Bad data quickly loses the trust of the user base and leads to workarounds and reporting outside of the system, which becomes very difficult to claw back.
What needs to be validated?
Generally speaking, any custom report or business intelligence (BI) widget should be validated for data integrity.
Here are some examples of when you need to validate data:
- Manually joining tables (as in a tool like Crystal Reports/i-Net Designer). Incorrect table joins are the number one mistake I see with Crystal Reports, yielding subtle or glaringly obvious data errors (see the sketch after this list).
- Formulas. These always need to be checked, as it is easy to think you are calculating one thing when you are actually calculating something else. Let's say you need to calculate a percentage comparing estimated to actual costs. It is no joke to say that you need to write the formula out as if you are answering a grade 9 math problem. Work it out manually to make sure your report formula calculates properly.
- Business Intelligence widgets. These can often look great but display the wrong information. If you have a chart showing revenue month after month, validate your data widget against the corresponding P&Ls for the same months.
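To make the join and formula pitfalls above concrete, here is a minimal sketch in Python/pandas using made-up job and cost-line tables (the table and column names are illustrative only, not any Print MIS vendor's schema). It shows how joining a job-level estimate straight onto line-level actuals silently inflates totals, and how working one reading of the estimated-to-actual percentage out by hand confirms what the report formula should return.

```python
# A minimal sketch with hypothetical tables: how a bad join inflates totals,
# and how to hand-check a percentage formula against the raw numbers.
import pandas as pd

# One row per job, with the estimated cost from the estimate.
jobs = pd.DataFrame({
    "job_id":   [101, 102],
    "est_cost": [500.00, 1200.00],
})

# Multiple actual-cost lines per job (stock, press, bindery, etc.).
cost_lines = pd.DataFrame({
    "job_id":      [101, 101, 102, 102, 102],
    "actual_cost": [300.00, 250.00, 400.00, 500.00, 350.00],
})

# WRONG: joining job-level estimates onto line-level actuals duplicates
# est_cost once per cost line, so the estimate total is inflated.
bad = jobs.merge(cost_lines, on="job_id")
print(bad["est_cost"].sum())     # 4600.0 -- not the real 1700.0 estimate total

# RIGHT: summarize actuals to one row per job before joining.
actuals = cost_lines.groupby("job_id", as_index=False)["actual_cost"].sum()
good = jobs.merge(actuals, on="job_id")
print(good["est_cost"].sum())    # 1700.0

# Formula check, worked out by hand like a grade 9 math problem:
# job 101 actuals are 300 + 250 = 550, so 550 / 500 * 100 = 110% of estimate.
good["pct_of_estimate"] = good["actual_cost"] / good["est_cost"] * 100
print(good)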
How can you validate?
- If you have a query tool within the Print MIS, you can use it as a great validation tool for the raw data within the report. As an example, if you are using eProductivity Software's Pace system, you can use PaceStation to validate the raw data of an i-Net Crystal report you are building. I highly recommend this when new i-Net users are starting to build reports. It is a great way to practice and ensure accurate data. Build your PaceStation query first and then mimic that initial dataset using i-Net.
- If you are implementing a new Print MIS or financial ERP, you should have a control set of reports from the legacy system that you will validate the new system against when you go live.
- For new reports, make sure you spot-check several records in detail for all data values on the new report. If your report or data widget is summarizing data, don't just assume the percentages are correct; validate the raw data that is used to compile those summaries (a sketch of that kind of cross-check follows below).
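As one way to do that raw-data cross-check, here is a small sketch in Python/pandas with made-up invoice lines and widget totals (again, the names are hypothetical, not a real MIS export): recompute each month's revenue from the raw records and flag any month where the summary widget disagrees.

```python
# A minimal sketch with hypothetical data: spot-checking a summary widget
# against the raw records it is supposed to be built from.
import pandas as pd

# Raw invoice lines pulled with the MIS query tool (or exported to CSV).
raw = pd.DataFrame({
    "invoice_id": [1, 1, 2, 3, 3],
    "month":      ["2024-01", "2024-01", "2024-01", "2024-02", "2024-02"],
    "amount":     [150.00, 50.00, 300.00, 200.00, 125.00],
})

# What the BI widget claims for the same months.
widget = pd.DataFrame({
    "month":   ["2024-01", "2024-02"],
    "revenue": [500.00, 300.00],
})

# Recompute the totals from the raw lines and compare them to the widget.
check = (
    raw.groupby("month", as_index=False)["amount"].sum()
       .merge(widget, on="month", how="outer")
)
check["difference"] = check["amount"] - check["revenue"]
print(check[check["difference"].abs() > 0.01])   # months that need investigation
```

In this example 2024-01 matches, but 2024-02 totals 325.00 from the raw lines against 300.00 on the widget, which is exactly the kind of discrepancy a spot-check should surface before the report is released.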
Who should validate?
A mistake I encounter time and time again is having the author of the report or widget be the one to validate the data output. Don't get me wrong, it is the author's responsibility to validate all of the technical aspects of the output. However, a subject matter expert (SME) needs to be the one to review the output. You cannot expect a technical writer to understand how a report reads from an SME's perspective. As an example, if writing a financial report, the accounting person in charge of that data should review it. For a KPI report, the leader of operations should review, and so on. The SME who is the consumer of the data needs to be the one to help QC the report build. Often what they ask for is not what they intended the output to be. There are terminology challenges between SMEs and technical data writers, and these often manifest in data output that is completely different from what the SME is looking for.
Do you have to keep validating?
Once all of your data has been validated and the report or widget has been signed off by the SME and the technical writer, it can be released to the user base. At that point, there isn't a need to keep validating. However, if an upgrade impacts any of the fields or tables the report relies on, the data output should be validated again.
Setting up a practice of checks and balances when authoring new reports or widgets will ensure your evolving data artifacts are accurate, reliable and trusted by the user base. It helps to build value within the Print MIS or ERP and prevents erosion into Excel for reporting, which is labor intensive and relies on specific human capital each time new output is needed.