One of the most common pain points I see with our clients is that they have bad, incomplete, or otherwise untrustworthy data, and despite that, they keep reporting on it anyway. In the news recently, it was announced that Nielsen had been reporting on over seven months of bad data! This isn't an abnormal occurrence; 86% of companies admitted that their data might be inaccurate in some way. There are a multitude of reasons why this is terrible; chief among them, you can't make accurate, impactful decisions based on data that you can't trust!
Every time I hear this, I go back to the old adage: garbage in, garbage out. If you know that your data is compromised, any insights and recommendations you make are at best useless and at worst misleading and dangerous.
As if that weren't bad enough, the implications don't stop there. Even if you're analyzing data in a vacuum, providing shiny pebbles of information to a mildly interested audience, you're still risking a reckoning from your constituents once you get everything fixed and they realize that what they've been seeing is not, in fact, the real deal. Having worked with a number of companies in this situation, I've found the driving force behind the poor decision-making is usually obvious: political pressure to deliver something, anything, that resembles an analysis.
One of the biggest risks they face when they finally do get their data in order is that everyone is furious that the data they've been seeing doesn't match the new data, even though they were the ones who pressured the analytics team to deliver it! What's one to do in this situation?
STOP THE PRESSES!
If you have a report that contains bad data, pull the bad data out of it. If it's just a section of the report, you're best served to remove the data and, if necessary, add a caveat explaining why it's gone. Even including erroneous data in your charts, tables, and analysis implies that it's fit for consumption, and that is clearly not the case.
If your entire report is replete with bad data, you find yourself in the unenviable position of scrapping everything you have and either adjusting your dataset or timeframe, or not delivering anything at all. Obviously, the first option is substantially better than the second, so I suggest you do whatever you can to find something to analyze that will be useful to your stakeholders, even if it doesn't necessarily jibe with what you've delivered up to this point. Some alternate datasets you can use to aid in analysis while your analytics data undergoes a total makeover include:
Voice of Customer data | Anyone with access to VoC data knows that this is an absolute gold mine – chances are, if you’re not reporting on it now and you add it in as a one-off, you’ll be delivering this analysis forever because it is just so interesting. If you don’t have a VoC tool like OpinionLab at your disposal, you can add a free tool like 4Q by iPerceptions.
Heat map data | Even if your analytics data is bunk, you might be able to leverage data from an external heat mapping tool like CrazyEgg or ClickTale. Both offer inexpensive options that can tide you over until you have something you can really analyze.
Usability test data | If you can’t drag in a few willing participants to do a real usability test, you can use a site like UserTesting.com to get results from real users in a very short period of time, allowing you to add value with very little time spent.
Testing and optimization | Test results don't depend on your analytics implementation being accurate. If you're overhauling your implementation, consider running a few tests to gain real-time insights that can be applied to the site right away.
Alternate analytics data | If you’re like many of my clients, you double-tag your site with your paid analytics tool and Google Analytics Standard. Even if your other implementation is a mess, you might still be able to use some of the data from Google Analytics in the interim.
Change your timeframe | Do you only have a week of good data? Then use just that week and find a different way to provide context. Under no circumstances should you compare good data to bad data; it will only raise more questions than it answers.
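If your analytics tool lets you export raw daily data, trimming the known-bad window out before you report is straightforward. Here's a minimal sketch in Python with pandas; the figures, dates, and the bad-data window are all hypothetical, just to illustrate excluding the compromised period rather than reporting on it alongside good data:

```python
import pandas as pd

# Hypothetical daily sessions export; in practice this would come from
# your analytics tool's raw data export.
data = pd.DataFrame({
    "date": pd.date_range("2015-01-01", periods=14, freq="D"),
    "sessions": [1200, 1150, 1300, 900, 40, 35, 30, 28,
                 1250, 1280, 1190, 1220, 1310, 1275],
})

# Known-bad window (say, a broken tracking tag): drop it entirely
# instead of letting it skew averages and trends.
bad_start = pd.Timestamp("2015-01-04")
bad_end = pd.Timestamp("2015-01-08")
clean = data[(data["date"] < bad_start) | (data["date"] > bad_end)]

print(len(clean))  # rows remaining after dropping the bad window -> 9
```

The point is simply that the report is built only from `clean`; the bad window is gone, not averaged in, and the caveat in the report explains the gap.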
Data you can take to the bank.
There is absolutely no doubt that bad data hamstrings your ability to make real decisions. A recent study found that three-quarters of businesses waste an average of 14% of their revenue due to poor data quality. The issues don't stop at revenue, though; you also want to be sure not to damage your credibility while you're repairing your reporting. Hopefully, with the suggestions above, you can save the day while providing real insights instead of just something to pacify your report consumers.
Have some other tips for what to do when your data’s a mess and you need to produce an analysis? Hit the comments!
Source: Global Research report, Experian, 2014