This month’s topic can affect the long-term quality and accuracy of data in any web analytics program, regardless of whether it is software- or ASP-based and independent of the data collection method being leveraged (log files, page tags, network, or hybrid). The term I like to use to describe this phenomenon is Data Quality Erosion, which I have defined below:
“Data Quality Erosion is the decay in the quality of a web analytics data set when the web analytics implementation and the website are not proactively managed.”
In the remainder of this newsletter, I will address the primary causes and effects of Data Quality Erosion, along with specific strategies to keep it from affecting your web analytics data.
Primary Causes / Effects
Filtering requirements can change over time. For example, the internal IP ranges used to exclude employee traffic may shift when an office network is reconfigured, leaving stale filters in place and letting internal traffic leak into your reports.
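To make this concrete, here is a minimal sketch of an automated check comparing the IP filters configured in an analytics tool against the ranges the office actually uses today. The specific CIDR ranges are illustrative placeholders, not values from any real implementation:

```python
import ipaddress

# Hypothetical example: internal-traffic filters configured months ago,
# compared against the IP ranges IT actually uses today.
configured_filters = ["192.168.10.0/24", "10.0.0.0/16"]   # ranges excluded by the analytics tool
current_office_ranges = ["10.0.0.0/16", "172.16.5.0/24"]  # ranges in use on the network now

configured = {ipaddress.ip_network(c) for c in configured_filters}
current = {ipaddress.ip_network(c) for c in current_office_ranges}

stale = configured - current      # filters that no longer match any real internal range
uncovered = current - configured  # internal traffic now leaking into reports unfiltered

print("Stale filters:", sorted(map(str, stale)))
print("Unfiltered internal ranges:", sorted(map(str, uncovered)))
```

Running a comparison like this on a schedule surfaces filter drift before it erodes months of data.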
Best Practices to Prevent Data Quality Erosion
Like many best practices within the web analytics industry, the most effective way to minimize the potential for Data Quality Erosion is to preempt it with a proactive maintenance program for your web analytics implementation. As part of this process, you should plan to regularly conduct the following:
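One routine maintenance task that lends itself to automation is verifying that every page template still carries the analytics page tag. The sketch below assumes a hypothetical tag URL (`analytics.example.com/tag.js`) purely for illustration; a real audit would match your vendor’s actual snippet and crawl live pages rather than in-memory strings:

```python
import re

# Hypothetical page-tag audit. The tag pattern is an illustrative
# placeholder, not a real vendor snippet.
TAG_PATTERN = re.compile(r'src="https://analytics\.example\.com/tag\.js"')

def audit_pages(pages):
    """Return the names of pages missing the analytics page tag."""
    return [name for name, html in pages.items() if not TAG_PATTERN.search(html)]

pages = {
    "home.html": '<html><script src="https://analytics.example.com/tag.js"></script></html>',
    "new-landing.html": "<html><body>Launched without an analytics review</body></html>",
}

print("Pages missing the tag:", audit_pages(pages))
```

Pages launched outside the normal review process are exactly where untagged content tends to appear, so a scheduled audit like this catches gaps early.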
In addition to creating a proactive maintenance program for your web analytics system, it is critical to integrate the appropriate business processes into your organization's web development workflow, so that web analytics requirements and the corresponding data collection are captured as part of every new project.
By implementing the recommendations above, you can ensure the long-term quality of your web analytics data and focus your efforts on extracting value from it, rather than spending cycles explaining data issues to its consumers within your organization.
Bill Bruno is the CEO - North America, Ebiquity.