KEY FINDINGS FROM EXPERIAN DATA QUALITY 2014 GLOBAL RESEARCH REPORT
By Joel Curry
Q1) What were the major differences between the responses to last year’s survey and this year’s?
Overall, the report found that the average proportion of inaccurate data rose from 17% in 2013 to 22% this year.
This is obviously disappointing, but unsurprising when you consider that there was an increase in organisations manually examining data – working through databases line by line, rather than using the range of data quality tools available on the market – 54% this year compared to 27% in 2013. There was also a decline in the use of point-of-capture software tools. In my view, this is a real step backwards, indicating that companies are not protecting their data assets at the point of entry. Validating data at capture is a basic step in creating a corporate-wide data quality process that’s fit for purpose.
Q2) Were you surprised to see the error levels in data entry not improve since last year’s report?
Not really. The proliferation of channels that individuals now use to interact with organisations (now around 3.2 on average) represents a huge challenge for businesses seeking to create a single view of the customer. Despite the level of investment and discussion in channel-shift initiatives, the call centre remains a fundamental cornerstone of customer engagement. Although viewed as a necessary channel for collecting information from a customer (54%), call centres are also described as the ‘dirtiest’ (52%). This may be because, by their very nature, they rely on manual collection methods – with a significant risk of human error. This remains one of the biggest data quality challenges.
In my view, the test for any organisation will be how it designs a data quality process across numerous channels that manages the full life cycle of data assets from the point of entry. Given the growth in customer demand for a truly cross-channel experience, a co-ordinated approach to data capture across all channels will be critical for identifying the needs of customers and prospects and, as a result, delivering relevant and timely communications.
Q3) The report found that 99% of respondents have a data strategy in place. However, 94% also reported poor quality data. Where are businesses failing in their strategy?
Firstly, I am not totally convinced that 99% of companies have an effective data strategy in place. Furthermore, not enough businesses have a senior leader accountable for data. A Chief Data Officer is not a ‘nice to have’ – it is, in my view, a must have. It will become increasingly difficult for any business to achieve, or even maintain, a competitive edge without strong ownership of data within the organisation.
A Chief Data Officer will help to ensure that data quality is a strategic priority for the business, with proper investment made in the people and resources needed to support it. Additionally, businesses need to actually use the tools available for analysing, improving and monitoring data quality. Until we move away from doing this manually, we are not going to see any major improvements in the overall picture. Manual checking may seem cheaper in the short term, but the long-term impacts of poor-quality data are far-reaching and ultimately affect the top and bottom line.
For the financial services sector in particular, poor data quality has the potential to directly impact the time taken to realise revenue. If, for example, one of your main channels of communication with customers and prospects is email, it is essential to capture correct data from the outset in order to protect your sender reputation. A simple comparison between UK and US sender scores (50.75 and 66.93 respectively)* highlights the gulf in email deliverability, and the resulting ROI generation can be assumed to suffer accordingly. As a heavily regulated industry, financial services in particular needs to prioritise data improvement to remain compliant. Businesses need to deploy software tools to help them monitor, analyse and visualise data inaccuracy to avoid the pitfalls of poor data quality and management.
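To make the idea of protecting data at the point of entry concrete, here is a minimal sketch of a point-of-capture check for email addresses. It is an illustration only, not any specific vendor's tool: the function name and the simple syntactic pattern are assumptions, and a real capture tool would go further (domain/MX verification, confirmed opt-in).

```python
import re

# Basic syntactic pattern only; illustrative, not a full RFC 5322 validator.
EMAIL_PATTERN = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def capture_email(raw: str) -> str:
    """Normalise and validate an email address at the point of entry.

    Returns the cleaned address, or raises ValueError so the bad value
    never reaches the database in the first place.
    """
    candidate = raw.strip().lower()
    if not EMAIL_PATTERN.match(candidate):
        raise ValueError(f"Rejected at point of capture: {raw!r}")
    return candidate

# Clean data goes in; obvious errors are stopped at the door.
print(capture_email("  Jane.Doe@Example.com "))  # jane.doe@example.com
try:
    capture_email("jane.doe@example")  # missing top-level domain
except ValueError as err:
    print(err)
```

The point is architectural rather than the regex itself: validation runs once, at capture, instead of being repeated downstream by manual line-by-line review.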
Furthermore, organisations need to govern their data effectively. This includes deploying monitoring and visualisation tools that surface data concerns and enable deep dives into their root causes, so that the necessary improvements can be made.
Q4) What can businesses do to fix breakdowns in their data quality?
A good place to start is by taking stock – carry out a stringent review to get a true picture of the current state of your data and understand where you are. It is also pertinent to attach a financial value to those data inaccuracies, which will help you prioritise accordingly and address any weak spots in your existing data.
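As a back-of-the-envelope illustration of attaching a financial value to inaccuracies, the sketch below multiplies out a database size, the report's 22% average error rate, and an assumed per-record cost. The record count and cost figure are hypothetical, chosen purely for the arithmetic:

```python
# Hypothetical figures for illustration only.
records = 500_000           # contacts in the database (assumed)
error_rate = 0.22           # report's average share of inaccurate data
cost_per_bad_record = 3.50  # assumed wasted mailing/handling cost, GBP

inaccurate_records = int(records * error_rate)
annual_cost = inaccurate_records * cost_per_bad_record

print(f"{inaccurate_records:,} inaccurate records "
      f"put roughly £{annual_cost:,.2f} per year at risk")
```

Even a rough figure like this gives the business something to prioritise against when deciding where to invest in data quality.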
Once you’ve profiled your data and understood the underlying root causes, you will be able to understand the technology required to optimise and govern your data over time. Without taking this key step you’ll never realise the long-term return on investment in technology in this space. Technology plays a critical role in effectively scoping out your requirements, coupled with appropriate measures that will enable you to diagnose success and failure. Again, that’s where your CDO comes in.
When trying to fix these problems, it’s important that businesses understand and collaborate around data quality issues by assessing data quality across the entire organisation, not just as an IT department mission. After all, poor data quality – and the lack of a combined view it creates – affects business users the most; never underestimate the impact that can have.
* The 2012 Return Path Sender Score™ Benchmark Report