By Joel Curry, Managing Director of Experian Data Quality.
A (gradually) recovering economy, high levels of competition, and the proliferation of channels for reaching target audiences all mean that businesses are under greater pressure to perform. A substantial part of coping with this pressure means ensuring that a business’s data is given the attention it deserves.
There is no doubt that data quality issues impact the bottom line of organisations – a 2013 Gartner survey estimated that businesses were losing an average of $14.2 million annually because of data quality issues.
Treating data as a strategic asset can not only help develop the understanding of the connections between customer, product, and transactional data, but can also give real competitive advantage and be integral in achieving corporate objectives.
Over the years various attempts have been made to improve and govern the level of data quality across organisations – from the identification of common spelling mistakes to clearing potential duplicates, with varying degrees of effectiveness and success.
But now companies need to go further. It’s not enough to simply have accurate, de-duplicated data. Data enrichment provides an opportunity to enhance records with additional information to ensure customer engagement is as personalised as possible – which is something that consumers are increasingly expecting as the norm. According to Defaqto Research, 55% of consumers would pay more for a better customer experience, and 70% of buying experiences are based on how the customer feels they are being treated (McKinsey).
In addition to a growing need to comply with tightening data governance legislation, businesses are also increasingly recognising that data quality tools can enable innovation, increase operational efficiency, and reduce risk and cost. Consequently, we are seeing greater investment in data quality management and the data tools available.
In 2013 the data quality market was worth over $1 billion, and with Big Data, cloud and mobility trends forcing the issue, it's only going to get bigger. While commercial take-up seems fairly low at the moment, growth will start to accelerate and so too will the opportunities.
According to Gartner, growth in the market for data quality technology is accelerating because these tools are increasingly recognised as critical infrastructure, with a prediction that “data quality tools will reach mainstream adoption in less than two years.”
Already we are seeing growth in industry-specific data quality products in areas such as insurance claims management and credit card fraud detection, and the range of data types that businesses are prioritising is broadening.
While customer data remains the number one focus of data quality initiatives (79%), other data types, such as transactional, financial, location, and product, are all gaining pace. Data profiling functions and the ability to ‘visualise’ the data (rather than staring at rows and rows of information) are fast becoming the ‘must haves’ of the data quality world. In 2013, 48% of data quality tool users used data profiling, up from 35% in 2012; and 35% used visualisation of data quality metrics, up from 28% in 2012.
While on-premise data quality management tools are still the most common, offsite and cloud-based tools (often delivered as Software as a Service) are playing an increasing role in how data quality capabilities are provided. Although the numbers are small in comparison to onsite tools, this area grew from 13.3% in 2012 to 21% in 2013, and given the expanding range of data quality capabilities available in these formats, it looks set to carry on growing.
A word of warning though – companies need to choose their data quality tools and vendors carefully. Gartner predicts that by 2016, 25% of organisations using consumer data will risk damage to their reputations because of their inadequate understanding of information trust issues.
And while IT leaders have already realised that data quality is essential to Master Data Management (MDM), and are starting to recognise its importance in gaining value from Big Data investments, this is yet to be reflected in buying behaviour relating to data quality tools.
In addition, only a minority of organisations currently place the necessary importance on data quality when it comes to other business information initiatives such as business intelligence (BI) and analytics, data migration, and business-application-centric programmes, including CRM and ERP.
But with forecasts of a ‘Data Doomsday’ – the point at which companies are so overwhelmed by data that they are frozen into inaction – by 2017, it’s clear that businesses must get their houses in order now to be able to cope with the data challenges of the future.