Data is often difficult to get and even harder to maintain. The sad truth is that most organisations don’t have accurate product, pricing, or client information in their systems. And even when the information is accurate, it is often not consistent or easily accessible.
Without quality data, you will suffer from inconsistent predictive analytics, AI, and data visualisation. It’s time for organisations to fall back in love with good data. But how can you reignite the spark?
Make the case for data quality
It can be easy to attribute poor data quality to technological inconsistencies and flaws. But just like an iceberg, data quality issues run much deeper than they appear on the surface. While top execs may view these issues as manageable, data experts know how many complexities remain concealed.
With so much data being pumped out at breakneck rates, it can seem like an insurmountable challenge to ensure data accuracy, completeness, and consistency. And despite technological, governance and team efforts, poor data can still endure. As such, maintaining data quality can feel like a perennial challenge. But quality data is fundamental to a company’s digital success.
To build a business case for data quality, you first have to demonstrate the far-reaching consequences of poor data quality on organisational performance. If you can present the problem from a business standpoint (revenue, cost, risk), backed by evidence and real-world scenarios where data quality issues led to incurred costs, reputational damage, and missed opportunities, you can prompt top-level management to back proactive measures and adapt processes.
To bring your case to life, you then need to quantify the business impact of data quality issues. You could, for example, illustrate the effect of bad data on a marketing campaign: compare the number of usable records and sales leads with and without data quality controls, and show how that difference flows through to revenue. Use cases like these speak for themselves and go a long way towards convincing management to implement a global strategy.
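To make that quantification concrete, here is a minimal sketch of the kind of back-of-the-envelope funnel comparison described above. Every number in it (record counts, usable-record shares, lead rates, deal values) is an invented assumption for illustration, not a figure from this article.

```python
# Toy funnel: how the share of usable records flows through to revenue.
# All figures below are assumed for illustration only.
total_records = 100_000
usable_share_dirty = 0.60   # assumed: share of records usable without quality work
usable_share_clean = 0.95   # assumed: share of records usable after quality work
lead_rate = 0.02            # assumed: sales leads generated per usable record
revenue_per_lead = 500.0    # assumed: average revenue per lead

def funnel_revenue(usable_share):
    """Revenue implied by a given share of usable records."""
    usable = total_records * usable_share
    leads = usable * lead_rate
    return leads * revenue_per_lead

gap = funnel_revenue(usable_share_clean) - funnel_revenue(usable_share_dirty)
print(f"Revenue left on the table by dirty data: ${gap:,.0f}")
```

Even with toy numbers, framing the gap in revenue terms tends to land better with management than a count of data defects.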
Time to form your data quality strategy — and act on it!
Technical data quality challenges and organisational challenges go hand in hand, so it makes sense to create team structures that reflect this. Organisations can form pods: teams that bring together representatives from the departments affected by data quality issues and from the technical and analytics departments. This allows you to manage both the organisational and technical aspects of data quality in tandem.
These pods can then identify use cases by working with respective departments to make sure their requirements and needs are funnelled into the data quality strategy. They should also implement well-defined Key Performance Indicators to gauge the efficacy of data quality initiatives.
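What those KPIs might look like in practice: below is a minimal sketch of three common data quality metrics (completeness, duplicate rate, validity) computed over a toy customer table. The field names, records, and format rule are all illustrative assumptions, not prescriptions from this article.

```python
import re

# Illustrative customer records; field names and values are hypothetical.
records = [
    {"id": 1, "email": "ana@example.com", "country": "FR"},
    {"id": 2, "email": "", "country": "DE"},
    {"id": 3, "email": "ana@example.com", "country": "FR"},  # duplicate of id 1
    {"id": 4, "email": "not-an-email", "country": ""},
]

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    return sum(1 for r in records if r.get(field)) / len(records)

def duplicate_rate(records, key_fields):
    """Share of records that repeat an already-seen key."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(records)

def validity(records, field, pattern):
    """Share of non-empty values matching a simple format rule."""
    values = [r[field] for r in records if r.get(field)]
    return sum(1 for v in values if re.fullmatch(pattern, v)) / len(values)

kpis = {
    "email_completeness": completeness(records, "email"),
    "duplicate_rate": duplicate_rate(records, ["email", "country"]),
    "email_validity": validity(records, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"),
}
print(kpis)
```

Tracked over time, even simple ratios like these let a pod show whether a data quality initiative is actually moving the needle.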
To succeed, it’s essential for pods to have a centralised data platform. This allows them to work together on the same datasets, establish a common working language with easy-to-use tools and, ultimately, foster a unified understanding. The result? Consistent, accurate data which is easily accessible and much easier to update.
While all of this goes a long way to substantially increasing the quality of data, it is important to note that it is nigh on impossible to get ‘perfect data.’ There are so many variables involved in the data lifecycle that the aim instead is to create a system that works as efficiently and effectively as possible. Striving for ‘perfect’ will only get you bogged down in details and deliver less value, because you never put the data you already have to use.
A successful data quality initiative in action
Unilever set out to create personalised digital relationships with its 3.4 billion consumers worldwide. This ambition entailed understanding consumer data such as behaviours, interests, and demographics across 38 Digital Hubs. The sheer scale of the challenge brought obstacles: managing vast amounts of unstructured data pouring in from different digital channels, the lack of common consumer identifiers, and the need for Unilever’s non-data experts to be able to use this data compliantly.
So, to take on the data quality challenge, Unilever implemented a centralised end-to-end data platform to facilitate a collaborative approach. The company formed a lean pod tasked with creating primary data pipelines, automating data cleansing, standardisation, and accuracy. Through the platform, they could also join up consumer datasets to produce a Unified View of Consumers (UVC) alongside building a simple web app — Audience Creation Tool (ACT) — empowering non-technical users to create consumer audiences.
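The article does not detail how the pod's pipelines worked, but automated cleansing and standardisation of the kind mentioned typically involves steps like the generic sketch below. The field names, alias table, and deduplication rule are all illustrative assumptions, not Unilever's actual implementation.

```python
# Hedged sketch of generic cleansing/standardisation steps (illustrative only).

# Assumed alias table mapping free-text country names to ISO codes.
COUNTRY_ALIASES = {"uk": "GB", "united kingdom": "GB", "england": "GB"}

def standardise(record):
    """Trim whitespace, lower-case emails, and map country names to codes."""
    out = dict(record)
    out["email"] = out.get("email", "").strip().lower()
    raw_country = out.get("country", "").strip()
    out["country"] = COUNTRY_ALIASES.get(raw_country.lower(), raw_country.upper())
    return out

def deduplicate(records, key="email"):
    """Keep the first record seen for each non-empty key value."""
    seen, unique = set(), []
    for r in records:
        k = r.get(key)
        if not k or k not in seen:
            unique.append(r)
            if k:
                seen.add(k)
    return unique

raw = [
    {"email": "  Ana@Example.com ", "country": "United Kingdom"},
    {"email": "ana@example.com", "country": "uk"},   # same consumer, messier form
    {"email": "bo@example.com", "country": "fr"},
]
clean = deduplicate([standardise(r) for r in raw])
print(clean)
```

Running standardisation before deduplication matters here: the two messy variants of the same consumer only collapse into one record once their emails and countries have been normalised to a common form.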
By adopting this agile and collaborative data quality strategy, Unilever was able to save around 175 hours of data scientist time per month, reduce audience creation time by 98 per cent, and, crucially, improve data accuracy by an impressive 50 per cent. Furthermore, with the platform democratising data access across the company, anyone could execute an audience, fostering greater collaboration and efficiency.
Fall back in love with data quality
Data quality should be your business’ love language, creating better decision making, boosting your reputation, and allowing you to capitalise on more opportunities. If you can successfully make the case for data quality, form a strategy with pods and implement it using a central data platform, then you can start to drive value and bring people on board. The art of data quality relies on a culture of continuous improvement and monitoring — it is a constant process, and pods need to refresh existing data while tackling a variety of further use cases.
Of course, the value of data comes from using it. So, if it can be easily accessed from a central location, departments can gather more insights, build better products, and harness data’s true potential. By enhancing your data quality, you can reignite a profound appreciation of data within the company and fall in ‘data love’ all over again.
David Talaga, product marketing director, Dataiku