Data Governance or Data Quality: not always a ‘chicken & egg’ problem

In this blog with Datactics’ Head of Sales, Kieran Seaward, we dive into market insights and the sometimes-thorny issue of where to start.


The ‘Data Governance or Data Quality first?’ question is one data managers and users will fully understand. Kieran’s approach is shaped by thousands of hours of conversation with people at all stages of the process, all unified in the desire to get the data right and build a data culture around quality and efficiency.

Following hot on the heels of banks, we are seeing a lot of buy-side and insurance firms on the road to data maturity, taking a more strategic approach to data quality and data governance, which is great. Undertaking a data maturity assessment internally can surface some much-needed areas of improvement in an organization’s data, from establishing a data governance framework to updating existing data quality initiatives and improving data integrity.

From what I hear, the “data quality or governance first?” conundrum is commonly debated by most firms, regardless of what stage they are at in a data programme rollout.

Business decisions are typically influenced by the need to prioritise either ‘top-down’ data governance activities, such as creating a data dictionary and business glossary, or ‘bottom-up’ data quality activities, such as measuring and remediating company data assets as they exist today in source systems. However, achieving a data-driven culture relies on both of these initiatives existing concurrently.

In my opinion, these data strategies are not in conflict but complementary and can be tackled in any order, so long as the ultimate goal is a fully unified approach.  

I may be biased, but the insights derived from data quality activities can help form the basis of the definitions and terms typically stored in governance systems:

Figure 1 – Data Quality first
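
As a rough illustration of the ‘quality first’ route, here is a minimal sketch of how profiling output from a data quality exercise could seed a draft business-glossary entry. The dataset, column names and helper function are hypothetical, for illustration only, and not a description of the Datactics platform.

```python
import pandas as pd

# Hypothetical customer extract - the column names are illustrative only
customers = pd.DataFrame({
    "customer_id": ["C001", "C002", "C003", None],
    "date_of_birth": ["1980-01-31", "1975-06-12", "not known", "1991-02-28"],
})

def profile_column(series: pd.Series) -> dict:
    """Summarise a column so its statistics can seed a draft glossary entry."""
    return {
        "completeness_pct": round(series.notna().mean() * 100, 1),
        "distinct_values": int(series.nunique(dropna=True)),
        "sample_values": series.dropna().head(3).tolist(),
    }

# The profiling results become the starting point for a governance definition,
# to be confirmed and owned by the business through the governance workflow.
draft_glossary_entry = {
    "term": "customer_id",
    "observed_profile": profile_column(customers["customer_id"]),
    "proposed_definition": "Data owner to confirm meaning, format and ownership",
}
print(draft_glossary_entry)
```

The point is simply that measured facts about the data, such as completeness, formats and sample values, give the governance side something concrete to define against.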

However, the same can be said in reverse: data quality systems benefit from having critical data elements defined, with metadata definitions helping to shape the measurement rules that need to be applied:

Figure 2 – Data Governance first
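
Going the other way, here is a minimal sketch of how a critical data element definition held in a governance catalogue could shape the measurement rule a data quality system applies. The catalogue entry and rule format are assumptions made for the sketch, not a specific product API.

```python
from datetime import datetime

# A critical data element as it might be described in a governance catalogue
# (the field names and the definition itself are illustrative assumptions)
cde_definition = {
    "element": "date_of_birth",
    "definition": "Customer date of birth in ISO 8601 format (YYYY-MM-DD)",
    "nullable": False,
}

def build_rule(cde: dict):
    """Turn the catalogue definition into a validation rule (a callable)."""
    def rule(value) -> bool:
        if value is None or value == "":
            return cde["nullable"]
        try:
            datetime.strptime(str(value), "%Y-%m-%d")
            return True
        except ValueError:
            return False
    return rule

validate_dob = build_rule(cde_definition)
print(validate_dob("1980-01-31"))  # True  - meets the catalogue definition
print(validate_dob("not known"))   # False - flagged for remediation
```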

The ideal complementary state is that of Data Governance + Data Quality working in perfect unison, i.e.:

  • Data Governance system that contains all identified critical data elements, together with the definitions that determine which Data Quality validation rules should be applied to ensure the data meets those definitions;
  • Data Quality platform that validates those data elements and connects to the governance catalogue to identify the responsible data scientist or data steward, so that records can be pushed to them for review and/or remediation of data quality issues.
    The quality platform can then push data quality metrics back into the governance front-end, which acts as the central hub and visualization layer, either rendering the data itself or connecting to third-party tools such as Microsoft Power BI, Tableau, or Qlik (a rough sketch of this loop follows Figure 3).

Figure 3 – The ideal, balanced state
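
As a rough sketch of that loop, assuming a hypothetical catalogue structure, rule set and steward lookup rather than any particular governance or quality product: validate each critical data element, route failures to the responsible steward, and push the resulting metrics back for visualization.

```python
# Minimal sketch of the governance + quality loop: validate critical data
# elements, route failures to the responsible steward, and return metrics
# that would be pushed back to the governance front-end.

governance_catalogue = {
    "date_of_birth": {"steward": "jane.doe@example.com", "rule": "iso_date"},
    "customer_id":   {"steward": "john.smith@example.com", "rule": "not_null"},
}

rules = {
    "iso_date": lambda v: isinstance(v, str) and len(v) == 10 and v[4] == "-" and v[7] == "-",
    "not_null": lambda v: v not in (None, ""),
}

def run_quality_checks(records: list[dict]) -> dict:
    """Validate each CDE, collect issues per steward, and compute metrics."""
    issues, passed, checked = {}, 0, 0
    for record in records:
        for element, meta in governance_catalogue.items():
            checked += 1
            if rules[meta["rule"]](record.get(element)):
                passed += 1
            else:
                # Route the failing record to the responsible data steward
                issues.setdefault(meta["steward"], []).append((element, record))
    # These metrics would be surfaced in the governance front-end, rendered
    # there or via connected tools such as Power BI, Tableau or Qlik.
    return {"pass_rate_pct": round(100 * passed / checked, 1), "issues": issues}

metrics = run_quality_checks([
    {"customer_id": "C001", "date_of_birth": "1980-01-31"},
    {"customer_id": None,   "date_of_birth": "31/01/1980"},
])
print(metrics["pass_rate_pct"])  # 50.0
```

In practice the metrics and issue queues would flow through whatever connectors the two platforms expose, but the shape of the loop is the same: definitions drive rules, rules produce issues and metrics, and both land back with the people accountable for the data.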

In the real world, this decision can’t be made in isolation from what the business is doing right now with the information it relies on:

  • Regulatory reporting teams have to build, update and reconfigure reports in ever-tighter timeframes.
  • Data analytics teams rely on smarter models for prediction and intelligence in order to perform accurate data analysis.
  • Risk committees are seeking access to data for client, investor, and board reporting.

If the quality of this information can’t be guaranteed, or breaks can’t be easily identified and fixed, all of these teams will keep coming back to IT asking for custom rules, sucking up much-needed programming resources.

Then, when an under-pressure IT team can’t deliver in time, or the requests conflict with one another, the teams resort to building in SQL or trying to do it in everyone’s favourite DIY tool, Excel.

Wherever firms are on their data maturity journey or data governance programme, data quality is of paramount importance and can easily run first, last or in parallel. This is something we are used to helping clients and prospects with at various points along that journey, whether it’s using our self-service data quality & matching platform to drive better data into a regulatory reporting requirement, or facilitating a broad vision to equip an internal “data quality as-a-service” function.

My colleague Luca Rovesti, who heads up our Client Services team, goes into this in more detail in Good Data Culture.

I’ll be back soon to talk about probably the number one question thrown in at the end of every demo of our software:

What are you doing about AI?

Click here for the latest news from Datactics, or find us on LinkedIn, Twitter or Facebook.