A collection of common terms, themes and topics used in the data management ecosystem.

What is Data Quality and why does it matter?
Data quality refers to how fit your data is for serving its intended purpose. Good quality data should be reliable, accurate and accessible.
What are Large Language Models (LLM) and GPTs?
What is a Data Quality Firewall?
A data quality firewall is a key component of data management. It is a form of data quality monitoring that uses software to prevent the ingestion of messy or bad data: a set of measures and processes designed to ensure the integrity, accuracy and reliability of data within an organisation.
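As a rough sketch in Python (not tied to any particular product), a data quality firewall can be expressed as a set of validation rules applied to each record before ingestion, with failing records quarantined for review; the field names and rules here are hypothetical:

    # Minimal data quality firewall sketch: records failing any rule are
    # quarantined instead of being loaded. Fields and rules are illustrative
    # assumptions, not a specific product's API.
    import re

    RULES = {
        "customer_id": lambda v: bool(v),                                        # must be present
        "email": lambda v: re.match(r"[^@]+@[^@]+\.[^@]+", v or "") is not None, # plausible address
        "age": lambda v: isinstance(v, int) and 0 <= v <= 120,                   # plausible range
    }

    def firewall(records):
        clean, quarantined = [], []
        for record in records:
            failures = [f for f, rule in RULES.items() if not rule(record.get(f))]
            if failures:
                quarantined.append((record, failures))
            else:
                clean.append(record)
        return clean, quarantined

    clean, quarantined = firewall([
        {"customer_id": "C001", "email": "jo@example.com", "age": 34},
        {"customer_id": "", "email": "not-an-email", "age": 250},
    ])
    print(len(clean), "passed;", len(quarantined), "quarantined")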
What Is Augmented Data Quality And How Do You Use It?
Year after year, the volume of data being generated is increasing at an unparalleled pace. For businesses, data is critical to inform strategy, facilitate decision-making and create opportunities for competitive advantage. However, the value of this data depends on its quality, and traditional methods for measuring and improving data quality are struggling to keep pace.
What is Data Remediation?
Data remediation is the process of identifying and correcting errors, inconsistencies and inaccuracies in data to ensure its quality and accuracy.
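A minimal remediation sketch, assuming a hypothetical customer record with free-text country values, might standardise values, trim whitespace and flag records that still need manual review:

    # Minimal data remediation sketch: standardise values, repair obvious
    # issues and flag what remains. The mapping and fields are illustrative.
    COUNTRY_MAP = {"uk": "GB", "united kingdom": "GB", "gb": "GB", "usa": "US"}

    def remediate(record):
        fixed = dict(record)
        # Trim stray whitespace from the name and treat empty strings as missing.
        name = (fixed.get("name") or "").strip()
        fixed["name"] = name or None
        # Map common free-text country spellings onto a single standard code.
        country = (fixed.get("country") or "").strip().lower()
        fixed["country"] = COUNTRY_MAP.get(country)
        # Records that are still incomplete are flagged for manual review.
        fixed["needs_review"] = fixed["name"] is None or fixed["country"] is None
        return fixed

    print(remediate({"name": "  Ada Lovelace ", "country": "united kingdom"}))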
What is Data Integrity? Why is Data Integrity Important?
Data integrity is the process of maintaining the accuracy and completeness of data over its entire life cycle, wherever it is applied.
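One common technique for protecting integrity over a record's life cycle is to store a checksum when data is written and re-verify it on every read; a minimal sketch (using SHA-256, with an illustrative payload) is:

    # Minimal data integrity sketch: a checksum recorded when data is written
    # and re-verified when it is read, so silent corruption or tampering can
    # be detected. The payload is an illustrative assumption.
    import hashlib

    def checksum(payload: bytes) -> str:
        return hashlib.sha256(payload).hexdigest()

    stored_payload = b"account=12345,balance=100.00"
    stored_digest = checksum(stored_payload)   # saved alongside the data

    # Later, on read: recompute and compare before trusting the data.
    assert checksum(stored_payload) == stored_digest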
What is ETL (extract-transform-load)?
ETL refers to the process of extracting data from a source system, transforming it into the desired format, and loading it into a target system.
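A minimal ETL sketch in Python, assuming a hypothetical orders.csv source file and a SQLite target table, might look like this:

    # Minimal ETL sketch: extract rows from a source (a CSV file), transform
    # them into the desired shape, and load them into a target (a SQLite
    # table). File, table and column names are illustrative assumptions.
    import csv, sqlite3

    def extract(path):
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        # Tidy names and cast amounts to numbers before loading.
        return [(r["id"], r["name"].strip().title(), float(r["amount"])) for r in rows]

    def load(rows, db_path="target.db"):
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, name TEXT, amount REAL)")
        con.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
        con.commit()
        con.close()

    load(transform(extract("orders.csv")))

In practice the same three-step shape holds whether the source and target are files, databases or cloud services; only the extract and load steps change.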
What is Data Observability?
Data observability is the ability to see and understand data as it flows through an organisation, enabling professionals to track metadata issues.
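As an illustrative sketch (the table layout, checks and threshold are assumptions, not a specific tool's API), observability checks often cover freshness, volume and schema drift:

    # Minimal data observability sketch: report freshness, row volume and
    # unexpected columns for a table so changes surface quickly.
    from datetime import datetime, timedelta, timezone

    def observe(rows, expected_columns, max_staleness=timedelta(hours=24)):
        last_update = max(row["updated_at"] for row in rows)
        return {
            "row_count": len(rows),
            "is_fresh": datetime.now(timezone.utc) - last_update <= max_staleness,
            "unexpected_columns": sorted(set(rows[0]) - set(expected_columns)),
        }

    rows = [{"id": 1, "updated_at": datetime.now(timezone.utc)}]
    print(observe(rows, expected_columns=["id", "updated_at"]))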
What is a Data Lake, Data Warehouse and Data Lakehouse?
This post defines and explains the differences between a Data Lake, a Data Warehouse and a Data Lakehouse.
What is the Gartner Magic Quadrant?
The Gartner Magic Quadrant provides a graphical depiction of different types of technology providers and their position in fast-growing markets.
What is AI and ML?
Artificial Intelligence is the application of computer science techniques to perform a range of decision-making and prediction activities; Machine Learning is a subset of AI in which models learn these behaviours from data rather than from explicitly programmed rules.
What is ESG?
Environmental, Social and Governance refers to a collection of criteria used to evaluate an organisation’s operations and measure their sustainability.
What is KYC and AML?
KYC (Know Your Customer) and AML (Anti-Money Laundering) are fundamental components of regulatory compliance in financial institutions, referring to the prevention of money laundering and other financial crimes.
What is Customer 360 or Single Customer View?
Customer 360 refers to a 360-degree view of a customer’s journey through an organisation, including accounts, interactions and enquiries.
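A minimal single-customer-view sketch, assuming hypothetical account, interaction and enquiry feeds keyed on a shared customer identifier, simply groups every record under that identifier:

    # Minimal Customer 360 sketch: records from several systems are keyed on
    # one customer identifier and merged into one profile. System names and
    # fields are illustrative assumptions.
    from collections import defaultdict

    accounts     = [{"customer_id": "C001", "plan": "Pro"}]
    interactions = [{"customer_id": "C001", "channel": "email", "topic": "billing"}]
    enquiries    = [{"customer_id": "C001", "status": "open"}]

    profiles = defaultdict(lambda: {"accounts": [], "interactions": [], "enquiries": []})
    for source, records in [("accounts", accounts), ("interactions", interactions), ("enquiries", enquiries)]:
        for record in records:
            profiles[record["customer_id"]][source].append(record)

    print(dict(profiles)["C001"])

The hard part in real systems is matching identifiers across sources; once records share a key, assembling the view is a straightforward grouping step like the one above.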
What is Microsoft Power BI?
Microsoft Power BI is a technology-enabled business intelligence platform for gathering, analysing and visualising data.
What is Metadata?
Metadata is a way to describe and make sense of data. It’s a shorthand representation of the data, which helps data stewards easily understand the information in front of them.
What is Data Profiling?
Data profiling is the process of reviewing data, including its source, to provide helpful summaries of information about the data, including potential data quality issues.
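A minimal profiling sketch, run over hypothetical rows, can summarise each column's completeness, distinct values and an example value so likely quality issues stand out:

    # Minimal data profiling sketch: per-column completeness, distinct count
    # and an example value. Column names and rows are illustrative.
    def profile(rows):
        columns = {key for row in rows for key in row}
        report = {}
        for col in sorted(columns):
            values = [row.get(col) for row in rows]
            non_null = [v for v in values if v not in (None, "")]
            report[col] = {
                "completeness": len(non_null) / len(values),
                "distinct": len(set(non_null)),
                "example": non_null[0] if non_null else None,
            }
        return report

    print(profile([{"email": "a@example.com", "age": 34}, {"email": "", "age": 34}]))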
What is Microsoft Azure?
Launched in 2010, Microsoft Azure is one of the world’s leading public cloud computing platforms, offering over 200 preconfigured services including AI, storage, networking and integration.