Transforming Data into Opportunities: Metric of the Month – Data Quality Criteria
High-quality data is more than a benchmark – it is a strategic necessity for global trust, compliance and interoperability. In this blog, GLEIF's Head of Data Quality Management and Data Science, Zornitsa Manolova, outlines the role of Data Quality Criteria and their critical importance as part of GLEIF's Data Quality Management Framework.
Author: Zornitsa Manolova
Date: 2025-06-06
In an increasingly interconnected global economy, the ability of organizations to trust and use data effectively is the foundation of innovation, growth, and competitiveness.
A high-quality data ecosystem is a driver of change and innovation that enables organizations to identify and seize new opportunities, while low data quality can lead to inefficiencies and exposure to regulatory and reputational risks.
GLEIF is committed to optimizing the quality, reliability, and usability of LEI data. Since 2017, it has published dedicated monthly reports to transparently demonstrate the overall level of data quality achieved in the Global LEI System.
To aid broader industry understanding and awareness of GLEIF’s data quality initiatives, this new blog series explores key metrics included within the reports.
This month’s blog examines Data Quality Criteria.
At the heart of GLEIF’s commitment to trusted data lies a robust Data Quality Management Framework that ensures Legal Entity Identifier (LEI) data remains complete, current, and reliable.
GLEIF’s Data Quality Management Framework is underpinned by the principle of Total Quality, which places the customer – in our case, the data user – at the center of all quality efforts. The framework is designed to directly reflect stakeholder requirements and ensure the highest possible standard of data quality across the Global LEI System.
To achieve this, GLEIF evaluates LEI data against clear Data Quality Criteria, including validity, integrity, and consistency, through more than 180 structured checks in total. These criteria uphold standards and establish measurable benchmarks for global interoperability. Ultimately, this promotes the use of high-quality LEI data, which increases trust and transparency throughout the global economy by empowering regulatory reporting, financial risk analysis, and know-your-customer (KYC) operations across various industries.
What is a Data Quality Criterion?
A Data Quality Criterion defines a specific, measurable expectation or aspect used to evaluate whether a data record or data element meets an expected standard of quality.
To ensure that the criteria used to evaluate LEI reference data are relevant and impactful, GLEIF conducted an in-depth analysis of internationally recognized data quality concepts and standards. This informed the development of twelve distinct Data Quality Criteria to establish a transparent and objective benchmark to assess the level of data quality within the Global LEI System. These are: Accuracy, Accessibility, Completeness, Comprehensiveness, Consistency, Currency, Integrity, Provenance, Representation, Timeliness, Uniqueness, and Validity.
Each criterion allows for rule-based or algorithmic assessment to ensure consistency and scalability in its application. GLEIF applies these criteria to systematically assess LEI data against established benchmarks. These data quality checks are implemented as structured if-then-else logic rules, allowing for precise and automated validation of data elements. Each check is uniquely assigned to a single quality criterion, creating a clear and traceable link between the rules and their corresponding quality dimensions. This structure forms the foundation for monthly Data Quality Reports and public dashboards.
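To make the structure of these rule-based checks concrete, the minimal Python sketch below models a check as a named rule assigned to exactly one quality criterion and evaluated against a record. The class names, field names, and example rule are illustrative assumptions for this blog, not GLEIF's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: an LEI record is represented as a plain dictionary,
# and each quality check is a named rule mapped to exactly one criterion.
@dataclass(frozen=True)
class QualityCheck:
    name: str
    criterion: str                 # e.g. "Validity", "Integrity", "Consistency"
    rule: Callable[[dict], bool]   # if-then-else logic: True means the record passes

def run_checks(record: dict, checks: list[QualityCheck]) -> dict[str, bool]:
    """Evaluate every check against one record and report pass/fail per check."""
    return {check.name: check.rule(record) for check in checks}

# Example rule written in the if-then-else style described above (illustrative only).
def has_country_code(record: dict) -> bool:
    if "LegalAddress.Country" in record:
        return bool(record["LegalAddress.Country"])
    else:
        return False

checks = [QualityCheck("legal_address_country_present", "Completeness", has_country_code)]
print(run_checks({"LegalAddress.Country": "DE"}, checks))  # {'legal_address_country_present': True}
```

Because each rule is tied to a single criterion, pass/fail results can be aggregated per criterion, which is how the monthly scores discussed below can be read.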
A spotlight on validity, integrity, and consistency
While all twelve of the Data Quality Criteria are essential, this blog focuses on the role of 'Validity', 'Integrity', and 'Consistency' in ensuring high-quality LEI data:
Validity: Ensure the right format and structure
Validity measures the degree to which a data value conforms to its domain value set. It ensures that each LEI data element conforms to predefined formats and code lists. Validity is evaluated through 33 individual checks, which collectively achieved an Average Data Quality Score of 99.99 for this criterion in May. Checks include verifying that regional codes follow ISO 3166-1/2 (the international standard defining codes for the names of countries and their subdivisions), or that a record is managed by an accredited LEI issuer.
In general, Validity improves system interoperability and reduces the risk of processing errors in automated workflows.
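As a simple illustration of a Validity-style check, the sketch below tests whether a country code belongs to the ISO 3166-1 alpha-2 code list, as mentioned above. The field name and the truncated code list are assumptions made for brevity; a real check would draw on the complete ISO code list.

```python
# Truncated ISO 3166-1 alpha-2 code list, for illustration only.
ISO_3166_1_ALPHA_2 = {"DE", "FR", "GB", "US", "JP"}

def check_country_code_validity(record: dict) -> bool:
    """Validity: the country code must come from the permitted code list."""
    country = record.get("LegalAddress.Country")
    return country in ISO_3166_1_ALPHA_2

print(check_country_code_validity({"LegalAddress.Country": "DE"}))  # True
print(check_country_code_validity({"LegalAddress.Country": "XX"}))  # False
```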
Integrity: Ensure logical soundness
Integrity refers to the degree to which LEI data conforms to defined data relationship rules, examining whether required fields are populated appropriately, whether relationships between fields (e.g., parent-child entities) are coherent, and whether no contradictory or logically impossible combinations exist. It comprises 29 checks and achieved an Average Data Quality Score of 99.98 in May, reflecting a high level of data integrity and precision. Dedicated checks include verifying whether reporting exceptions are distinct and do not have concurrent relationship records, or whether only one active international branch relationship exists per country.
Integrity prevents conflicting information in LEI records, helping users rely on the data with confidence in its internal logic.
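The sketch below illustrates one of the Integrity checks mentioned above in hedged form: across an entity's branch relationship records, each country should have at most one active international branch relationship. The record layout is a simplifying assumption for illustration, not the actual LEI data format.

```python
from collections import Counter

def check_single_active_branch_per_country(branch_relationships: list[dict]) -> bool:
    """Integrity: at most one ACTIVE international branch relationship per country."""
    active_countries = Counter(
        rel["Country"] for rel in branch_relationships if rel["Status"] == "ACTIVE"
    )
    return all(count <= 1 for count in active_countries.values())

relationships = [
    {"Country": "FR", "Status": "ACTIVE"},
    {"Country": "FR", "Status": "INACTIVE"},
    {"Country": "US", "Status": "ACTIVE"},
]
print(check_single_active_branch_per_country(relationships))  # True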
Consistency: Enforce uniform application
Consistency assesses the degree to which a unique piece of data retains the same value across multiple datasets. This ensures that legal forms and jurisdiction codes are applied uniformly and that similar entity types follow consistent naming conventions. It is supported by 25 dedicated checks, with an Average Data Quality Score of 99.99 in May. Checks include ensuring the declared Registration Authority code corresponds to the legal jurisdiction, or that fund entities use the appropriate entity category.
Consistent data ensures that LEI records are comparable and interoperable, regardless of where or by whom they are created. This facilitates accurate data aggregation, supports reliable cross-border analysis, and significantly enhances the analytical value and usability of LEI datasets for regulators, financial institutions, and other stakeholders.
Consistency is especially critical in a distributed system such as the Global LEI System. Without uniform adherence to common standards across jurisdictions, sectors, and entities, the integrity of the data would be compromised.
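To illustrate a Consistency-style check in the same spirit, the sketch below tests whether the declared Registration Authority code corresponds to the entity's legal jurisdiction, as mentioned above. The code-to-jurisdiction mapping and the field names are hypothetical placeholders, not the official Registration Authorities List.

```python
# Hypothetical excerpt of a Registration Authority to jurisdiction mapping
# (placeholder codes and values, for illustration only).
RA_JURISDICTION = {"RA000001": "DE", "RA000002": "GB", "RA000003": "US-DE"}

def check_ra_matches_jurisdiction(record: dict) -> bool:
    """Consistency: the Registration Authority code should match the legal jurisdiction."""
    expected = RA_JURISDICTION.get(record.get("RegistrationAuthorityID"))
    jurisdiction = record.get("LegalJurisdiction", "")
    # Compare at country level (e.g. "US" vs "US-DE") as well as exact matches.
    return expected is not None and jurisdiction.split("-")[0] == expected.split("-")[0]

record = {"RegistrationAuthorityID": "RA000001", "LegalJurisdiction": "DE"}
print(check_ra_matches_jurisdiction(record))  # True
```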
Transforming data into opportunities
Upholding data quality is essential for shaping the future of a connected, transparent financial ecosystem.
GLEIF’s Data Quality Criteria ensure LEI data is accurate, complete, and fit for strategic use across jurisdictions and sectors, with validity, integrity, and consistency fundamental to transforming fragmented information into a trusted, universal resource.
By setting clear benchmarks and providing transparent, measurable insights, GLEIF empowers users to make informed decisions with confidence that their choices are supported by accurate data. These standards enhance data usability and enable systemic reliability in everything from regulatory reporting to risk analysis and digital identity management.
As the demand for verifiable, high-quality entity data grows, GLEIF’s commitment to high-quality data positions the LEI as a critical enabler of global digital trust.
Zornitsa Manolova leads the Data Quality Management and Data Science team at the Global Legal Entity Identifier Foundation (GLEIF). Since April 2018, she has been responsible for enhancing the established data quality and data governance framework by introducing innovative data analytics approaches. Previously, Zornitsa managed forensic data analytics projects on international financial investigations at PwC Forensics. She holds a German Diploma in Computer Science with a focus on Machine Learning from the Philipps University in Marburg.