Catalogue of Tools & Metrics for Trustworthy AI

These tools and metrics are designed to help AI actors develop and use trustworthy AI systems and applications that respect human rights and are fair, transparent, explainable, robust, secure and safe.

The demographic disparity metric (DD) determines whether a facet has a larger proportion of the rejected outcomes in the dataset than of the accepted outcomes. In the binary case, where two facets, men and women for example, constitute the dataset, the disfavored one is labelled facet d and the favored one is labelled facet a. For example, in the case of college admissions, if women comprised 46% of the rejected applicants but only 32% of the accepted applicants, we say there is demographic disparity because women's share of rejections exceeds their share of acceptances. Women applicants are labelled facet d in this case. If men comprised 54% of the rejected applicants and 68% of the accepted applicants, there is no demographic disparity for this facet, as its rate of rejection is less than its rate of acceptance. Men applicants are labelled facet a in this case.
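The definition above can be sketched in a few lines of Python. This is an illustrative sketch only, not an official implementation; the function name and the count-based inputs are assumptions.

```python
def demographic_disparity(facet_rejected, total_rejected,
                          facet_accepted, total_accepted):
    """DD for a facet: its share of rejected outcomes minus its share
    of accepted outcomes.

    DD > 0 means the facet is over-represented among rejections (facet d);
    DD <= 0 means no demographic disparity for the facet (facet a).
    """
    return facet_rejected / total_rejected - facet_accepted / total_accepted


# College admissions example from the text, with hypothetical absolute counts:
# women are 46% of 1,000 rejected applicants but only 32% of 500 accepted ones.
dd_women = demographic_disparity(460, 1000, 160, 500)  # 0.46 - 0.32 = 0.14
dd_men = demographic_disparity(540, 1000, 340, 500)    # 0.54 - 0.68 = -0.14
```

A positive value flags the disfavored facet; in this example the two facets' DD values are equal and opposite because the facets partition the dataset.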

Trustworthy AI Relevance

This metric addresses Fairness and Data Governance & Traceability by quantifying relevant system properties. DD directly supports Fairness because it quantifies group-level disparities in outcomes; its conditional variant, conditional demographic disparity (CDD), additionally controls for relevant covariates (e.g., qualifications, risk factors), which helps detect residual discrimination that the unconditional metric may miss. It supports Data Governance & Traceability because implementing the metric requires well-documented data definitions, provenance of any conditioning variables, and reproducible audit procedures, making the measurement part of an auditable fairness governance process.
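The conditional variant mentioned above can be sketched as a group-size-weighted average of per-stratum DD values, a common formulation of CDD. This is an assumption-laden sketch: the function and field names are illustrative, and the strata (e.g., departments or qualification bands) are hypothetical.

```python
def conditional_demographic_disparity(strata):
    """CDD sketch: average of per-stratum DD, weighted by stratum size.

    Each stratum is a dict of counts for one conditioning subgroup:
    facet_rejected, total_rejected, facet_accepted, total_accepted.
    """
    total = sum(s["total_rejected"] + s["total_accepted"] for s in strata)
    cdd = 0.0
    for s in strata:
        # Per-stratum demographic disparity for the facet of interest.
        dd = (s["facet_rejected"] / s["total_rejected"]
              - s["facet_accepted"] / s["total_accepted"])
        weight = (s["total_rejected"] + s["total_accepted"]) / total
        cdd += weight * dd
    return cdd


# Hypothetical two-department example: disparity in one department, none
# in the other, with equal-sized departments.
strata = [
    dict(facet_rejected=46, total_rejected=100,
         facet_accepted=32, total_accepted=100),
    dict(facet_rejected=50, total_rejected=100,
         facet_accepted=50, total_accepted=100),
]
cdd = conditional_demographic_disparity(strata)  # 0.5 * 0.14 + 0.5 * 0.0 = 0.07
```

Conditioning in this way can reveal that an apparent dataset-wide disparity is (or is not) explained by how applicants distribute across strata.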

About the metric

Partnership on AI

Disclaimer: The tools and metrics featured herein are solely those of the originating authors and are not vetted or endorsed by the OECD or its member countries. The Organisation cannot be held responsible for possible issues resulting from the posting of links to third parties' tools and metrics on this catalogue. More on the methodology can be found at https://oecd.ai/catalogue/faq.