Normalized Mutual Information (NMI) is a metric computed between two clusterings. It normalizes the Mutual Information (MI) score so that results fall between 0 (no mutual information) and 1 (perfect correlation).
NMI can support Data Governance & Traceability by enabling the comparison and validation of clustering results during data processing or model auditing. This helps ensure that data transformations or clustering steps are consistent and traceable, which is important for oversight and accountability in AI systems. However, this connection is indirect and context-dependent, as NMI itself does not provide governance or traceability but can be a tool within such processes.
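As a minimal illustrative sketch, NMI can be computed with scikit-learn's normalized_mutual_info_score. The cluster label arrays below are hypothetical placeholders standing in for two runs of a clustering step whose consistency is being audited; they are not data from this catalogue entry.

# Minimal sketch: comparing two clustering runs with NMI (scikit-learn assumed available).
from sklearn.metrics import normalized_mutual_info_score

# Hypothetical cluster assignments from two pipeline runs (e.g., before and
# after a data transformation that should not change the grouping).
labels_run_a = [0, 0, 1, 1, 2, 2, 2, 0]
labels_run_b = [1, 1, 0, 0, 2, 2, 2, 1]

# NMI = MI(U, V) / mean(H(U), H(V)), using the default arithmetic mean.
# The score is 1.0 when the two clusterings agree up to a relabelling of
# the clusters, and close to 0 when they share no information.
score = normalized_mutual_info_score(labels_run_a, labels_run_b)
print(f"NMI between the two clustering runs: {score:.3f}")

Because the second run is only a relabelling of the first, this sketch prints an NMI of 1.0; a lower score would flag that the two processing steps produced materially different groupings and may warrant review.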
