These tools and metrics are designed to help AI actors develop and use trustworthy AI systems and applications that respect human rights and are fair, transparent, explainable, robust, secure and safe.
Mozilla Open Source Audit Tooling (OAT) Project
Over the coming year, Mozilla is running the Open Source Audit Tooling (OAT) Initiative to identify the resources and tools needed to support algorithmic auditors, and to make thorough and consequential AI scrutiny the status quo.
Timeline
Phase 1 | Open Source Audit Tools Survey and Taxonomy
In this phase, we will articulate the need for algorithmic audit tools and the gaps in current audit tool development; map the known design space for audit tool development in AI and other industries; develop a taxonomy of considerations; and map out categories of tools and the resources needed to develop them.
Phase 2 | Exploration of Interventions for Improved Audit Tool Development
In this phase, we will brainstorm potential interventions to address gaps in effective audit execution and tool development; assess the value of running a public challenge to promote open-source development of algorithmic auditing tools and benchmarks; and map out opportunities for Mozilla to get involved in the auditing space and to operate a hub for open-source audit tools and datasets.
Phase 3 | Implementation of Prioritized Intervention for Audit Tool Development
In this phase, we will plan and launch audit challenges and/or product development; invite participants to engage with the implemented solution; and hold workshop meetings with key audit tool development stakeholders.
About the tool
You can click on the links to see the associated tools. A short, hypothetical sketch of how these fields could be captured as a structured record follows the list below.
Developing organisation(s):
Tool type(s):
Objective(s):
Impacted stakeholders:
Type of approach:
Maturity:
Usage rights:
Target users:
Stakeholder group:
Benefits:
Geographical scope:
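As a purely illustrative sketch (our assumption, not part of the catalogue itself), the snippet below shows one way the metadata fields above could be represented as a structured record. Every field name simply mirrors a label above, and every value is a placeholder rather than data from the Mozilla OAT entry.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical record mirroring the "About the tool" labels above.
# All values are placeholders; none come from the actual catalogue entry.
@dataclass
class CatalogueToolEntry:
    developing_organisations: List[str] = field(default_factory=list)
    tool_types: List[str] = field(default_factory=list)
    objectives: List[str] = field(default_factory=list)
    impacted_stakeholders: List[str] = field(default_factory=list)
    type_of_approach: Optional[str] = None
    maturity: Optional[str] = None
    usage_rights: Optional[str] = None
    target_users: List[str] = field(default_factory=list)
    stakeholder_group: Optional[str] = None
    benefits: List[str] = field(default_factory=list)
    geographical_scope: Optional[str] = None

# Example with placeholder values only.
entry = CatalogueToolEntry(
    developing_organisations=["<organisation name>"],
    tool_types=["<tool type>"],
    maturity="<e.g. prototype / released>",
)
print(entry)
```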
Use Cases
Would you like to submit a use case for this tool? If you have used it, we would love to know more about your experience.