Vienna, November 2020 – At the AI for Good Global Summit 2020, the UNODC Global Judicial Integrity Network won an award in the Gender Equity Breakthrough track for its project focusing on preventing gender bias and discrimination in artificial intelligence (AI) systems used in judiciaries.
Predictive AI applications, such as those used in justice programs, often perpetuate pre-existing biases because they are trained on biased historical case data. The award-winning project aims to develop and maintain a practical framework of global recommendations for judiciaries on how to develop AI systems that do not replicate or exacerbate gender-based discrimination. This framework will include:
- A set of global recommendations identifying how judiciaries can use AI ethically and in a gender-sensitive manner;
- A continuously updated, open-source repository of good practices, with a collection of case studies, solutions, experiences and datasets;
- A self-assessment guide for judiciaries to evaluate their AI applications against the global recommendations.
The Global Judicial Integrity Network, officially launched in April 2018, aims to assist judiciaries across the globe in strengthening judicial integrity and preventing corruption in the justice sector, in line with article 11 of the United Nations Convention against Corruption. The project is a continuation of the Global Judicial Integrity Network's previous work on gender-related issues in the judiciary, as well as its work on the ethical use of AI.
For more information on the project, watch the presentation at the AI for Good Global Summit 2020:
The Global Judicial Integrity Network is looking for Technical Experts and Resources for Consultations on Gender-Related AI Considerations
During the consultation phase of the Global Judicial Integrity Network's project on addressing gender-related issues in AI systems, technical experts are needed to help identify concerns and suggest solutions for ensuring that protected attributes, such as gender, do not become a factor in the outputs of judicial AI applications. For instance, if an application is trained on biased case data in which women's testimonies were discredited, that bias should be detected and corrected for. At virtual roundtable meetings, the experts would also advise more generally on potential gender-bias concerns in judicial AI applications and on ways to address them.
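One way to check whether a protected attribute is influencing an application's outputs is to compare outcome rates across groups. The sketch below, with entirely hypothetical data and a metric (demographic parity gap) chosen for illustration rather than drawn from the project's framework, shows the basic idea:

```python
# Minimal sketch of one possible audit step: measure whether a model's
# positive-outcome rate differs by a protected attribute (here, gender).
# The data and threshold are illustrative; a real judicial audit would
# use richer metrics and domain review.

def demographic_parity_gap(predictions, groups):
    """Absolute difference in positive-prediction rates across groups."""
    rates = {}
    for g in set(groups):
        group_preds = [p for p, grp in zip(predictions, groups) if grp == g]
        rates[g] = sum(group_preds) / len(group_preds)
    ordered = sorted(rates.values())
    return ordered[-1] - ordered[0]

# Hypothetical model outputs (1 = favourable outcome predicted) and the
# gender recorded for each corresponding historical case.
preds = [1, 1, 0, 1, 0, 0, 1, 0]
genders = ["F", "M", "F", "M", "F", "F", "M", "M"]

gap = demographic_parity_gap(preds, genders)
# A large gap would flag the application for closer review before use.
```

Here the favourable-outcome rate is 0.25 for one group and 0.75 for the other, a gap of 0.5, which in practice would prompt investigation of the training data rather than deployment.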
Experts may reach out to Roberta Solis, Judicial Integrity Team Leader, UNODC.