Government

‘If it isn’t diverse, it isn’t ethical’: the United Kingdom’s approach to AI policy

‘If it isn’t diverse, it isn’t ethical’*

It isn’t often that we hear such a complex issue in the social aspects of technology articulated so succinctly and with such clarity. Governments and organisations rightly see the need to set out their own approaches to AI in ways that reflect their values and culture, establishing or signing up to principles as they do so. Yet the complexity of the issues they grapple with often makes it very difficult to implement changes that make technology ethical. One thing we can all agree on, however, is that a way to make artificial intelligence more ethical, and better suited to the society it serves, is to ensure it is designed and built by diverse teams that reflect those societies.

*The quote ‘If it isn’t diverse, it isn’t ethical’ came from Professor Dame Wendy Hall, UK AI Council Skills Champion and author of the independent UK AI Review, published in October 2017.

The accelerated impact of AI and government action

Artificial intelligence is perhaps unique among technologies for its propensity to be self-reinforcing in both its development and use. This potential for accelerated impact makes ethical considerations all the more important, which is why the UK has set up various AI institutions in government and empowered regulators to consider how AI affects their own areas.

In the UK, the government has established a number of institutions tasked with thinking about AI, all of which were announced in 2017. The Office for AI and the AI Council both came from recommendations of the AI Review mentioned above: the former is a team of civil servants tasked with taking forward AI policy and delivery, while the latter is an expert group of 22 leaders from AI industry, R&D and academia, and civil society. These institutions are complemented by the Centre for Data Ethics and Innovation, an expert advisory body that is independent of government and draws on its own expert board and full-time staff. The Centre was not a recommendation of the AI Review but of the Royal Society’s and British Academy’s Data Governance report, also published in 2017. In fact, ethics was not a focus of the AI Review, partly because the Centre for Data Ethics – which advises government and regulators on the use of data, including for (but not limited to) applications in AI – was conceived as the data stewardship body that would tackle all of the ethical questions that AI raises.

The importance of diversity in the AI workforce

But as is so often the case, the levers that exist to make positive changes do not all sit in the same place. The Office for AI, with its remit to improve the AI skills pipeline that the government committed to in the AI Sector Deal, wanted to address the lack of diversity in the AI workforce by intervening at a key point in that pipeline: targeting underrepresented groups in education institutions at Masters level. But to do so it had to work with the Office for Students, an arm’s length body of the UK Department for Education.

To be reflective of the society it serves, AI needs to consider diverse populations and be developed by a diverse AI workforce. This is incredibly important: many segments of society are underrepresented, including women as a whole, Black AI workers, disabled AI workers, and those from lower socioeconomic groups. There is also a dearth of diversity of thought, largely because those working in AI and data technologies come from backgrounds that emphasise STEM – Science, Technology, Engineering and Maths – sometimes at the expense of social sciences, humanities and arts. This is why the Office for AI worked with the Office for Students (OfS) to conceive of and deliver postgraduate AI conversion courses, designed to increase the number of highly skilled graduates by at least 2,500 by 2023, with 1,000 scholarships available specifically for students from underrepresented backgrounds, including women, Black students and disabled students. Courses are available at 28 universities across England and began in autumn 2020.

The courses are unique in offering flexible learning opportunities for students who are likely to have work or caring responsibilities, and for mature students who may be considering retraining or upskilling for a new career. Paid work placements are also included to make these transitions more feasible for learners at different ages and stages of their lives.

Establishing the postgraduate AI conversion courses and scholarships programme has been about implementing a tangible intervention to create a sustainable and, more importantly, diverse skills pipeline in the UK. But it certainly does not stand in isolation. Our Guidelines for AI procurement in the public sector also call for diverse teams to drive AI procurement, leaning on the ‘public sector equality duty’ in UK legislation, the Equality Act – which means that everyone in the UK has to receive the same quality of service from the state, regardless of any protected characteristic – to demand an ‘equality impact assessment’ alongside data impact assessments when AI solutions are implemented. And of course, the conversion courses and scholarships are one way to ensure there are enough specialists in government, including in procurement, for the UK to deliver diverse and ethical AI to its citizens.

What’s next?

The OfS commissioned an external evaluator, CRAC, to conduct an evaluation of the programme, and its early findings show that the programme has improved the diversity of the cohort. Student case studies are also being collated to showcase scholarship students’ experiences of undertaking the courses.


Disclaimer: The opinions expressed and arguments employed herein are solely those of the authors and do not necessarily reflect the official views of the OECD or its member countries. The Organisation cannot be held responsible for possible violations of copyright resulting from the posting of any written material on this website/blog.
