The Equality and Human Rights Commission (EHRC) has published guidance on public bodies' use of Artificial Intelligence and how they can comply with the Public Sector Equality Duty.

The guidance is relevant to those procuring AI within public services - such as those building AI for their workplace or those with oversight or scrutiny of any service using AI - and those responsible for delivering or using AI as part of a service to a public body.

The guidance is highly relevant: the EHRC has made tackling discrimination in AI a major strand of its 2022-2025 strategy. This is, in part, because of AI's growing role in the public sector. For example, AI is used to help allocate benefits, to estimate the risk of an individual committing fraud, and to assist the Police and Border Force. The EHRC also recognises the potential benefits and risks of AI in public services: "AI and new digital technologies are transforming how public services are delivered. They have the potential to improve equality, but they may also lead to discrimination." AI risks causing discrimination and deepening inequalities because:

  • the data used to help the AI make decisions may contain bias;
  • the AI may be developed in a way that introduces bias, sometimes bias that compounds over time.

Public Sector Equality Duty and AI

Public bodies are subject to the Public Sector Equality Duty (PSED). A third party providing an AI system on a public body's behalf may also be subject to the PSED.

The PSED comprises a general duty and specific duties. Here we refer to the general duty only, but public bodies will also need to consider how any specific duties apply.

The general duty requires public authorities and organisations carrying out public functions to have due regard to the need to:

  1. eliminate unlawful discrimination, harassment and victimisation, and any other conduct prohibited by the Equality Act 2010
  2. advance equality of opportunity between people who share a protected characteristic and those who do not share it
  3. foster good relations between people who share a particular protected characteristic and those who do not share it

What should public bodies consider?

EHRC has published a checklist of points public bodies may want to consider, although these will always need to be tailored to the circumstances:

  1. identify if and how you (or others on your behalf) use AI, and consider how the PSED applies

    Public bodies need to consider the PSED from the very start when thinking about whether to use AI.  The PSED also applies to any AI systems that public bodies are already using or that others may be developing or using on their behalf.

  2. collect equality evidence:
  • assess existing available information
  • look at external research
  • talk to staff, service users, equality groups and community groups
  • identify and address any data gaps
  3. review how the AI could affect people with different protected characteristics, either positively or negatively

    Consider each protected characteristic. They do not have to be given the same weight. If there are gaps in the data about a particular characteristic, that characteristic still has to be considered. Is it possible to fill the data gap?
  4. assess the potential and actual impact by looking at the equality evidence and asking:
  • does the proposed or existing AI cause, or could it cause, discrimination?
  • does the proposed or existing AI help to eliminate discrimination?
  • does the proposed or existing AI contribute to advancing equality of opportunity?
  • does the proposed or existing AI affect good relations?

    EHRC says that public bodies should consider engaging with particular equality groups before and after implementation in order to help assess the AI system and relevant policies.

  5. use the results of the equality impact assessment when developing a new AI-related proposal or reviewing existing services (even if the AI was developed outside your organisation):
  • do you need to make a significant change or adjust the proposal or service?
  • can you continue, or should you stop and withdraw the proposal or service?
  6. make sure you consider the results of the assessment carefully when making the final decision about the service or product and how it will be put in place
  7. keep records of decisions and how you considered the PSED (for example, minutes of meetings)

    Consider keeping an equality impact assessment (EIA). There is no obligation to do so, but it can help show how the PSED has been considered. Also keep a record of how any decision-makers have considered the EIA and any supporting documents.
  8. publish the results of the assessment to support transparency
  9. train staff and make sure they understand their responsibilities
  10. continue to monitor the actual impact of the AI-related policy or service, reviewing and amending it as necessary

EHRC notes that the PSED is an ongoing duty.  The AI system, and its policies, should be monitored regularly.  To support the analysis, EHRC says public bodies may consider using: engagement from staff and service users; complaints from service users; national research; court judgments; and feedback from other organisations using similar technology.

Procuring AI

Government guidance exists on how public bodies should procure AI: for example, the Office for AI has published guidelines on AI procurement. It is clear from the EHRC's guidance that equalities and human rights issues, and compliance with the PSED, should be considered before and during the procurement of AI.

There are specific contractual arrangements that could be put in place, too.  From an equality perspective, EHRC 'recommend [making] it a contractual requirement for the third party to comply with the PSED and provide the information you might need to monitor how the AI is working once it is implemented.'

If you would like to discuss how you procure, develop and deploy AI, please contact Tom Whittaker or Martin Cook.