The Centre for Data Ethics and Innovation has published its Review into bias in algorithmic decision-making: how to use algorithms to promote fairness, not undermine it.
As the report states, "data gives us a powerful weapon to see where bias is occurring and measure whether our efforts to combat it are effective; if an organisation has hard data about differences in how it treats people, it can build insight into what is driving those differences, and seek to address them.
"However, data can also make things worse. New forms of decision-making have surfaced numerous examples where algorithms have entrenched or amplified historic biases; or even created new forms of bias or unfairness. Active steps to anticipate risks and measure outcomes are required to avoid this."
The report focuses on four sectors: policing, social services, finance and recruitment. At 151 pages, the report is detailed and makes recommendations to government, regulators and industry, both specific to those four sectors and cutting across sectors.
What good governance looks like for each sector will depend on the legislation and regulation applicable to, and the nature of, each sector. There will be some common threads, such as the Equality Act, the Data Protection Act and the Human Rights Act, but there are likely to be sector-specific differences, and the report recommends that government and regulators do more to help organisations interpret that legislation. There is no one-size-fits-all approach to good governance.
In any event, the report provides some overarching guidance for those responsible for the governance of organisations deploying or using algorithmic decision-making tools to support significant decisions about individuals. They should ensure that leaders are in place with accountability for:
• Understanding the capabilities and limits of those tools.
• Considering carefully whether individuals will be fairly treated by the decision-making process that the tool forms part of.
• Making a conscious decision on appropriate levels of human involvement in the decision-making process.
• Putting structures in place to gather data and monitor outcomes for fairness (an illustrative sketch of this kind of monitoring follows below).
• Understanding their legal obligations and having carried out appropriate impact assessments.
The points above apply especially in the public sector, where citizens often do not have a choice about whether to use a service and where decisions made about individuals can often be life-affecting.
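The report does not prescribe any particular monitoring technique. Purely as an illustration, the sketch below shows one simple way an organisation might monitor outcomes for fairness: log decisions alongside a group label and periodically compare positive-outcome rates across groups. The group labels, the decision log and the idea of treating a large disparity as a prompt for investigation are assumptions made for this example, not recommendations from the report.

```python
from collections import defaultdict

def selection_rates(decisions):
    """Rate of positive outcomes per group.

    `decisions` is an iterable of (group, outcome) pairs, where outcome
    is True for a positive decision (e.g. shortlisted, loan approved).
    """
    totals, positives = defaultdict(int), defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        if outcome:
            positives[group] += 1
    return {g: positives[g] / totals[g] for g in totals}

def disparity_ratio(rates):
    """Lowest group rate divided by the highest group rate.

    A ratio well below 1.0 is a prompt for further investigation,
    not in itself proof of unlawful discrimination.
    """
    return min(rates.values()) / max(rates.values())

# Hypothetical decision log: (group label, positive outcome?)
log = [("Group A", True), ("Group A", True), ("Group A", False), ("Group A", True),
       ("Group B", True), ("Group B", False), ("Group B", False), ("Group B", False)]

rates = selection_rates(log)
print(rates)                   # {'Group A': 0.75, 'Group B': 0.25}
print(disparity_ratio(rates))  # 0.333... - a gap worth investigating
```

In practice, a quantitative check like this would sit alongside qualitative review, since (as the report notes) bias can arise anywhere in the overall decision-making process, not only in the algorithm itself.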
The report emphasises that good, anticipatory governance is crucial. "Organisations need to make sure that the right capabilities and structures are in place to ensure that this happens both before algorithms are introduced into decision-making processes, and through their life. Doing this well requires understanding of, and empathy for, the expectations of those who are affected by decisions, which can often only be achieved through the right engagement with groups. Given the complexity of this area, we expect to see a growing role for expert professional services supporting organisations."
In practical terms, this means organisations may want to use impact assessments for algorithmic decision-making, much as they already do for data protection. The aim is to identify risks in advance so that they can be minimised and managed (such assessments were also discussed as practical solutions in response to the UK's National Data Strategy consultation and have been called for as part of a new Accountability for Algorithms Act). Impact assessments cannot be allowed to become box-ticking exercises. "Assessments must not only consider the detail of how an algorithm is implemented, but whether it is appropriate at all in the circumstances, and how and where it interacts with human decision-makers."
However, the report reminds us that the risk of bias in algorithmic decision-making is just one issue that needs to be considered as part of good governance: "The issue is not simply whether an algorithm is biased, but whether the overall decision-making processes are biased. Looking at algorithms in isolation cannot fully address this."
What is clear is that, given the pace of change and the wide range of potential impacts, governance in this space must be anticipatory.