The Digital Regulation Cooperation Forum (DRCF) – a working group of the FCA, ICO, CMA and Ofcom – has published a working paper setting out its plans and priorities for the next year. The DRCF has also published a paper on the current and future landscape of algorithm auditing and the role of regulators.

Here we provide an overview of the increasing use of algorithm audits and the growing engagement from regulators to promote effective algorithm auditing.

DRCF sets its sights on algorithmic transparency

Algorithm auditing is clearly in the sights of the DRCF.

The DRCF was set up in July 2020 and consists of the Competition and Markets Authority (‘CMA’), the Information Commissioner’s Office (‘ICO’), the Office of Communications (‘Ofcom’) and the Financial Conduct Authority (‘FCA’). Together, they aim to create and maintain a consistent approach to the regulation of issues such as privacy, competition and data protection that arise within a rapidly developing digital landscape.

The Forum has three goals:

  • to promote greater coherence, so that where regulatory regimes intersect the DRCF helps to resolve potential tensions, offering clarity for people and industry;
  • to work collaboratively on areas of common interest and jointly address complex problems; and
  • to work together to build the necessary capabilities, learning from what each regulator is doing and striving to be best in class, both now and in the future.

DRCF's collaboration goal includes:

  • supporting improvements in algorithmic transparency; and
  • enabling innovation in the industries regulated by those that make up DRCF.

In order to effectively assess algorithmic systems and support their appropriate development and deployment by businesses, the DRCF will:

  • improve capabilities for algorithmic auditing by sharing knowledge on different auditing techniques, as well as understanding and testing how to use digital solutions to monitor algorithmic processing systems and identify harms;
  • research the third-party algorithmic auditing market and assess where regulators can play the most valuable role in how this emerging market develops; and
  • promote transparency in algorithmic procurement, supporting vendors and procurers through a publication on best practices, harmful behaviours, and each regulator’s role.

The DRCF's plans show that there is more to come for algorithm auditing. Whether and to what extent regulators in the UK converge, or diverge, in their approaches remains to be seen. But the DRCF's plans suggest a great deal of convergence in how regulators understand and seek to improve algorithm audits.

Algorithm audits - the present and the future

The DRCF has in parallel published a discussion paper on the current and future landscape of algorithm auditing and the role of regulators.

By way of background, the use of algorithms can involve a lack of transparency and a disparity of understanding between the creators and users of the systems, due to unclear governance and the often complex inner workings of decision-making systems. It increasingly looks like a method of ‘audit’ or ‘assurance’ is needed in order to review algorithmic systems against their intended purpose(s), ‘close the information gap’, build trust and ensure that no harm is being caused as a result.

We have written about the role of AI auditing and assurance as part of the UK National AI Strategy and as part of innovation in AI. One of the three ‘pillars’ of the UK National AI Strategy is the development of pro-innovation governance and regulation of AI, aimed at supporting scientists, researchers and entrepreneurs to utilise the benefits of AI, while also protecting consumers and citizens from potential harms.  To this end, additional regulatory oversight of AI systems is expected, following a consultation on the governance of digital technologies published in March 2022.

Different types of algorithm audits

The DRCF paper identifies three distinct types of algorithm audits, the applicability of each being dependent on the context: 

  • A governance audit involves looking at how the AI is used, by whom, where and when;
  • an empirical audit means exploring what the AI does by considering its inputs and/or outputs; and
  • a technical audit refers to the mechanics of the AI – looking “under the bonnet” of an algorithm to see how it works.
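By way of illustration, an empirical audit can often be run without any access to a system's internals. The following Python sketch is purely hypothetical (the model, data, groups and four-fifths threshold are our illustrative assumptions, not drawn from the DRCF paper): it treats a scoring system as a black box and compares its approval rates across two groups.

```python
# Hypothetical sketch of an "empirical" (black-box) audit: the model is
# treated as an opaque function and only its outputs are examined.

def opaque_model(applicant):
    # Stand-in for a deployed scoring system whose internals we cannot see.
    return applicant["income"] >= 30_000

# Illustrative audit dataset (entirely made up).
applicants = [
    {"group": "A", "income": 45_000},
    {"group": "A", "income": 32_000},
    {"group": "A", "income": 28_000},
    {"group": "B", "income": 31_000},
    {"group": "B", "income": 22_000},
    {"group": "B", "income": 25_000},
]

def approval_rate(group):
    # Share of applicants in the group that the model approves.
    members = [a for a in applicants if a["group"] == group]
    return sum(opaque_model(a) for a in members) / len(members)

rate_a, rate_b = approval_rate("A"), approval_rate("B")
ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
print(f"A: {rate_a:.2f}, B: {rate_b:.2f}, ratio: {ratio:.2f}")

# A low ratio (e.g. below the widely cited "four-fifths" benchmark of 0.8)
# would not prove unlawfulness, but would flag the system for a deeper
# technical or governance audit.
flagged = ratio < 0.8
```

A technical audit, by contrast, would open up `opaque_model` itself, and a governance audit would ask who deploys it, for what purpose and with what oversight.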

The DRCF has also identified the creation of standards as an alternative method of audit, i.e. establishing benchmarks against which systems can be measured, which may then lead to a certification system. For example, the UK Cabinet Office’s Central Digital and Data Office published a draft algorithmic transparency standard for collecting information about how government uses algorithmic tools, which we wrote about last year.

Algorithm audits - where are we now?

By speaking to a range of stakeholders in academia, industry, the public sector and civil society, the DRCF identifies key issues within the current landscape:

  • Lack of effective governance in the ecosystem: there is a lack of clarity about the standards that auditors should be auditing against, and existing standards are often vague and based on general principles.
  • Insufficient access to systems: a significant barrier identified specifically by academics and researchers is the lack of certainty on the limits of using publicly accessible data, or data held by technology companies, for investigative purposes. The position on legal protection for the effects of investigative methods is also unclear. For example, those looking into the inner workings of algorithms have come up against threats of legal action, or had their accounts disabled, on the basis of ‘web scraping’ or creating fake users of systems.
  • Insufficient avenues to seek redress: there is a lack of clear mechanisms for the public or civil society to challenge outputs or decisions made using algorithms.
  • Inconsistency of current audits: the methods used and interpretation of the results of audits can vary widely and may need sector-specific guidance or standards to ensure good outcomes. There may be a need for bespoke approaches to auditing algorithmic systems depending on the context in which they are deployed.

Audits can also carry high financial costs, which larger organisations may be better placed to absorb than smaller ones.

Algorithm audits - what does the future hold?

Changing role of auditors

The DRCF recognises the importance of regulators in the future audit landscape to ensure that the application of algorithmic processing systems is trustworthy and legally compliant. Possible roles identified for regulators include:

  • stating when audits should happen, so that parties are more likely to comply with the law;
  • establishing standards and best practices on how audits may encourage legal compliance;
  • ensuring action is taken to address harms identified in an audit, where appropriate; and
  • identifying and tackling misleading claims and practices.

Regulation or self-governance?

The DRCF also recognises self-governance as one possible approach, with both pros and cons:

  • industry bodies and organisations can share knowledge and expertise and work together with academia and industry to develop technical standards and tools to support algorithm audits.
  • however, despite benefits to industry, such as lower regulatory costs and more flexibility as to how audits are done, progress may be slow depending on how effectively those stakeholders work with each other.

A middle ground?

Another approach identified by the DRCF is a ‘halfway’ system between a regulator- and industry-led approach.

  • Regulators can work with standards bodies to develop specific criteria, guidance and standards against which algorithms can be audited. A system of regulator-backed certification (rather than self-certification) would provide consumers with more information when making decisions between products, and may allow for consumer-led challenges to an organisation’s certification.
  • Regulators could have a role in creating or supporting the standards by which an audit is conducted, including minimum transparency requirements.
  • There may be a role for regulators to require organisations using algorithmic systems to publish a ‘notice of use’ in order to make consumers and affected parties aware of their use of algorithmic decision-making.
  • Regulators could create a confidential database through which auditors share the results of audits with regulators. This may help to build an evidence base for enforcement investigations.
  • Regulators could encourage and incentivise auditing providers to seek out new issues and harms, rather than focussing solely on existing harms.

One of the next steps for the DRCF is a call for input to help it test ideas and identify further areas of common interest for the future.

The DRCF suggests the following six roles that regulators may play in the future and invites feedback on these options:

  1. to clarify how external audit could support the regulatory process, for example, as a means for those developing and deploying algorithms to demonstrate compliance with regulation, under conditions approved by the regulator.
  2. to produce guidance on how third parties should conduct audits and how they should communicate their results to demonstrate compliance with the regulators’ respective regimes.
  3. to assist standards-setting authorities to convert regulatory requirements into testable criteria for audit.
  4. to provide mechanisms through which internal and external auditors, the public and civil society bodies can securely share information with regulators to create an evidence base for emerging harms. Such mechanisms could include a confidential database for voluntary information sharing with regulators.
  5. to accredit organisations to carry out audits, and in some cases these organisations may certify that systems are being used in an appropriate way (for example, through a bias audit) in order to demonstrate compliance with the law to a regulator.
  6. to expand the use of regulatory sandboxes (where a regulator has power to do so) to test algorithmic systems in a controlled environment.

The DRCF will be accepting responses to the questions until Wednesday 8th June 2022, after which a summary of responses will be published.

If you would like to discuss the potential of AI auditing and assurance, please contact Tom Whittaker or David Varney.

This article was written by Marija Nonkovic and Tom Whittaker.