Algorithmic Transparency Recording Standard

Created By: Kate Cooper
Last updated: 31 Jul 2023

Providing the public with trusted, accessible and effective information about the use of data-driven technologies can be difficult.
 
The Algorithmic Transparency Recording Standard (ATRS) establishes a standardised way for public sector organisations to proactively and transparently publish information about how and why they are using algorithmic approaches or tools in decision making. Visit the website or get in touch with CDEI to discuss how your local authority might use the ATRS (algorithmic.transparency@cdei.gov.uk). You can also see some example use cases on the website. 
 
Why?

Being transparent enables teams to:

•    Build the right foundations - Embedding the ATRS into your governance processes complements processes that are already in place, and builds the strong foundations you need to responsibly deliver better outcomes. It helps you ask the right ethical questions when building or procuring algorithmic tools and drives quality.
•    Improve public trust - Transparency builds trust and improves engagement with the public. The ATRS allows you to tell your own story about how you are using a tool, why, and how it fits in your decision-making processes. It increases accountability for public sector decisions.
•    Enable innovation - Transparency supports innovation, helping senior leaders engage with how their teams are using AI and facilitating the sharing of best practice across organisations. The ATRS helps teams do both of these things more consistently, and takes the burden off individuals to be transparent on their own.

Which tools does this apply to?

The Algorithmic Transparency Recording Standard is most relevant for algorithmic tools that either:

•       have a significant influence on a decision-making process with direct or indirect public effect, or
•       directly interact with the general public.

To decide whether your tool has a public effect, you might want to consider whether usage of the tool could:

•       materially affect individuals, organisations or groups
•       have a legal, economic, or similar impact on individuals, organisations or groups
•       affect procedural or substantive rights
•       impact eligibility for, receipt of, or denial of a programme

Examples of tools that could fall within the scope of these criteria are:

•       a machine learning algorithm providing members of the public with a score to help a government department determine their eligibility for benefits (impact on decision making and public effect)
•       a chatbot on a local authority’s website that responds to individual queries and directs members of the public to appropriate content on the website (direct interaction with the public)
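If your team keeps an inventory of algorithmic tools, the scope criteria above can also be captured as a simple script, for example to flag candidate tools for an ATRS record during governance reviews. The sketch below is purely illustrative and is not part of the Standard itself; the function and parameter names are assumptions chosen for readability, and the two worked examples mirror the benefits-scoring and chatbot cases above.

def has_public_effect(
    materially_affects: bool,
    legal_or_economic_impact: bool,
    affects_rights: bool,
    affects_programme_eligibility: bool,
) -> bool:
    # A tool has a 'public effect' if any of the considerations listed above apply.
    return any([
        materially_affects,
        legal_or_economic_impact,
        affects_rights,
        affects_programme_eligibility,
    ])

def atrs_likely_applies(
    influences_decision_making: bool,
    public_effect: bool,
    interacts_directly_with_public: bool,
) -> bool:
    # In scope if the tool has a significant influence on a decision-making process
    # with public effect, or if it directly interacts with the general public.
    return (influences_decision_making and public_effect) or interacts_directly_with_public

# Example 1: benefits-eligibility scoring algorithm (decision making with public effect)
benefits_scorer_in_scope = atrs_likely_applies(
    influences_decision_making=True,
    public_effect=has_public_effect(True, True, False, True),
    interacts_directly_with_public=False,
)

# Example 2: local authority website chatbot (direct interaction with the public)
chatbot_in_scope = atrs_likely_applies(
    influences_decision_making=False,
    public_effect=False,
    interacts_directly_with_public=True,
)

print(benefits_scorer_in_scope, chatbot_in_scope)  # True True

A flag from a check like this is only a prompt to consider publishing an ATRS record; the judgement about scope still rests with the team that owns the tool.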
 

Category: Data maturity, Data maturity » Data lifecycle, Data maturity » Governance and compliance, Good practice