LDW DataThinks: A careful approach to public sector automation


As part of LOTI’s LDW DataThinks, Anna Dent, freelance research and policy consultant and former Head of Research at Promising Trouble, sets out the key considerations for local authorities looking to integrate automated systems and tools into their operations.

Governments the world over are increasingly looking to automation to realise savings, improve efficiency or provide a more tailored user experience. This might be through back office systems, like speeding up document processing, or public-facing, like a chatbot to help people find information quickly. Local authorities are trialling a whole host of technologies to discover how and where automation is effective and appropriate. From collaborating to discover how best to use chatbots to automating the flow of planning data, there are great examples in development and in use. The drive to embrace automation is understandable, given squeezed budgets, increasing demand and a growing focus on prevention.

However, automation in all its forms, including AI (Artificial Intelligence), algorithms and predictive analytics, cannot be deployed without risk. Good outcomes are not guaranteed, and local authorities risk being swept up in hype and exaggerated claims from the private sector. Getting it wrong can be costly, not only financially, but also in harm to residents, reputational damage and even legal challenges.

Local authorities can take a careful approach to automation by taking the following considerations into account when designing, purchasing and deploying automated systems.

Effectiveness: what automation can actually achieve may not match the promises made by those selling automated systems and tools. Before committing to automating anything, understand whether the system can realistically deliver the results you want.

A predictive system used by one local authority between 2015 and 2019 was abandoned in part because it had not delivered the expected benefits: a council spokesperson was quoted as saying "the system wasn't able to provide sufficiently useful insights to justify further investment in the project". Designed to highlight at-risk families who might need additional or early intervention from social workers, the tool failed to live up to expectations, and citizens and local politicians raised concerns about families' privacy.

Privacy: automated systems rely on data to function. Particularly when they use personal data, system designers and commissioners must consider whether its use is valid and proportionate to achieve the desired outcomes. How residents are informed of data usage, and what rights they have to opt in or out, should always be made clear. In 2023 North Ayrshire Council was deemed likely to have breached data protection laws when using facial recognition in schools, partly because of the risks to individual rights and freedoms, as well as insufficient transparency about its use.

Bias: the risk of bias is one of the better-known problems with automated systems. It can be baked into systems through the use of historically biased data; for example, policing data may reflect existing racial biases. The A-Level algorithm used in 2020 to predict grades for students affected by the pandemic encoded bias: grades were based on predictions from previous years, meaning pupils in 2020 could not outperform previous pupils regardless of their actual attainment. The threat of legal action prompted the government to withdraw it, resulting in frustration and delays.

Control and transparency: buying in systems from private sector providers introduces additional challenges for local authorities. Because the private companies making and selling systems consider it too commercially sensitive to release details of how they work, local authorities can find it difficult to know whether the systems are actually functioning correctly. Before the widespread rollout of Universal Credit, many benefits teams were using a Risk Based Verification system to identify potentially fraudulent housing benefit applications. However, some stopped using the system because they were given little information about how the risk scores were arrived at, making the scores of limited use.

Trust: Ultimately, the effectiveness of automated systems relies on trust: trust that they will work as advertised and deliver better results, and trust between the public and the organisations using systems. Without trust, residents will resist their data being collected and used, and may avoid engaging with services they really need. Prioritising the building of trust as an essential part of any automated system can help to avoid problems before they occur. 

Local authorities can proactively pre-empt and manage risk through a series of guiding principles, including interrogating the reasons for automation; considering where automation offers the biggest potential gains for least risk; introducing red lines where automation is just too risky to be used; consistently evaluating the impact of automation on residents and services; committing to transparency; upskilling staff to understand more about automation; and involving residents in decisions about automated systems and their impacts. 

Read the full report, Automating Public Services: A careful approach, which explores and illustrates the risks public bodies need to take into account when considering automation. The report also sets out seven guiding principles for realising the benefits of automation while putting human needs at its heart.

London Data Week Responsible AI

Anna Dent
28 August 2024
