DWP’s ‘automation’ of universal credit discriminates against single mums, researchers say
Questions are being raised about the fairness of using automated tools to distribute benefits
by: Adele Walton
10 Jul 2024
Problems with the Department for Work and Pensions’ (DWP) digital universal credit (UC) system are disproportionately impacting working single mothers, raising questions about the fairness of using automated tools to distribute benefits.
Initially rolled out in 2013, UC is a digital-by-default benefits system that uses various automated processes, including machine learning, to determine people’s eligibility for welfare, calculate benefits payments and detect fraud.
Responding to the findings of a recent Freedom of Information (FOI) request – which revealed that nearly half of in-work UC claimants are single parents, the vast majority (just under 90%) of whom are women – researchers say they are concerned that automation within the UC system is entrenching discrimination and surveillance in people’s lives.
“A certain subsection of the population – those on low incomes and with disabilities – are being put under surveillance by systems that simply don’t affect the rest of the population,” explained Morgan Currie, a senior lecturer in data and society at the University of Edinburgh who submitted the FOI request.
She added that the three most common problems with universal credit’s automated processes – mistakes caused by flawed information about earnings, hardship as a result of delayed childcare reimbursement, and a mismatch between UC calculation dates and paydays – are therefore overwhelmingly affecting working single mothers.
A DWP spokesperson told the Big Issue: “We are committed to reviewing universal credit so it makes work pay and tackles poverty.”
Risks of discrimination against universal credit claimants
In the Department for Work and Pensions’ (DWP) annual report for 2022-2023, the UK’s Comptroller and Auditor General said that when using machine learning to prioritise cases for fraud review, “there is an inherent risk that the algorithms are biased towards selecting claims for review from certain vulnerable people or groups with protected characteristics.”
According to the researchers Big Issue spoke with, the FOI disclosure points to clear risks of discrimination within UC, given the way that algorithms rely on historical information to make predictions about future events, as well as the system’s track record of flawed decision-making.
“Existing biases can be encoded in automated decision making through the data that systems are trained on,” explained Anna Dent, head of research at Careful Trouble.
“If, say, a system to spot fraud is trained on data about historical fraud cases there is a risk that institutional biases will be baked in – if disabled people, people from certain ethnic backgrounds or single parents, for example, have been disproportionately targeted in the past, there will be a higher percentage of fraud cases involving these groups, which will then encode the same bias into any system trained on that data.”
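Dent’s point can be shown with a toy example. The sketch below (in Python, using entirely invented data and group labels, not any real DWP system or dataset) “trains” nothing more sophisticated than a per-group flag rate from a skewed set of historical reviews; any model fitted to the same labels would inherit the same skew.

```python
from collections import Counter

# Hypothetical historical review data: (group, was_flagged_as_fraud).
# Suppose past investigations disproportionately targeted group "B",
# so "B" claims were reviewed -- and therefore flagged -- far more often.
history = ([("A", False)] * 90 + [("A", True)] * 10
           + [("B", False)] * 60 + [("B", True)] * 40)

# "Training" here is just estimating a flag rate per group, but a
# machine learning classifier fitted to the same labels would learn
# the same association between group membership and "fraud".
totals = Counter(group for group, _ in history)
flags = Counter(group for group, flagged in history if flagged)
risk_prior = {group: flags[group] / totals[group] for group in totals}

print(risk_prior)  # {'A': 0.1, 'B': 0.4}

# A new claim from group "B" now starts with four times the risk score
# of an identical claim from group "A" -- purely an artefact of who was
# investigated in the past.
```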
On universal credit’s track record of flawed decision-making, the researchers cited numerous reports of the DWP’s UC algorithms failing to factor in how often people are paid – a design flaw that frequently leads to over- or underestimation of people’s earnings, which in turn fuels financial insecurity through unpredictable changes in the amount they receive in benefits from month to month.
In 2019, for example, four single mothers represented by Leigh Day and the Child Poverty Action Group won a High Court case against the DWP after the court found the women were struggling financially because the automated system meant there was no consistency in their monthly payments.
The court specifically noted that if a claimant was paid a day early due to a weekend or bank holiday, the system would detect them as being paid twice in a month and drastically reduce their UC payment – a problem the women’s lawyers said was likely to affect tens of thousands of people.
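The mechanics of that flaw are straightforward to reproduce. The following is a minimal sketch of a fixed monthly assessment window; the allowance and taper figures are invented for illustration and are not the DWP’s actual parameters. When an employer pays a day early, two paydays land in one window, earnings appear doubled, and the award collapses.

```python
from datetime import date

# Toy parameters -- illustrative assumptions, not the DWP's real figures.
STANDARD_ALLOWANCE = 1000.00  # hypothetical monthly entitlement (pounds)
TAPER_RATE = 0.55             # pounds withdrawn per pound earned

def earnings_in_window(paydays: list, salary: float,
                       start: date, end: date) -> float:
    """Sum every payment whose date falls inside the assessment window."""
    return sum(salary for payday in paydays if start <= payday <= end)

def uc_award(earnings: float) -> float:
    """Reduce the award by the taper for every pound seen in the window."""
    return max(0.0, STANDARD_ALLOWANCE - TAPER_RATE * earnings)

# Assessment window: 1-30 November. Usual payday: the 1st of each month.
start, end = date(2023, 11, 1), date(2023, 11, 30)
salary = 1200.00

# Normal month: one payday in the window -> award of 340.00.
normal = [date(2023, 11, 1), date(2023, 12, 1)]
print(uc_award(earnings_in_window(normal, salary, start, end)))

# December's payday falls on a weekend, so the employer pays on 30
# November instead: two paydays in one window -> award of 0.00, and the
# next window sees no earnings at all, producing exactly the month-to-month
# volatility the court described.
shifted = [date(2023, 11, 1), date(2023, 11, 30)]
print(uc_award(earnings_in_window(shifted, salary, start, end)))
```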
For those on lower incomes, this inconsistency in payments can mean the difference between being able to afford their living costs and falling into arrears or debt. Research from early 2024 found that for more than half of households claiming universal credit, payments varied by £400 or more from one month to the next at least once during 2022-23.
“[Because] single mums are disproportionately represented among working claimants (around 40%), the flaws with the payment algorithm are more likely to affect this group. What this means is that single working mums are especially likely to encounter income volatility and financial instability due to the means-testing calculation,” Dent said.
Human Rights Watch also reported in 2020 that the faulty design of universal credit’s automated processes led the system to over-predict the earnings of a single mother, resulting in her support being cut by £1,000.
Currie says the frequent cases of flawed automation in government decision-making raise questions about the effectiveness of these tools and who they are designed for.
“Universal credit has replaced one type of volatility with another. In our Automating Universal Credit study, some claimants encountering flaws and fluctuations in the calculation had to borrow from their employers and visit food banks to get by,” she said. “So even people following UC’s work requirements sometimes found themselves unable to cover household bills and living expenses.”
These risks of discrimination are mirrored in other countries, where historically marginalised communities are often similarly vulnerable to harm from algorithmic decision-making processes.
“Especially with the risk profiling for benefit fraud, a certain subsection of the population – those on low incomes and with disabilities – are being put under surveillance by systems that simply don’t affect the rest of the population,” said Dent.
A lack of transparency
Researchers said the issues around algorithmic discrimination are compounded by a lack of transparency, which makes it near-impossible to challenge wrongful or unfair decisions.
They noted, for example, that while various pieces of legislation – including the UK General Data Protection Regulation (GDPR), the Data Protection Act 2018, the Equality Act and the Human Rights Act – contain mechanisms for safeguarding people from algorithmic discrimination, it is hard to challenge instances of automated discrimination without knowledge of when and how the algorithms are being used.
While the automation of government decision-making is a growing trend, public information on how algorithms are being used in decision making is scarce.
In 2021, Big Brother Watch found that 540,000 benefits applicants were secretly being assigned fraud risk scores by councils’ algorithms before they could access housing benefit or council tax support.
While only eight algorithms are recorded and explained under the government’s Algorithmic Transparency Recording Standard, the Public Law Project’s (PLP) TAG Register database was set up in 2023 to show the public details of the secretive algorithms used by multiple government departments and local authorities.
As of October 2023, the TAG Register listed 55 automated tools used by the government to make or inform decisions.
However, according to PLP senior research fellow Caroline Selman, the database has been created through “resource intensive FOI requests, rather than proactive transparency on the part of the DWP”.
She added that “contesting decisions made or supported by AI or ADM systems is one of the most uncertain and untested areas of public law. The current opacity in how and where algorithmic decision making systems are being used is one of the biggest barriers to ensuring accountability in practice.”