Universal Credit: How algorithms pose a threat to accessing benefits
They’re hiding in plain sight and could be making the process of claiming benefits inherently unfair. Lee Coppack and Joanna Bates are getting claimants the protection they deserve
by: Lee Coppack and Joanna Bates
14 May 2021
Algorithms could make the process of claiming Universal Credit inherently unfair. Image credit: Geralt / Pixabay
During the Covid-19 pandemic, we clapped for our heroes, our frontline workers and hoped for a better, more empathetic future for everyone.
We witnessed drastic changes in our social interactions. We could not shake hands with anyone or share our space with others, and, silently, another revolution was fomenting change, with algorithms at its forefront.
It was all too natural to turn to machines when human contact was a potential risk to our lives. Telemedicine, for example, cut out the need for face-to-face consultations. With staff furloughed, working at home or laid off, companies accelerated their plans for automation and artificial intelligence.
As advances in technology disrupt more sectors, it is clear that many jobs will disappear. There will be more new jobs – but they will be different, as the World Economic Forum states: “They will demand the skills needed for this Fourth Industrial Revolution and there is likely to be less direct employment and more contract and short-term work.”
Low-waged and vulnerable people will be hardest hit. As the WEF said in its Future of Jobs Report last year: “In the absence of proactive efforts, inequality is likely to be exacerbated by the dual impact of technology and the pandemic recession.”
Many will need to retrain in radically different work environments. Given the speed of these changes, which we are already seeing, we must provide time and resources for those who need them most.
This brings us to Universal Credit, which should provide support during this time of transition, but clearly fails to do so in many cases. One reason for these failures is poorly designed algorithms, the coded instructions that tell a computer program what sequence of steps to follow to make a decision.
Algorithms are hiding in plain sight in many instances, not just in Universal Credit. Many are valuable and work smoothly, but they are only as good as the minds creating them. Currently, there is no regulatory framework that sets standards for algorithms or for whether the data they use are fair and representative.
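To make the idea concrete, here is a deliberately simplified sketch of how such a coded rule might work. It is not the actual Universal Credit calculation: the allowance, taper rate and figures below are hypothetical. The point is that a rigid rule, applied automatically to a fixed monthly assessment period, can produce a harsh result through no fault of the claimant, for example when two paydays happen to fall inside a single period.

```python
# Hypothetical, simplified illustration only - not the real DWP calculation.
# It shows how a rigid coded rule can misread a claimant's circumstances.

STANDARD_ALLOWANCE = 400.00  # hypothetical monthly standard allowance, in pounds
TAPER_RATE = 0.55            # hypothetical taper: the award falls by 55p per £1 earned

def monthly_award(earnings_in_period: float) -> float:
    """Return the award for one assessment period, tapered by reported earnings."""
    deduction = earnings_in_period * TAPER_RATE
    return max(0.0, STANDARD_ALLOWANCE - deduction)

# A claimant paid £700 every four weeks will, in some months, have two paydays
# fall inside one monthly assessment period - the rule cannot tell the difference.
print(monthly_award(700.0))   # a typical month: award of £15.00
print(monthly_award(1400.0))  # a "double payday" month: award of £0.00
print(monthly_award(0.0))     # the month after: treated as if there were no earnings (£400.00)

# The claimant's real income is steady, yet the automated award swings between £0 and £400.
```

Even this toy example shows why transparency matters: the unfairness lies not in any single line of code but in the assumptions behind it, here the fixed assessment period, which claimants never get to see or challenge.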
These are not just issues for Universal Credit and public benefits, but for fairness in the new world of work. Organisations are looking to use algorithm-based systems as part of hiring and promotion. If we don’t take more responsibility for these protocols, we will fail to protect people. Without transparency and accountability at all levels of the process, from conception to implementation and beyond, we run a high risk that they will cause harm.
The Just Algorithms Action Group (JAAG), like unions and advocacy groups, is working to counter the injustice and lack of consideration shown by many modern computer-based systems in real-world use, and the malign impact they can have on human beings.
JAAG wants to investigate further to identify specific harms caused by the government's flagship automated system, Universal Credit. We are looking for people who have suffered from the inadequacies and failures of the Universal Credit system to help build a legal case for change.
If you think you may have been affected by an unfair UC decision, contact Jo at UC@jaag.info or call 07305 159700. jaag.org.uk