Britain’s data watchdog has a warning for big tech and AI companies: ‘We’re watching you’
The Information Commissioner’s Office is turning 40. The head of the UK’s data watchdog tells the Big Issue how it is keeping pace with big tech and the artificial intelligence race
Britain’s data watchdog has warned tech companies not to breach data protections. Credit: Canva
Artificial intelligence poses a serious risk to privacy rights, the UK’s data watchdog has warned, telling tech companies that those who fail to protect users will face the full force of the law.
When the Information Commissioner’s Office (ICO) was established in 1984, it was intended to provide Brits with access to their personal data held by the government.
In the four decades since it was set up, the watchdog’s role has expanded “massively”, says Information Commissioner John Edwards. With the dawn of social media, the vast majority of people have started voluntarily contributing data that fuels the business models of large companies. Artificial intelligence is trained on these datasets, risking leaks of sensitive information.
“The explosion of technology over the past four decades has changed and challenged our relationship with what is private and what isn’t in ways we could never have imagined,” Edwards told Big Issue. “But as tech has evolved, so too have the protections put in place by the law.”
A digital exhibition – launched today to celebrate the ICO’s 40th anniversary – tells the story of how these rights and protections have evolved over the past four decades.
What is the Information Commissioner’s Office?
Most of us have a pretty relaxed relationship with online privacy. You probably don’t read the terms and conditions, for example.
In fact, you couldn’t – even if you wanted to. It would take the average internet user 76 eight-hour work days to read all of the privacy policies they come across in just one year. So you blithely hit ‘accept’, hoping nothing horrifying is hidden in the fine print.
In the 2023 Black Mirror episode Joan Is Awful, the protagonist does just that – and discovers that she has unwittingly given a streaming platform consent to broadcast an AI-generated version of her daily life around the world.
It’s the stuff of 21st century nightmares. But you don’t have to throw your laptop in a lake, said Edwards.
“We wouldn’t accept that,” he told Big Issue. “That’s not a legitimate thing to bury in a multi-page terms and conditions.
“The ICO is here so you don’t need to know everything about artificial intelligence. We are acting as your agent, so that when you click ‘accept’, you can do so with confidence that there’s an organisation that’s got your back. And that’s us.”
The ICO was originally founded to oversee the Data Protection Act, which gives you the right to access your personal information held by others.
Initially, this mostly applied to data held by the government. In the 1980s, Edwards says, there was a lot of concern about “governments spying on people”. And indeed, information rights have been at the heart of several British political scandals.
In 2007, for example, HMRC lost two CDs containing the personal information of all UK families claiming child benefit – a breach impacting 25 million people. And in 2022, the ICO slammed the police for “excessive intrusion into irrelevant and deeply personal data of rape complainants” – methods that amounted to a “digital strip search”.
But the ICO’s role has also expanded to cover data breaches by private entities.
In 2011, it emerged that the News of the World newspaper had hacked the phones of celebrities, politicians and members of the public. The ICO played a key role in the subsequent inquiry and created a journalism code of practice.
The role of the regulator is constantly evolving. Tech companies are now the most significant players in the privacy landscape.
“When you carry a cell phone, you’re carrying a tracking device, you’re carrying a publishing device, you’re carrying all your health records, all your financial records, all your work records,” Edwards said.
“[They use the personal data] to sell us stuff. The more data they have, the better they can target us… not just sell us products, but sell us ideas.”
This became clear after the 2017 Cambridge Analytica scandal, which found that political parties were using detailed pictures of our online lives to target small groups of voters with specific adverts. An ICO investigation into the scandal was the largest of its kind.
With its ability to rapidly analyse vast amounts of personal data, artificial intelligence is further complicating the “threat landscape.”
“With the AI products, there’s a real race for territory there,” said Edwards. “That means that corners can be cut, and that’s where I think the ICO has a real role to step in and say, ‘Hang on a minute’. We recognise that you’re in a hurry, but you’ve still got to understand the information flows, you’ve got to understand the potential risks.”
Advanced algorithms can piece together seemingly unrelated data points to create detailed profiles of individuals, often without their knowledge or consent. Machine learning models can infer sensitive information like health conditions or political preferences from online behaviour.
Artificial intelligence can also behave in ways that we don’t expect, making it hard to regulate. Edwards points to one example where an AI bot – created to “crawl the internet and report back on certain websites” – found that it was being defeated in its task by CAPTCHAs (a type of security measure that tells people and computers apart).
“So the AI posted an ad on a recruitment site asking for someone to work for it to read CAPTCHAs,” Edwards says. “Somebody applied for this job and asked, ‘Why do you need someone to read CAPTCHAs?’ And the AI responded: ‘Because I can’t see.’
“There’s a lot more that we don’t understand about what the risks and threats are than we do.”
Other threats – like ransomware attacks – are also evolving. Compared to the kind of deliberate attacks taking place now, the 2007 loss of millions of people’s data was like a “tree falling in the forest”. It was unintended, and no one saw it or heard it.
“But what we are seeing now are quite sophisticated attacks into the system which exfiltrate large volumes of data and lock it up,” Edwards said. “For example, you’re seeing national health trusts succumbing to ransomware attacks where they cannot access their data, and that has a really significant impact for people.”
This kind of threat – along with the multitude of long, confusing privacy agreements you’re required to sign just to access a website – can make you feel extremely vulnerable.
Nonetheless, Edwards is clear: you have legal protections against data misuse – and the ICO will work to enforce them.
“We hear the phrase ‘wild west’ a bit. But I think the people who call it a wild west are the ones whose interests it would serve for it to be one,” he said. “It is not a wild west. It is a highly regulated environment… there are rules, there are guardrails, and [companies] have got to comply with them – and we will be keeping an eye on them.”