
Big Brother IS watching you – if you’re poor

FIVE Welsh councils are using controversial computer models to gather data on benefits claimants, The Herald can reveal.

Cardiff, Conwy, Denbighshire, Powys and Gwynedd county councils disclosed their use of computer models to predict benefits claimants' likely behaviour in response to Freedom of Information Act requests filed by campaign group Big Brother Watch.

Two of the councils, Powys and Gwynedd, refused to say who provided the software they use. Powys also refused to disclose how much it pays for the software.

Cardiff City Council pays £37,000 a year to use Risk-Based Verification.

Across the UK, 540,000 benefits applicants are secretly assigned fraud risk scores by councils’ algorithms BEFORE they can access housing benefit or council tax support.

Personal data from 1.6 million people living in social housing is processed by commercial algorithms to predict rent non-payers.

More than 250,000 people's data is processed by a range of secretive automated tools to predict the likelihood that they will be abused, become homeless or fall out of work.

Campaign group Big Brother Watch has filed a complaint with the UK’s data watchdog about local authorities’ use of clients’ data to profile them.

The organisation claims that most of the algorithms it uncovered are “secretive, unevidenced, incredibly invasive and likely discriminatory”.

“AUTOMATED SUSPICION”

Powys County Council: Won’t say what system they use or how much it costs

The privacy campaign group’s long-term investigation has found councils across the UK conducting “mass profiling” and “citizen scoring” of welfare and social care recipients to predict fraud, rent non-payments and major life events.

The campaigners complain that councils are using “tools of automated suspicion” without residents’ knowledge and that the risk scoring algorithms could be “disadvantaging and discriminating against Britain’s poor”.

An algorithm by tech company Xantura, used by two London councils, claimed to predict residents’ risks of negative impacts arising from the coronavirus pandemic and even whether they were likely to break self-isolation rules.

The ‘Covid OneView’ system is built on thousands of pieces of data held by councils, including seemingly irrelevant personal information such as details on people’s sex lives, anger management issues or if they possess a dangerous dog.

“UNEVIDENCED”

Claimants’ data collected and monitored by Councils

Algorithms that assign fraud risk scores to benefits claimants, used by more than 50 councils, are set targets by the Department for Work and Pensions to classify 25% of claims as medium risk and 20% as high risk.

However, in documents obtained by Big Brother Watch, some councils found the "risk-based verification" algorithm was "yet to realise most of its predicted financial efficiencies", and approximately 30 authorities have dropped the tool in the past four years.

One woman in receipt of housing benefit, who sent a formal request for her data to Brent council, said she was “stunned” to find she had been flagged as “medium risk” of fraud. The woman, who wished to remain anonymous, said:

“I wasn’t aware that my council was risk-scoring me and it is disgusting they use so much of my personal data in the model, something I had no idea about.

“I’ve noticed the amount of evidence I’ve been asked for has changed over the years, which makes it really stressful. I’ve been made to go through all my bank statements line by line with an assessor, which made me feel like a criminal. Now I wonder if it’s because a machine decided, for reasons unknown, I could be a fraudster.

“It feels very unjust for people like me in genuine need, to know I’m being scrutinised and not believed over evidence I provide.”

“DISCRIMINATORY”

Big Brother Watch’s report details how the London Borough of Hillingdon’s ‘Project AXIS’, aimed at assessing children’s risk of future criminality, gathers data from police, schools, social care, missing persons, care homes, and even social media, without residents’ knowledge.

The council claims “no piece of information is too small” for the database.

Campaigners warn of similarities to the Metropolitan Police's controversial Gangs Matrix database, which the Information Commissioner found was operating unlawfully by holding data on people unconnected to gang activity and disproportionately profiling young black men.

Big Brother Watch’s long-term investigation involved over 2,000 Freedom of Information requests, covering over 400 local authorities in the UK.

“SECRETIVE SYSTEMS”

Secretive systems: Used behind closed doors

The campaign group says this information should be publicly available, and “secretive systems of digital suspicion should not be used behind closed doors”.

With private companies contracted to supply many public sector algorithms, little is known about how most of these so-called 'black box' algorithms work.

Commercial confidentiality can also mean that individuals rarely know how automated systems could be influencing decisions about their lives.

CALLS FOR TRANSPARENCY

The group is calling for a public register of algorithms that inform decision-making in the public sector, and for authorities to conduct privacy and equality assessments before using predictive tools to mitigate the risks of discrimination.

Such assessments, the group found, were rarely conducted.

Big Brother Watch is encouraging people in receipt of welfare or social care to send Data Subject Access Requests to their council to request their risk scores and has published draft request forms.

The campaigners have also lodged a complaint with the Information Commissioner, calling for an “urgent inquiry to uncover and regulate the Wild West of algorithms impacting our country’s poorest people.”

INVASION OF PRIVACY

Jake Hurfurt, Head of Research and Investigations at Big Brother Watch, said: "Our welfare data investigation has uncovered councils using hidden algorithms that are secretive, unevidenced, incredibly invasive and likely discriminatory.

“The scale of the profiling, mass data gathering and digital surveillance that millions of people are unwittingly subjected to is truly shocking. We are deeply concerned that these risk scoring algorithms could be disadvantaging and discriminating against Britain’s poor.

“Unless authorities are transparent and better respect privacy and equality rights, their adoption of these technologies is on a one-way ticket to constant surveillance of the poor.”

Lord Clement-Jones, Chair of the Select Committee on Artificial Intelligence, said: "The evidence presented by Big Brother Watch in this report of the widespread secretive collection by public authorities of the personal data of the most vulnerable in society and its use in opaque algorithmic decision-making systems is deeply alarming.

“It increasingly adds up to a surveillance state where data protection and equality rights are ignored.”

Sara Willcocks, Head of External Affairs at the charity Turn2Us, which helps people living in poverty in the UK, said: “This new research by Big Brother Watch has highlighted an incredibly concerning trend where those of us on low incomes are treated with suspicion, bias and discrimination.

“A decade of cuts, caps and freezes to our social security system has left cash-strapped councils relying on outsourced algorithms.

“We urge both the DWP and local authorities to review the findings of this report and ask themselves whether this is an ethical or even practical way to go about their work and to develop a fairer and more compassionate approach to this moving forward.”
