How comfortable would you feel if you were assigned a score from birth by a computer? It's a reality many of us are already experiencing without even knowing it. UK councils are increasingly turning towards systems that claim to be able to 'predict' social issues.
Missing a day in school, a visit to the hospital, a parent with a chronic illness – all these could lead to you, or your child, being flagged by a database as a potential future problem.
A report by the Data Justice Lab found that at least 53 councils are using the technology. But the real number could be much higher.
Proponents of predictive systems say that this enables councils to better serve their constituents, allowing them to deliver essential services before it's too late – thus saving time, money and even lives.
Isak Nti Asare is an Associate at Oxford Insights and has advised the UK's Open Data Institute.
“Cities have so much data about us as citizens – they know where we live, they know how much money we make, daily schedules…they can see a lot,” he said.
Asare believes that algorithms can improve the council's ability to deliver its services. But he is concerned about the protection of citizens' data, particularly when working with private companies.
“Anything we want to address about algorithms in general is going back to data and its use and its ownership,” he said.
Since the General Data Protection Regulation (GDPR) was introduced in 2018, more attention has been paid to the safeguarding and protection of people's data. Still, according to law firm DLA Piper, the UK has had one of the highest numbers of reported data breaches in the EU over the past year.
Another big question mark concerning the use of predictive algorithms in decision-making for Asare is responsibility.
“If Google creates an algorithm that Sunderland is using, and something goes wrong with it, who's responsible?” he asked.
Google's attempt to create an AI ethics board earlier this year, in part to answer such questions, was scrapped one week after launch as employees protested the appointment of a rightwing thinktank leader.
Detractors of predictive algorithms insist that the risks of bias, data protection, and transparency are too great to be ignored. Dr Lina Dencik from the Data Justice Lab sums up what she calls the 'datafied society.'
“There is this logic that you can use data and algorithmically-processed data for the purpose of decision-making which will impact people's ability to participate in society,” she explains.
“So whether someone can cross a border or not, the nature in which we might get hired or fired from our jobs, or even if we go to prison or not, is increasingly being informed by the kind of logic of algorithmic decision-making.”
Councils in the UK have faced a decade of austerity. As pressure mounts to 'do more with less,' some have been sold on the promise of systems that would allow them to make the most of the data they already hold, in order to predict social issues before they happen.
Sunderland was one such council.
In 2014, Sunderland Council entered into a contract with Palantir, a Silicon Valley tech company. Palantir is one of the most controversial – and successful – companies of the last decade.
The CIA-backed surveillance firm was founded by tech billionaire Peter Thiel in the wake of 9/11. Palantir sought to position its technology as the new solution to unravelling 'terrorist plots.'
Its interconnected databases of police departments, government agencies and private companies make Palantir particularly adept at identifying 'social networks', demonstrating links between various families and friends.
The company website claims that Palantir software can assist in uncovering human trafficking rings, finding exploited children, and unraveling complex financial crimes.
Recently, Palantir has been under fire for being one of the main collaborators of US Immigration and Customs Enforcement (ICE). In a report by immigration advocacy group Mijente entitled 'The War against Immigrants', Palantir was dubbed the 'tech backbone of ICE.' Its systems were found to be crucial to identifying family networks, enabling ICE's deportations and incarcerations of migrant parents and children.
Thiel, co-founder of PayPal, has contributed more than $1 million to Donald Trump's campaign, earning himself an office in Trump Tower.
Sources: Multiple news reports
The company has had an office in the UK since 2014. As of this year, it employs more people in London than at its US headquarters in Palo Alto.
A quick look on jobs website Glassdoor.co.uk reveals dozens of positions advertised for their London office. Software engineers, designers and legal specialists – Palantir is planning to expand its activities in Britain.
According to the Sunday Times, the Cabinet Office has paid Palantir £741,000 for “IT services” since 2015. However, this figure pales in comparison to how much the City of Sunderland has spent on Palantir: over £4.6 million.
In 2014, Sunderland Council contracted Palantir Technologies to help develop its 'Intelligence Hub'. At a meeting of the Health and Wellbeing Board in September, Councillor and Chair Mel Speding said:
“This [is] a new concept which [has] been developed against the background of workforce transformation and budget cuts.”
Under austerity, Sunderland has seen its spending power cut by a third, as more than £290m has been taken out of budgets.
“The local authority [has] lost a significant amount of knowledge in recent years and the Intelligence Hub [is] a tool to redress some of that.”
The meeting established that, due to Palantir's role as a 'global leader in data intelligence', the company was an ideal partner to help the council achieve its vision.
For the first time, documents obtained by this reporter give insight into how the Palantir systems used by Sunderland Council worked. They were found using advanced Google Search techniques. Alarmingly, although the names are clearly fake – Mary Jones, John Smith – the addresses in the documents are real, raising serious concerns from a data protection point of view.
“Citizens are positioned within this 'golden view' not as participants or co-creators, but primarily as (potential) risks, unable to engage with or challenge decisions that govern their lives.” – Data Justice Lab, 2019
The documents in the slideshow above show database entries for 'Mary Jones' and 'John Smith' as part of Sunderland and Palantir's 'Adult 360' programme. Adult 360 is described as “a project to bring together information about a person and their life from across a number of source systems including Social Care, CES, telecare, intermediate care, city hospitals and the police.”
Palantir systems coordinate different databases to form a complete view of an individual's life in order to deliver “better and more coordinated care.”
This is also called the 'golden view': “a metaphor to understand data systems as part of a desire to have both additional and more integrated information about populations, as well as more granular information about citizens that form the basis of prediction and can drive actions taken” (Data Justice Lab, 2019).
'Seeing' people through data raises serious ethical concerns, as it fundamentally entails taking a reductionist view of the individual from which to then make categories based on a series of characteristics.
These categories then form the basis of future policy, action or decisions made about that individual.
The second slideshow above shows the software used for Sunderland's 'Strengthening Families' programme. The purpose of this tool is to support 'risk-based stratification' against the Government's Troubled Families criteria.
'Troubled Families' are defined by the Government as “those that have problems and cause problems to the community around them, putting high costs on the public sector.”
The first page shows the case file of 'Family 103657' – note the Palantir logo in the left-hand corner. This family has not yet been assigned a 'watcher' – the term used on the system.
The sources for the data are listed at the bottom: Police and Youth Offending services, as well as private company Capita One – one of the largest data providers to local authorities in the UK.
Should the system user wish to 'Take Action', they could click a button with the logo of a gavel to do so – like a judge deciding the fate of a defendant.
On the next page, Palantir's 'social networking' skills are at play, creating a diagram of an individual and their close family – including nieces and nephews.
Below this, a list of incidents.
One entry: “Mam [mother] living with chronic long term condition throughout his life (evidence of not being able to take care of him.)”
“From 10 years old – issues with behaviours, missing episodes and non-school attendances.”
“We need to root this in questions of social and economic inequality to understand what's actually at stake,” says Dr Dencik from the Data Justice Lab.
"We've had a lot of discussion recently about the extent to which these systems have a tendency to discriminate. In part, because often they are based on skewed datasets, for example, that have been based in discriminatory practices that often get reinforced."
As in the Sunderland 'Strengthening Families' example – the data sources were taken from the Police and Youth Offending services. In other words, people who had already had encounters with these institutions were profiled as more 'at risk' – and therefore more likely to encounter those services again.
Campaign group Liberty raised serious concerns earlier this year after finding that 14 police forces were using predictive technology. In their report 'Policing by Machine', they write: “Predictive policing programs entrench pre-existing inequalities while being disguised as cost-effective innovation in a time of austerity – and they put our rights at risk.”
A council staff member working on the Troubled Families programme elsewhere in the country, who wished to remain anonymous, said:
“I don't see a particular issue with collection of data on communities. There are press reports about AI algorithms, chatbots etc. that have built-in bias against minorities and women, but this isn't something we use.”
“We do collect 'monitoring information' on ethnicity, sexuality, disability, armed forces etc., as do most companies, but this is used to ensure we are achieving good outcomes for all peoples equally.”
However, Dr. Dencik is skeptical: "We're not all equally implicated in this, we're not all equally positioned. We don't get monitored in the same way, for the same purposes, as people."
The Data Justice Lab, co-founded by Dr. Dencik and based at Cardiff University, examines the intricate relationship between data and social justice, and highlights the politics and impacts of data-driven processes and big data.
According to their report, at least 53 councils are now using services like the Palantir software to segment and score constituents according to their social group or risk profile – although the real number could be much higher.
Their research showed that not only was this happening across the country, it was often happening in the dark.
Companies like Palantir, Experian, Capita One, and Xantura – among many others – are selling these technologies to local councils with little or no oversight.
In an effort to reveal some of the goings-on, the Data Justice Lab sent Freedom of Information Requests to local authorities across the UK.
They received mixed results from using this method, highlighting some of the challenges of extracting information about data analytics usage from local authorities.
Many councils simply didn't respond (11.1%). 'Name Pushback' refers to cases where a local authority requested a 'real name' instead of 'Data Justice Lab' – this despite the Lab's attempt to pre-empt the issue by including supporting advice from the Information Commissioner's Office in the original request.
Most of the FOI requests did not yield results – over half the councils said that they either held no information on these systems or had no such systems. However, this merits clarification – as in many cases, a follow-up is required to determine if in fact there are no data systems in use or if there was an issue with the language used by the Data Justice Lab.
'Commercial Sensitivity' is an important factor, as in certain cases, the request was refused on the basis that the release of information would be against the commercial interests of a private company.
This is particularly troubling, as it is very common for councils to use software belonging to private companies for these purposes.
'Successful' refers to a broad range of responses – from detailed PDFs and charts to a single sentence naming the software used.
These examples begin to show how difficult it can be to gain information from local councils: a lack of clarity, council push-back, invoking the protection of trade-secrets. However, as a preliminary piece of research into the status of algorithmic decision-making in the UK, the Data Justice Lab provides an important starting point.
Sunderland Council ended its contract with Palantir in August 2019. After five years, and over £4.6 million, what has Sunderland gained?
An official statement from Sunderland City Council stated: “The council had a contract in place with Palantir for assistance in supplying hardware, software, associated licences and professional services to help develop this.”
“The partnership was successful with the aims originally set out fully achieved.”
Palantir was not mentioned in the response Sunderland sent to the Data Justice Lab's FOI request.
Whether you're a concerned citizen or journalist – if you're interested in finding out more about what's going on in your local council, here are some steps you can take.
Advanced Google Search:
Councils produce huge amounts of paperwork, much of which is online – just hard to find. Here are some tips:
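Search operators such as `site:` and `filetype:` let you restrict results to a particular council's domain and to document formats like PDF. The queries below are purely illustrative – swap in your own council's domain and the keywords you're interested in:

```
site:sunderland.gov.uk filetype:pdf "Intelligence Hub"
site:.gov.uk filetype:pdf "Troubled Families" "risk stratification"
site:sunderland.gov.uk "minutes" "Palantir"
```

Committee minutes, cabinet reports, and procurement documents often surface this way even when they are not linked from the council's main pages.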
The Data Justice Lab have created a searchable database of documents from different sources – try looking up your local council.
Try submitting your own Freedom of Information request to your council using What Do They Know? The website makes it easier to make a request, with helpful tips and information. Take inspiration from the Data Justice Lab's requests and try making one of your own. Local authorities need to know that people are demanding accountability.
Finally, follow the CouncilWatch Twitter bot @councildata. Your daily reminder of the councils that are using these systems, as well as news and information about the topic.