*Names have been changed
In March last year, a 14-year-old boy was the victim of a serious assault outside his school in east Bristol. After being punched in the face by a man police described as in his thirties, the teenager suffered a seizure and was rushed to hospital.
There were witnesses and CCTV footage in the area, so the boy’s mother, Naila*, assumed his attacker would be caught. But the criminal investigation into the incident has since been dropped, and Naila says officers treated her son like the perpetrator, and not the victim, of a crime.
“He was referred to social services by the police who said he was at risk of exploitation and suspected he was being initiated into a gang,” she tells the Cable. “Where has this information come from, that they decided to not search for the suspect but turn the spotlight on our family?”
Anxious to find out how officers reached their conclusions, Naila asked the city council what information it held on her son by submitting Subject Access Requests (SARs), which by law require public institutions to disclose the personal data they hold on an individual.
But getting answers, Naila has since learned, is hard. Her requests marked the start of an ongoing journey into the complex world of data and algorithms used by police, councils, and schools to monitor young people and their families in Bristol – and the Cable has joined her.
Protecting, or profiling?
For more than a decade, Bristol City Council and Avon and Somerset Police have been collaborating on mass data gathering programmes used to profile tens of thousands of people in the city – including 55,000 families on the controversial Think Family database and app.
Most of those who appear on the database don’t even know they’re on it, while campaigners and legal groups have raised concerns about the potential risk of rights violations.
The Think Family database, launched in 2015, combines around 35 datasets from the council, police and other agencies in order to present all information held on individuals in one place. This information is used by schools and social workers through the Think Family Education App (TFE), launched in 2023.
Profiling Power
The Cable is working with investigative and data journalists at Lighthouse Reports, Liberty Investigates and Wired Magazine to bring more information about Think Family to light.
But to find answers we need the support of more people like Naila, who are pushing the city council and police to reveal what data they hold on their children, and of those with experience of the programme who can share insights.
If you’re a parent worried about this issue, or if you’re a teacher or safeguarding lead at a Bristol school with experience of using or knowledge of the Think Family database or app, we want to hear from you.
Contact the Cable’s investigations lead Sean at sean@thebristolcable.org. You can find our tip-off guide here, and our template for Subject Access Requests (SARs) here.
The app combines data to generate risk scores based on things like whether a child or young person has been excluded from school, if they’ve received support for their mental health, if they’re on free school meals, or if they live in an area of social deprivation.
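How exactly these factors are weighted and combined has not been made public. Purely as an illustration of how a weighted-indicator risk score of this kind typically works – every field name, weight and threshold below is hypothetical, not drawn from Think Family – a minimal sketch might look like this:

```python
# Purely illustrative sketch of a weighted-indicator risk score.
# None of these field names, weights or thresholds come from Think Family;
# the council has not published how its scores are actually calculated.

HYPOTHETICAL_WEIGHTS = {
    "excluded_from_school": 3,
    "mental_health_support": 2,
    "free_school_meals": 1,
    "deprived_area": 1,
}

FLAG_THRESHOLD = 4  # hypothetical cut-off for flagging a profile


def risk_score(record: dict) -> int:
    """Sum the weights of every indicator present in a child's record."""
    return sum(w for key, w in HYPOTHETICAL_WEIGHTS.items() if record.get(key))


child = {"excluded_from_school": True, "free_school_meals": True}
score = risk_score(child)
print(score, "flagged" if score >= FLAG_THRESHOLD else "not flagged")
```

Even in this toy version, the problem critics point to is visible: indicators such as free school meals and living in a deprived area track poverty, so a score built on them will flag poorer children more often, whatever their actual circumstances.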
Avon and Somerset Police and Bristol City Council said the programme is designed to protect, not profile, people, with the tool giving professionals working with children information to safeguard those at risk of exploitation, “helping them to act swiftly on any identified risks”.
Councillor Christine Townsend, chair of the Children and Young People Committee at Bristol City Council, said the authority does not use “any predictive analytics” apart from those that help to identify children at risk of not being in education, employment or training (NEET). “The use of analytics has never replaced professional human judgement or decision-making,” she added.
This was not always the case, though.
‘Reproducing racialised ideas’
At least three predictive models were built on top of the Think Family database: a criminal exploitation risk model, a sexual exploitation risk model, and a model that assesses a young person’s risk of becoming NEET.
The first two of these models, designed by the council’s data team Insight Bristol and Avon and Somerset Police, were quietly shut down in May 2023, the Cable has learned through a council disclosure obtained under Freedom of Information laws.
Following an independent review, Bristol City Council concluded that the models were “not fit for operational use”.
The government’s Centre for Data Ethics and Innovation (CDEI) carried out an inspection of Insight Bristol. In a 2021 report, it found that individuals’ profiles on the Think Family database could be flagged in ways that may “lead to bias or other forms of indirect discrimination”.
In its report, the CDEI urged the council to start deliberating with local communities about its approach to digital innovation, adding: “If [Bristol City Council] were not to do so it would put itself at risk… of undermining the public’s trust in the council’s use of data.”
Bristol City Council did not respond to a request for information on what steps it has taken, or plans to take, to address any impact on children or their families. While the models have been shut down, children and families may have been affected by potentially biased or discriminatory data baked into the system at the time.
The decommissioning of the predictive models follows criticism from criminal justice and anti-racist organisations that in 2023 called for the database to be closed down. Campaign group Bristol No More Exclusions said at the time: “Technologies that gather and use information in the name of ‘public safety’ overwhelmingly reproduce racialised ideas of problematic behaviour.”
Criminal justice campaign charity Fair Trials reported that staff using the app kept it secret from parents and carers, and admitted that many would be concerned if they knew of it. The charity’s senior legal and policy officer said: “Schoolchildren should not be monitored, profiled and criminalised by secret police databases. Surveillance is not safeguarding.”
‘I have the right to know’
Measuring if and how children and young people are criminalised by databases like Think Family is difficult, as Naila learned when she requested information on what Bristol City Council knows about her son.
When she submitted her SAR to the local authority, it responded by asking that her son personally provide a letter of consent from his school email account, directly giving the council the authority to release information about him.
“[They’re] asking a 14-year-old to submit information himself,” she tells the Cable. “I would get it if he was older, but he’s very much a child, has had no support for the assault – he’s not going to go and do that process. This is enough to stop people putting in such requests. I have to advocate for him because who else is going to?”
Eventually, Naila received some information from Bristol City Council on the incident involving her son, but it didn’t show what personal data about him appears on the various systems highlighted in this feature, or how it is used.
With the support of the Cable and its reporting partners, Naila has submitted more targeted SARs to both the city council and Avon and Somerset Police in the hope of finding out more.
“I have a right to know what information agencies hold that has shaped the narrative about my child,” Naila said. “Especially after he was violently assaulted and yet was quickly and unfairly labelled as the perpetrator.”
“This information appears to reduce him to nothing more than another young boy from the global majority, viewed through a lens of bias and racism, rather than as the beautiful, funny, charming, cheeky, and helpful boy he truly is.”
For Naila, this risk of discrimination is clear: “I believe that the information being held continues to reinforce racist and classist narratives about working-class and global majority children.”
A call to action
Beyond the complexity of the system itself, scrutiny is made harder by the fact that critics of the Think Family database and app have been silenced by legal threats.
Fair Trials removed an article on the database after a complaint in 2023 from Gary Davies, former Avon and Somerset Police chief superintendent and Head of Early Intervention at Bristol City Council, whom the charity described as the creator of the application. (The Guardian, which reported on Fair Trials’ findings, has kept its article online.)
And while parts of the database have been scaled back in Bristol with the decommissioning of the criminal and sexual exploitation models, it is still running, and the same or similar programmes could be set up elsewhere in the UK. Davies has since started his own company, Insight 2 4sight Ltd, which he said is supporting “local government and statutory partners to mature their use of data analytics and to improve information sharing across agencies”.
The need for transparency is clear. As Naila puts it: “Without access to this information, we are blindly sending our children into environments where we are told they are safe and that their best interests are central to every agency’s work. I don’t trust this to be true, not just for my son but for all the other children and young people who are added to such apps.”
