{"id":614442,"date":"2025-12-05T19:42:17","date_gmt":"2025-12-05T19:42:17","guid":{"rendered":"https:\/\/www.europesays.com\/uk\/614442\/"},"modified":"2025-12-05T19:42:17","modified_gmt":"2025-12-05T19:42:17","slug":"urgent-clarity-sought-over-racial-bias-in-uk-police-facial-recognition-technology-facial-recognition","status":"publish","type":"post","link":"https:\/\/www.europesays.com\/uk\/614442\/","title":{"rendered":"\u2018Urgent clarity\u2019 sought over racial bias in UK police facial recognition technology | Facial recognition"},"content":{"rendered":"<p class=\"dcr-130mj7b\">The UK\u2019s data protection watchdog has asked the <a href=\"https:\/\/www.theguardian.com\/politics\/home-office\" data-link-name=\"in body link\" data-component=\"auto-linked-tag\" target=\"_blank\" rel=\"noopener\">Home Office<\/a> for \u201curgent clarity\u201d over racial bias in police facial recognition technology before considering its next steps.<\/p>\n<p class=\"dcr-130mj7b\">The Home Office has admitted that the technology was \u201cmore likely to incorrectly include some demographic groups in its search results\u201d, after the National Physical Laboratory (NPL) tested its application within the police national database.<\/p>\n<p class=\"dcr-130mj7b\">The report revealed that the technology, which is intended to catch serious offenders, is more likely to incorrectly match black and Asian people than their white counterparts.<\/p>\n<p class=\"dcr-130mj7b\">In a statement responding to the report, Emily Keaney, the deputy commissioner for the Information Commissioner\u2019s Office, said the ICO had asked the Home Office \u201cfor urgent clarity on this matter\u201d in order for the watchdog to \u201cassess the situation and consider our next steps\u201d.<\/p>\n<p class=\"dcr-130mj7b\">The next steps could include enforcement action, such as fines or a legally binding order to stop using the technology, as well as working with the Home Office and 
police to make improvements.<\/p>\n<p class=\"dcr-130mj7b\">Keaney said: \u201cLast week we were made aware of historical bias in the algorithm used by forces across the UK for retrospective facial recognition within the police national database.<\/p>\n<p class=\"dcr-130mj7b\">\u201cWe acknowledge that measures are being taken to address this bias. However, it\u2019s disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services.<\/p>\n<p class=\"dcr-130mj7b\">\u201cWhile we appreciate the valuable role technology can play, public confidence in its use is paramount, and any perception of bias and discrimination can exacerbate mistrust. The ICO is here to support and assist the public sector to get this right.\u201d<\/p>\n<p class=\"dcr-130mj7b\">Police and crime commissioners said publication of the NPL\u2019s findings \u201csheds light on a concerning inbuilt bias\u201d and urged caution over pressing ahead with plans for a national expansion, which could include cameras at shopping centres, stadiums and transport hubs, without adequate safeguards in place.<\/p>\n<p class=\"dcr-130mj7b\">The findings were released on Thursday, hours after Sarah Jones, the policing minister, had described the technology as the \u201cbiggest breakthrough since DNA matching\u201d.<\/p>\n<p class=\"dcr-130mj7b\">Facial recognition technology scans people\u2019s faces and cross-references the images against watchlists of known or wanted criminals. 
It can be run on live footage, comparing the faces of people passing mounted cameras with those on wanted lists so that officers can target individuals as they walk by.<\/p>\n<p class=\"dcr-130mj7b\">Police officers can also retrospectively run images of suspects through police, passport or immigration databases to identify them and check their backgrounds.<\/p>\n<p class=\"dcr-130mj7b\">Analysts who examined the police national database\u2019s retrospective facial recognition tool at a lower setting found that \u201cthe false positive identification rate (FPIR) for white subjects (0.04%) is lower than that for Asian subjects (4.0%) and black subjects (5.5%)\u201d.<\/p>\n<p class=\"dcr-130mj7b\">The testing found that the number of false positives for black women was particularly high. \u201cThe FPIR for black male subjects (0.4%) is lower than that for black female subjects (9.9%),\u201d the report said.<\/p>\n<p class=\"dcr-130mj7b\">Responding to the report, a Home Office spokesperson said the department took the findings \u201cseriously\u201d, and had already taken action, including procuring and testing a new algorithm \u201cwhich has no statistically significant bias\u201d.<\/p>\n<p class=\"dcr-130mj7b\">\u201cGiven the importance of this issue, we have also asked the police inspectorate, alongside the forensic science regulator, to review law enforcement\u2019s use of facial recognition. 
They will assess the effectiveness of the mitigations, which the National Police Chiefs\u2019 Council supports,\u201d the spokesperson said.<\/p>\n","protected":false},"excerpt":{"rendered":"The UK\u2019s data protection watchdog has asked the Home Office for \u201curgent clarity\u201d over racial bias in police&hellip;\n","protected":false},"author":2,"featured_media":614443,"comment_status":"","ping_status":"","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3,4],"tags":[748,393,4884,1144,712,16,15,1764],"class_list":{"0":"post-614442","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-uk","8":"category-united-kingdom","9":"tag-britain","10":"tag-england","11":"tag-great-britain","12":"tag-northern-ireland","13":"tag-scotland","14":"tag-uk","15":"tag-united-kingdom","16":"tag-wales"},"share_on_mastodon":{"url":"https:\/\/pubeurope.com\/@uk\/115668670597318060","error":""},"_links":{"self":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/614442","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/comments?post=614442"}],"version-history":[{"count":0,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/posts\/614442\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media\/614443"}],"wp:attachment":[{"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/media?parent=614442"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/categories?post=614442"},{"taxonomy":"post_tag","embeddable":true,"
href":"https:\/\/www.europesays.com\/uk\/wp-json\/wp\/v2\/tags?post=614442"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}