‘London Bridge facial recognition tech mistook me for a wanted criminal’ says man bringing High Court challenge against the Met Police
by BulkyAccident
Having read the article I really can't see the issue here. I had expected that he'd been arrested and detained for hours. In reality it was 30 minutes, and this could easily happen without LFR.
Stopped because he flagged as maybe wanted, refused to give ID for 30 minutes. They offered to compare his fingerprints with those of the wanted person he resembled at the roadside and he refused that too. Then when he finally decided to show ID (which he had with him the whole time) they immediately let him go.
Sounds mostly self-inflicted and now looking to scam the taxpayer out of some compensation, but we’ll see what happens.
It is disturbing how many bootlickers are ok with Police arresting people on flimsy evidence, and essentially for not complying with “papers please”.
I left Britain last year to get an EU passport, but I swear Brits were never THAT much of a bunch of bootlickers, going along with this fascist-like rhetoric. A lot has changed in a year.
To anyone seriously suggesting that this man is at fault for expressing his right to privacy, go read about this man, I think his first name was Adam? Adolf? He was around in Germany a few years back?
People say it's a non-issue, but facial recognition tech has long been known to misidentify Asian and black faces more often than white faces.
Of course, these groups will be told it's their responsibility to put up with extra stops and more scrutiny than their non-minority counterparts, with the easy explanation that we have nothing to worry about if we have nothing to hide.
That’s what people have been saying forever when it comes to these disparities, so nothing new there.
However, why should we be expected to just put up with being treated differently because software and algorithms operate on fundamentally biased systems?
The tool seems helpful in general, but the guy is a community worker, so he knows what he's talking about when he says human contact is important and that the way this tool is used right now might have a negative impact on the communities where he is working to prevent youth crime.
I understand that he'd rather not see human police officers replaced by machines. These systems should be a tool to help the police force, not a replacement. It's good if his action pushes the government to think about how to improve the use of this new tool. It's always worth communicating with local experts who have on-the-ground experience.
And this is going to keep happening
Did it really need “facial recognition” to detect that paedophile?
7 false alerts out of 457 stops, according to the story.
People shouldn’t look like criminals then.
The solution is to not look, act, talk, walk, eat, breathe or even be in the same country as criminals.
Simples.
>He said officers asked him for his fingerprints, but he refused, and he was let go only after about 30 minutes, after showing them a photo of his passport.
It’s regrettable that this guy was misidentified, but he had a very easy way to correct this and instead chose to argue for half an hour. LFR has helped catch dangerous criminals and protect the public, including catching a convicted paedophile with a six-year-old girl. Why would we want to abandon a system that does this just to appease people who find intolerable even the most basic level of co-operation with police?
Who could have possibly predicted this.
AI is biased. Facial recognition is biased. Loads of data and research to support this.
I think the real problem here is: will this bloke get a false positive every time he walks past one? Would be really crap.