Facial recognition technology ‘will turn our streets into police line-ups’, campaigners say

25 comments
  1. And it’s worse if you’re not white [as the system has worse accuracy with subjects who are female, Black, and 18-30 years old](https://sitn.hms.harvard.edu/flash/2020/racial-discrimination-in-face-recognition-technology/).

    > Face recognition algorithms boast high classification accuracy (over 90%), but these outcomes are not universal. A growing body of research exposes divergent error rates across demographic groups, with the poorest accuracy consistently found in subjects who are female, Black, and 18-30 years old.

  2. Any reliance on the technology for mundane matters will be short-lived. I guarantee the police will fail to check a suspect’s alibi claim thoroughly, and the case will be laughed out of court because the suspect was not even in the country, or was in prison, at the time.

  3. OK, let’s break down the rubbish from the article.

    > Campaigners have warned that new guidance from the College of Policing on the use of facial recognition technology means victims of crimes and potential witnesses could be placed on police watchlists.

    **could**.

    That’s a very big **could**. They **could** also put your picture on Mars.

    > They claim it could mean people with mental health problems are placed on a list **if sought by police**

    Yes, that’s how being sought by the police works. Having a mental-health issue should not be a get-out clause that denies the police the ability to find you.

    > and now we see that this new policy specifically allows innocent people to be put on facial recognition watchlists”.

    Everyone is innocent until conviction. That should not stop the police from being able to search for suspects.

    > “This includes victims, potential witnesses, people with mental health problems, or possible friends of any of those people

    You’re mad they might want to identify victims? Or people who can help catch criminals? Or those with mental health issues who have been reported as attempting to do themselves harm?

    Would you rather the police just ignored those people?

    > She added that the government “should ban live facial recognition until it has properly considered the extraordinary risks it poses to rights and freedoms in Britain”.

    Facial recognition is no different from a staffed CCTV station; the only difference is faster identification. Banning it makes as much sense as removing all CCTV staff, which would be ridiculous.

    > Emmanuelle Andrews, policy and campaigns manager at Liberty, a campaigning organisation, said the guidance “does not solve the underlying problem that facial recognition technology does not make people safer”.

    Rubbish.

    If you know for a fact that you are wanted by the police and you see signs in an area that say “Facial recognition in operation in this area”, are you MORE or LESS likely to try to commit a crime in that area?

  4. > They claim it could mean people with mental health problems are placed on a list **if sought by police**.

    What kind of nonsense is this? I mean… what do they want?

    That the police should have a list of people they’re looking for but that they should be taken off that list if they have “mental health problems”?

    I don’t expect this technology to work perfectly at present but I can certainly see how it could be very effective if it does.

  5. We’re quickly becoming a “guilty until proven innocent” society. Not too dissimilar from China, although a lot more subtle in its execution. In some ways we’re the most surveilled society in the world when you look at CCTV cameras and the internet dragnet that Snowden exposed to the world.

    Unfortunately, everyone vomiting their lives onto social media largely subscribes to that bullshit fallacy you hear everywhere, “nothing to hide, nothing to fear”, when what they really mean is “I don’t value my privacy and will happily trade it for the false feeling of ‘security’ the government promises and those regular microdoses of dopamine when someone likes my instaposts”.

  6. I don’t necessarily have a problem with this, although I think the government needs to act quickly to ensure that this technology can’t be sold to private companies for nefarious purposes.

    People seem fine recording everyone else with their ring doorbells, but have an issue with the council, police, etc. doing the same thing.

    I also think this will end up being seen in a similar way to lie detectors. If the accuracy is only 70%, for example, how can that be used as evidence?

  7. This is just going to be more and more the norm.

    Andy Burnham in Manchester has royally ballsed up the introduction of a Clean Air Zone and is now back-pedalling on it. However, there are now ANPR cameras everywhere as part of the infrastructure.

    Rather than take them down, he is consulting with Greater Manchester Police so they can use them for tracking people. Nobody voted for it, nobody asked for it – I am sure a lot of people oppose it, and I certainly do.

  8. So much of the last decade was just a warm-up to eventual wider usage; the ‘fascism’ that is now growing was already being applied to people at the lower socio-economic levels.

    An oblivious public living in a crumbling society is how I saw the last decade. People are now sort-of aware that things are bad, but the choice presented in 2017/19 is gone and unlikely to return soon.

  9. I personally don’t have an issue with this. If it is used properly as a tool and helps find offenders, I am all for it.

    I’m more worried about the tech being abused or misused than about the tech itself, as we have seen many examples of why not everyone can be trusted.

  10. I feel commentaries presented like this are done to inflame an emotional debate about a dystopian society rather than to examine the technology in question.

    Everything here is dependent on subjective variables we don’t know. Is the tech implemented at 90% correct identification? 95%? 99.99%? What is the criminal procedure from there? Is it abusive, or is the evidence used alongside other lines of inquiry?

    This could be extremely effective and support criminal prevention and prosecution… or it could be a dystopian nightmare. People are just jumping to the extremes of the argument. Until we really understand the full process, this article doesn’t reflect reality.

  11. This tech is not accurate and is probably going to be a huge waste of money.

    AI is not bias-free. This will go the same bad-science way lie detectors went. We’ll likely see a number of false convictions that will be expensive and time-consuming for the victims to overturn.

    We’re still spending more money on reacting to crime than on preventing the causes of some crime.

  12. Just wait until this is tied to your social media, vaccine passport and digital wallet.

    They’ll be able to fine people as soon as they step outside the house without a mask or a booster, or for tweeting dangerously in a public place.

    The past two years tell me that most of the country will welcome facial recognition tech in the streets, and will gleefully say that if you have nothing to hide you have nothing to worry about.

    Right now is a dystopia, the future is only worse.

  13. I don’t think I’ve ever heard a genuine argument against the use of facial recognition. I certainly think it will help catch more people who need to be caught.

    And if you’re not wanted by the police what’s the harm?

  14. Basic software was loaded into sub-postmasters’ computer systems and erroneously showed that 736 people were stealing money, yet even after years of campaigning the authorities/CPS/courts refused to believe the technology could be wrong.

  15. I have had 3 different AI systems identify me as Nick Frost. Guess I’ll go rob a shop and get him arrested. Peg can bail him out.
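Several of the comments above argue over accuracy figures (70%? 90%? 99.99%?) without working through what such a number means when a system scans large crowds for a small watchlist. A minimal base-rate sketch in Python — every number here is an illustrative assumption, not a figure from the article or the guidance:

```python
# Base-rate arithmetic for a hypothetical face-recognition deployment.
# All numbers below are illustrative assumptions.

crowd = 100_000        # faces scanned in one day
wanted_in_crowd = 10   # watchlisted people actually present
accuracy = 0.99        # assumed hit rate, and 1 - false-match rate

true_hits = wanted_in_crowd * accuracy                   # correctly flagged
false_hits = (crowd - wanted_in_crowd) * (1 - accuracy)  # innocent people flagged

precision = true_hits / (true_hits + false_hits)
print(f"true matches:  {true_hits:.1f}")
print(f"false matches: {false_hits:.1f}")
print(f"chance a flagged person is actually wanted: {precision:.1%}")
```

Even at an assumed 99% accuracy, roughly 99 out of every 100 flags in this scenario are false matches, which is why a headline accuracy number alone says little about how the technology would perform as evidence.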
