
Facial recognition technology is “dangerous and inaccurate” says watchdog

Editor Neil Martin highlights the recent concerns over facial recognition technology and wonders how we balance a necessary tool with human rights


The mainstream media is kicking up a fuss about cameras that can spot you in a crowd and decide whether you are angry or happy.

The overall aim, of course, is to spot someone with malicious intent, say at a public transport hub or a sports event. If you can spot someone about to plant a bomb or launch an attack, who would argue against the use of such cameras?

The controversy arises, of course, because similar cameras are being used to spot people of interest to the police: those who might be wanted for an offence or have skipped bail. And that would still be acceptable, campaigners say, if it weren't for the fact that many innocent people are being caught up in false arrests.

Back in May, Big Brother Watch released a report claiming that facial recognition technology used by the Metropolitan Police at last year's Notting Hill Carnival was 98% inaccurate (in other words, the overwhelming majority of the matches the system flagged were wrong), misidentifying 95 people as criminals. It pointed out that the force is planning seven more deployments.

It further claimed that one police force stores, for a year and without their knowledge, photographs of all innocent people incorrectly matched by facial recognition, resulting in a biometric database of over 2,400 innocent people.

Big Brother Watch’s campaign called on UK public authorities to immediately stop using automated facial recognition software with surveillance cameras. It was backed by David Lammy MP and 15 rights and race equality groups including Article 19, Football Supporters Federation, Index on Censorship, Liberty, Netpol, Police Action Lawyers Group, the Race Equality Foundation and Runnymede Trust.

Silkie Carlo, director of Big Brother Watch, said: “Realtime facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified – or misidentified – everywhere they go.

“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.

“It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms.

“This has wasted millions in public money and the cost to our civil liberties is too high. It must be dropped.”

It's a modern-day dilemma. We all want to be safe, yet no one wants innocent people to be arrested. How do we balance the rights of the individual with the need to spot wrongdoers?

The answer has to be greater accuracy. The technology is not going to go away; it will not be dis-invented. System providers have to achieve greater success rates, and the police will have to use the technology as just one part of their usual procedures for making arrests. And politicians need to introduce safeguards to ensure that it is not abused.

But, whether we like it or not, camera recognition technology is here to stay.
