When police in London recently trialled a new facial recognition system, they made worrying and embarrassing mistakes. At the Notting Hill Carnival, the technology made roughly 35 false matches between known suspects and members of the crowd, and one person was "erroneously" arrested. Camera-based visual surveillance systems were supposed to deliver a safer and more secure society. But despite decades of development, they are generally unable to handle real-life situations. During the 2011 London riots, for example, facial recognition software contributed to just one arrest out of the 4,962 that took place. The failure of this technology means visual…

This story continues at The Next Web