Using facial recognition can seem seamless; think of unlocking your iPhone. Yet a human face has actually been mistaken for a toaster, according to a facial recognition technology expert.

If a computer, widely assumed to be highly reliable, can confuse a human face with a toaster, what might that mean for facial recognition accuracy when the technology is used to seek out suspects of crimes? Possibly that it is not so reliable.

“Obviously, the technology has immense value in promoting societal interests such as efficiency and security, but it also represents a threat to some of our individual interests, particularly privacy,” said Nessa Lynch, associate professor of law at Victoria University of Wellington, New Zealand. Lynch and other experts are part of a research project due to be completed in mid-2020. The researchers presented some of their findings during a panel recently held at the university.

Some of the very first images used to train these systems were mugshots of convicted felons in Florida. Many of the subjects had abused meth and had strikingly prominent cheekbones. That presented problems when facial recognition was later applied to ordinary people without a meth habit.

Those cheekbones are very different from the average, well-fed person's, so data from such a source was not useful when training a system to recognize typical faces, said Rachel Dixon, Privacy and Data Protection Deputy Commissioner at the Office of the Victorian Information Commissioner in Australia.

Companies that sell the technology products often claim they are highly reliable. But Dixon said they are often reliable only because of the environments where they are used, which may be controlled and unvarying, and because the systems are tuned for those specific environments.
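Dixon's point about tuning can be made concrete. A typical face matcher compares embedding vectors against a similarity threshold that is calibrated for a particular deployment. The minimal sketch below (the embedding values, the threshold, and the names are all invented for illustration, not taken from any vendor's system) shows how the same enrolled face can match a probe taken under controlled conditions but miss one from a noisy street camera:

```python
import numpy as np

def cosine_similarity(a, b):
    # Similarity between two face embedding vectors, in [-1, 1].
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical enrolled embedding (e.g. from a passport-style photo).
enrolled = np.array([0.9, 0.1, 0.4, 0.2])

# Probes captured under controlled conditions vs. a noisy street camera.
probe_controlled = np.array([0.88, 0.12, 0.41, 0.19])
probe_street     = np.array([0.55, 0.40, 0.10, 0.60])

THRESHOLD = 0.95  # tuned for the controlled environment

for name, probe in [("controlled", probe_controlled), ("street", probe_street)]:
    score = cosine_similarity(enrolled, probe)
    print(name, round(score, 3), "match" if score >= THRESHOLD else "no match")
```

A threshold tuned for one environment says nothing about error rates in another, which is why lab accuracy claims do not transfer to the street.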

“…Picking you out walking randomly down the street can be quite challenging. There’s a whole bunch of environmental factors there that go to essentially reducing the confidence level,” Dixon said in a story published on Ideasroom. “None of this is absolute. There is no one-to-one match. And by perturbing an image even a small amount you can make the machine-learning system think the person is a toaster. I’m not joking.”
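Dixon's toaster example refers to what researchers call adversarial perturbations: tiny, targeted changes to an image that flip a model's prediction. A toy sketch of the idea, using a two-class linear classifier on a four-"pixel" image rather than any real recognition model, shows that nudging each input value by just 0.15 in the right direction changes the answer:

```python
import numpy as np

# Toy linear classifier: row 0 scores "face", row 1 scores "toaster".
W = np.array([[ 1.0, -1.0,  1.0, -1.0],   # face weights
              [-1.0,  1.0, -1.0,  1.0]])  # toaster weights
labels = ["face", "toaster"]

x = np.array([0.6, 0.4, 0.6, 0.4])        # a tiny 4-"pixel" image
print("clean:", labels[int(np.argmax(W @ x))])          # prints "clean: face"

# FGSM-style step: move each pixel a small amount (0.15) in the
# direction that raises the toaster score relative to the face score.
grad = W[1] - W[0]                        # gradient of (toaster - face) w.r.t. x
x_adv = x + 0.15 * np.sign(grad)
print("perturbed:", labels[int(np.argmax(W @ x_adv))])  # prints "perturbed: toaster"
```

Real attacks work the same way on deep networks, except the gradient is computed through millions of parameters; the perturbation can remain small enough that a human sees no difference.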

If a computer identifies a face, for example, as a person of interest in a crime, that perception is very hard to change, even when it is wrong, because humans have a hard time believing a machine can make a mistake, especially once it has declared a correct match, Dixon explained.

In the United States, a conservative estimate is that roughly a quarter of the country's 18,000 law enforcement agencies have access to facial recognition systems, particularly for use in investigations. Yet Georgetown Law Professor Clare Garvie said there are no laws, at the state or federal level, governing its use.

Garvie, a senior associate at the Center on Privacy & Technology at Georgetown Law, said, “As a result, this technology has been implemented largely without transparency to the public, without rules around auditing or public reporting, without rules around who can be subject to a search. As a result, it is not just suspects of a criminal investigation that are the subject of searches. In many jurisdictions, witnesses, victims or anybody associated with a criminal investigation can also be the subject of a search.”

Because there is little reporting and auditing of the technology's use, it is unclear whether agencies are checking for misuse, or whether the tool is actually helpful and successful, Garvie said. Are law enforcement officials “catching the bad guys,” or is the technology a waste of money? She said she suspects the latter in some jurisdictions.

Meanwhile, it may come as no surprise to some that those most often caught in the crosshairs come from lower socio-economic or marginalized populations.

In one instance, the person police arrested had been ranked 319th as a likely match in the algorithm's ordering. The police also failed to provide that ranking evidence to the defense lawyers.
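The “ranked 319th” detail reflects how one-to-many searches typically work: the system returns an ordered candidate list rather than a single answer, and anyone on that list, even far down it, can become the focus of an investigation. A minimal sketch with invented names and scores (not data from the actual case):

```python
# Hypothetical similarity scores from a one-to-many search (higher = closer).
scores = {"candidate_A": 0.91, "candidate_B": 0.87,
          "candidate_C": 0.62, "candidate_D": 0.35}

# Rank candidates from most to least similar, as a gallery search would.
ranked = sorted(scores, key=scores.get, reverse=True)
for rank, name in enumerate(ranked, start=1):
    print(rank, name, scores[name])
```

Nothing in the ranking itself marks a cutoff between plausible and implausible matches; where investigators stop reading the list is a policy choice, not a property of the algorithm.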

In the United Kingdom, the technology has been used extensively, and with mixed results, by law enforcement and businesses to search for people on watch lists, according to Dr. Joe Purshouse of the School of Law at the University of East Anglia.

“The human rights implications for privacy, freedom of assembly – those are chilling,” Purshouse said, adding that the marginalized are caught in the middle: “Suspects of crime, people of lower socio-economic status who are forced to use public space and rely more heavily on public space than people who have economic advantages, perhaps.”