Federal study finds race and gender affect face-scanning technology

[Photo: Demonstrators hide their faces with images of Amazon Chief Executive Jeff Bezos during a protest over the company’s facial recognition system in Seattle in 2018. (Elaine Thompson / Associated Press)]
A study by a U.S. agency has found that facial recognition technology often performs unevenly based on a person’s race, gender or age.

This is the first time the National Institute of Standards and Technology has investigated demographic differences in how face-scanning algorithms are able to identify people.

Lawmakers and privacy advocates have raised concerns about biased results in the commercial face recognition software increasingly used by law enforcement, airports and a variety of businesses.

But the nuanced report published Thursday is unlikely to resolve differences of opinion between critics and defenders of facial recognition. It cautions against earlier research it calls “incomplete,” which alleged biased facial recognition and alarmed lawmakers, yet it confirms similar trends: higher error rates for women, for the youngest and oldest people, and for certain racial groups.

“There is a wide range of performance and there’s certainly work to be done,” said Craig Watson, manager of the National Institute of Standards and Technology research group that studies biometric technology. “The main message is don’t try to generalize the results across all the technology. Know your use case, the algorithm that’s being used.”

The agency, part of the Commerce Department, tested algorithms from nearly 100 companies on millions of mugshots, visa application photos and other government-held images. Microsoft was among the major tech companies that voluntarily submitted their technology for review. Amazon, which markets face-scanning software to police, did not.