Humans are trying to take bias out of facial recognition programs. It’s not working–yet.
Researchers are trying to make facial-recognition algorithms fairer by training them on datasets that represent all racial and ethnic groups equally. But that process may actually be perpetuating biases, according to new research by Zaid Khan, a PhD student in computer engineering at Northeastern University, and his advisor, electrical and computer engineering professor Raymond Fu.