The killing of mostly Asian women at a pair of Georgia massage parlors was a deadly escalation of the taunts, slurs, and put-downs that members of Northeastern’s Asian American community have endured. A law professor was walking her dog when she encountered discrimination up close. “It almost brought me to tears,” she remembers. “Emotions from these microaggressions never completely go away.”
Researchers are working to train facial-recognition algorithms on datasets that represent all racial and ethnic groups fairly. But that process may actually be perpetuating bias, according to new research by Zaid Khan, a PhD student in computer engineering at Northeastern University, and his advisor, electrical and computer engineering professor Raymond Fu.