S.E. Williams
Contributor

Research has removed all doubt that facial recognition technology is racially biased against people of color. In response, California Senator Kamala Harris has challenged federal agencies to look more closely at the technology, especially its use in hiring practices.

According to a study by the MIT Media Lab, the software is more accurate at identifying and analyzing white men than Black men, and nearly 35 percent of images of Black women were wrongly identified as men.

Such biases not only affect employment opportunities; the same flaws exist in facial recognition technology used by police agencies, where they result in the misidentification of suspects.

The same or similar biases exist regardless of the vendor, whether IBM, Microsoft, or any other provider of the technology. Most are not surprised to find that existing human biases were built into these systems.

Harris has called upon the Federal Bureau of Investigation, the Federal Trade Commission, and the Equal Employment Opportunity Commission (EEOC) to address bias and other problems with facial recognition.

Harris, with the support of other Democratic senators, has also called on the EEOC to investigate the technology’s compliance with current laws, to review any complaints about it, and to determine how the agency can test and monitor its use. In a letter to the EEOC, the senators wrote, “We request that the EEOC develop guidelines for employers on the fair use of facial analysis technologies and how this technology may violate anti-discrimination law.”

The positive news is that, prompted by growing complaints, tech companies have already begun efforts to address the biases in their technology.