Research from M.I.T. Media Lab revealed that the accuracy of facial recognition systems depends on the gender and skin color of the user. The technology may not be intentionally biased, but the discrepancy arises from its data sets and creators.
( Elijah Nouvelage | Getty Images )
The accuracy of facial recognition systems has shown improvement over the past few years, but apparently, this holds true mainly when the users of the technology are white men.
According to new research, facial recognition technology is currently biased toward white men and is not as reliable when the users are dark-skinned women.
Facial Recognition Biased Toward White Men
Facial recognition systems are 99 percent accurate in determining gender when the user is a white man. This level of accuracy has contributed to growing confidence that the technology is both efficient and secure enough to eventually replace fingerprint recognition systems.
However, when the user is a white woman, the accuracy in identifying gender drops to 93 percent. It falls further to 88 percent for a darker-skinned man, and lower still to 65 percent for a darker-skinned woman.
These results, taken from research by M.I.T. Media Lab's Joy Buolamwini, confirm previous allegations that facial recognition technology is currently biased toward white men. The bias stems from the data sets provided to the systems and the conditions under which the algorithms were created.
Why Is There A Discrepancy In Facial Recognition Accuracy?
In the paper on her findings, coauthored with Microsoft researcher Timnit Gebru, Buolamwini describes building a dataset of 1,270 faces. The faces are of politicians, selected based on their countries' rankings in gender parity among people serving in public office. Buolamwini then used the dataset to test the accuracy of facial recognition systems created by Microsoft, IBM, and China's Megvii.
The discrepancies in facial recognition accuracy can be attributed to a lack of diversity both in the data sets and among the people who create and test the technology. The systems may not be intentionally biased, but when the data being fed into the algorithms consist mostly of images of white men, and the people working on the technology are also mostly white men, the discrepancy should be expected.
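The kind of audit described above boils down to measuring accuracy separately for each demographic subgroup rather than reporting a single overall number. The sketch below illustrates that idea with hypothetical predictions and subgroup names; it is not the researchers' actual code or data, only a minimal example of a disaggregated accuracy check.

```python
# Minimal sketch of a disaggregated accuracy audit. The subgroup names,
# labels, and predictions below are hypothetical, for illustration only;
# a real audit would run a classifier on a balanced benchmark dataset.
from collections import defaultdict

def accuracy_by_subgroup(records):
    """records: iterable of (subgroup, true_label, predicted_label) tuples.
    Returns a dict mapping each subgroup to its classification accuracy."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for subgroup, truth, prediction in records:
        total[subgroup] += 1
        if prediction == truth:
            correct[subgroup] += 1
    return {group: correct[group] / total[group] for group in total}

# Hypothetical results: one misclassification concentrated in one subgroup.
sample = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned female", "female", "female"),
    ("darker-skinned female", "female", "male"),    # misclassified
    ("darker-skinned female", "female", "female"),
]
print(accuracy_by_subgroup(sample))
```

A single aggregate accuracy over this sample would read 75 percent and hide the disparity; breaking the numbers out per subgroup is what surfaces it.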
The need for diversity has been raised before. Reports have revealed that facial recognition algorithms developed in Asia identified Asian faces better than white faces, while systems created in North America and Europe did the opposite. Buolamwini's research provides additional evidence for these claims.
In addition, with facial recognition systems becoming more prevalent in commercial products, companies need to make the necessary changes to ensure that the technology remains reliable for all users around the world.
© 2018 Tech Times, All rights reserved. Do not reproduce without permission.