
Responsible AI in the Enterprise
One of the most alarming examples of facial recognition gone wrong occurred in 2015, when Google's photo-recognition tool4 classified African-Americans as gorillas. The incident raised public awareness of the technology and exposed its deep flaws.
The Gender Shades project is a pioneering research initiative led by computer scientist Joy Buolamwini that exposes and addresses biases in facial recognition and analysis algorithms with respect to gender and skin tone. The study, published in 2018, scrutinized the performance of commercial facial recognition systems developed by prominent technology companies, including IBM, Microsoft, and Face++. It revealed that these AI systems exhibited higher error rates when classifying the gender of darker-skinned and female faces than for their lighter-skinned and male counterparts.
Researchers5 evaluated gender classification tools developed by IBM, Microsoft, and Face++ and...
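The core evaluation idea behind Gender Shades is simple: rather than reporting one aggregate accuracy number, disaggregate the classifier's error rate by demographic subgroup. The sketch below illustrates that idea with entirely fabricated toy data; the record layout and function name are illustrative assumptions, not the study's actual code or dataset.

```python
from collections import defaultdict

# Fabricated toy records: (true_gender, skin_tone, predicted_gender).
# In the actual study, predictions came from commercial gender classifiers.
records = [
    ("female", "darker",  "male"),
    ("female", "darker",  "female"),
    ("female", "lighter", "female"),
    ("male",   "darker",  "male"),
    ("male",   "lighter", "male"),
    ("female", "darker",  "male"),
    ("male",   "lighter", "male"),
    ("female", "lighter", "female"),
]

def error_rates_by_subgroup(records):
    """Return {(skin_tone, true_gender): error_rate} for a labelled set."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for true_gender, skin_tone, predicted in records:
        key = (skin_tone, true_gender)
        totals[key] += 1
        if predicted != true_gender:
            errors[key] += 1
    return {key: errors[key] / totals[key] for key in totals}

for subgroup, rate in sorted(error_rates_by_subgroup(records).items()):
    print(subgroup, f"{rate:.0%}")
```

On this toy data the darker-skinned female subgroup shows a much higher error rate than the others, which is the shape of disparity the study reported; a single aggregate accuracy figure would hide it.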