At this point, we’re probably all familiar with algorithms thanks to social media, and perhaps even with the idea of targeted advertising. But how are they adding to the marginalisation and oppression of POC? Safiya Umoja Noble, associate professor at UCLA, writes in her book, Algorithms of Oppression, that her research points to both racist and sexist bias within digital media platforms. The public expect these platforms to be credible, fair, objective and neutral; but are they? Computer scientist and digital activist Joy Buolamwini, founder of the Algorithmic Justice League and lead author of the Gender Shades study, found that data-centric technologies are vulnerable to bias. She warns that if we fail to build ethical, inclusive artificial intelligence, we risk losing the gains made by the civil rights and women’s movements under the guise of machine neutrality.
That’s why the newest launch from Google has us believing we’re finally heading in the right direction for internet neutrality and POC inclusivity. Skintone.google improves skin tone evaluation in machine learning, allowing for more accurate, broader-ranging AI recognition across algorithms and other digital media platforms. This was made possible by the Monk Skin Tone (MST) Scale, courtesy of Dr. Ellis Monk, an Associate Professor of Sociology at Harvard University whose research focuses on social inequalities with respect to race and ethnicity.
Like the 6-shade Fitzpatrick Scale beloved by dermatologists and skin experts across the globe, the 10-shade MST categorises skin tones. But where the Fitzpatrick Scale classifies skin by its characteristics and response to UV exposure in a medical context, the MST offers an alternative measure designed for developing products that reference and use skin tone shades. Google freely provides each shade’s colour code on the scale’s website, and the scale is already used by the National Institutes of Health (NIH) and NORC at the University of Chicago.
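Google doesn’t prescribe a single matching algorithm here, but as a rough illustration of how a product might bucket a sampled skin pixel into the MST’s ten shades, the sketch below does a nearest-neighbour lookup in RGB space. The hex values are an approximation of the palette published at skintone.google and should be checked against the official site before use.

```python
# Sketch: mapping an RGB sample to its nearest Monk Skin Tone (MST) shade.
# The hex values below approximate the published MST palette; treat them
# as illustrative, not authoritative.

MST_SHADES = {
    1: "#f6ede4", 2: "#f3e7db", 3: "#f7ead0", 4: "#eadaba", 5: "#d7bd96",
    6: "#a07e56", 7: "#825c43", 8: "#604134", 9: "#3a312a", 10: "#292420",
}

def hex_to_rgb(code: str) -> tuple[int, int, int]:
    """Convert '#rrggbb' to an (r, g, b) tuple of ints."""
    code = code.lstrip("#")
    return tuple(int(code[i:i + 2], 16) for i in range(0, 6, 2))

def nearest_mst_shade(rgb: tuple[int, int, int]) -> int:
    """Return the MST shade number (1-10) closest to the given colour,
    using squared Euclidean distance in plain RGB space."""
    def dist(shade_hex: str) -> int:
        r, g, b = hex_to_rgb(shade_hex)
        return (r - rgb[0]) ** 2 + (g - rgb[1]) ** 2 + (b - rgb[2]) ** 2
    return min(MST_SHADES, key=lambda k: dist(MST_SHADES[k]))

print(nearest_mst_shade((246, 237, 228)))  # exact match for shade 1 → 1
```

Plain RGB distance is the simplest choice; a production system would more likely compare colours in a perceptually uniform space such as CIELAB, where numeric distance better tracks how different two tones look to the eye.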
Google believes more can and should be done in the social science and computer science disciplines to better represent POC, as Buolamwini and Noble have advocated. Tech giants are finally recognising that so many products out there just don’t work for darker skin tones, as seen in the beauty industry with Caucasian-centric skincare. And it’s in this disparity that systemic racism becomes reinforced.
So, what does the future hold for this AI technology?
We’re so happy that moves have been made to improve the current industry standard, but there’s more work to be done. Much more. Along with partnering with experts across industries, Google hopes to expand evaluation of the Monk Skin Tone Scale globally, ensuring inclusive representation worldwide. Not only that, but their mission also includes tuning their camera models and algorithms to more accurately capture the nuances of skin tones.
“People think of algorithms as simply a mathematical formulation, but in fact algorithms are really about automated decisions.” — Safiya Umoja Noble.
Going forward, companies must do better at accurately recognising traits such as gender and skin tone in AI technology, especially as these AI systems, algorithms, and predictive models heavily influence who is hired, who is granted a loan, and who sees what online. Let’s celebrate this win for diversity and inclusivity within our melanin-rich community; we can’t wait to see what the future holds.