Google has announced that it is taking additional steps to improve the representation of various skin tones across its products and has implemented a new scale that better shows the spectrum of real-world skin color.
Google has been particularly outspoken about the lack of representation of all skin tones in media and online. Last year, the company launched Real Tone for Pixel, which it says is just one example of its efforts to improve the representation of diverse skin tones across Google products.
The MST Scale
To continue that effort, the company has today launched a new skin tone scale that it developed with Harvard professor and sociologist Dr. Ellis Monk, who has been studying how skin tone and colorism affect people's lives for more than 10 years.
Google says that the culmination of his work is known as the Monk Skin Tone (MST) Scale, a 10-shade scale that will be incorporated into Google products over the coming months.
“We’re openly releasing the scale so anyone can use it for research and product development. Our goal is for the scale to support inclusive products and research across the industry — we see this as a chance to share, learn and evolve our work with the help of others,” Google’s Head of Product for Responsible AI and Product Inclusion in Search Tulsee Doshi says.
While the MST Scale does not encompass all possible skin tones, Google argues it is far more representative than the current tech industry standard.
“In our research, we found that a lot of the time people feel they’re lumped into racial categories, but there’s all this heterogeneity with ethnic and racial categories,” Dr. Monk says.
“And many methods of categorization, including past skin tone scales, don’t pay attention to this diversity. That’s where a lack of representation can happen…we need to fine-tune the way we measure things, so people feel represented.”
The MST Scale and Imagery
Google says that by using the MST Scale it can better understand representation in imagery so that it can be sure that a product or feature works well across a range of skin tones. The company says this is particularly important when it comes to computer vision, which it says has been found not to perform as well for people with darker skin.
“The MST Scale will help us and the tech industry at large build more representative datasets so we can train and evaluate AI models for fairness, resulting in features and products that work better for everyone — of all skin tones,” Doshi explains. “For example, we use the scale to evaluate and improve the models that detect faces in images.”
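The fairness evaluation Doshi describes can be pictured as a simple bucketing exercise: annotate each test image with an MST value from 1 to 10, then compare how the model performs in each bucket. The sketch below is a hypothetical illustration of that idea, not Google's actual tooling; the function name, data, and numbers are invented for the example.

```python
# Hypothetical sketch: per-skin-tone accuracy for a face-detection model,
# assuming each test image is annotated with a Monk Skin Tone (MST)
# value from 1 (lightest) to 10 (darkest). Data here is illustrative.
from collections import defaultdict

def accuracy_by_mst(results):
    """results: iterable of (mst_value, detected_correctly) pairs.
    Returns a dict mapping each MST bucket to its detection accuracy."""
    totals = defaultdict(int)
    correct = defaultdict(int)
    for mst, ok in results:
        if not 1 <= mst <= 10:
            raise ValueError(f"MST value must be 1-10, got {mst}")
        totals[mst] += 1
        correct[mst] += int(ok)
    return {mst: correct[mst] / totals[mst] for mst in sorted(totals)}

# Illustrative annotations: (MST bucket, whether the face was detected)
sample = [(2, True), (2, True), (9, True), (9, False)]
print(accuracy_by_mst(sample))  # {2: 1.0, 9: 0.5}
```

A gap between buckets (here, 1.0 for bucket 2 versus 0.5 for bucket 9) is the kind of disparity a more representative dataset and evaluation scale are meant to surface.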
Google also intends to use the scale to improve skin tone representation in Google Search results. For example, someone searching for makeup-related queries would be able to refine the results to show images by skin tone.
Google Photos will also be improved thanks to the integration of the MST Scale. The company introduced a way to improve its auto-enhance feature last year, and this year it is launching a new set of Real Tone filters that are designed to work well across skin tones and evaluated using the MST Scale.
“We worked with a diverse range of renowned image-makers, like Kennedi Carter and Joshua Kissi, who are celebrated for beautiful and accurate depictions of their subjects, to evaluate, test, and build these filters,” Doshi says.
“These new Real Tone filters allow you to choose from a wider assortment of looks and find one that reflects your style. Real Tone filters will be rolling out on Google Photos across Android, iOS, and Web in the coming weeks.”
For those who want more information on the process of implementing the MST Scale, Google has published a detailed blog post that takes a closer look at the research that went into skin tone representation and artificial intelligence.