Google will adapt its search engine to the skin color of Internet users

The company has presented a palette of ten skin tones intended to improve the representation of search results, especially on Google Images. Google launched the Monk Skin Tone (MST) Scale in partnership with Harvard professor and sociologist Dr. Ellis Monk. Developed by Dr. Monk, the MST Scale is a 10-shade scale designed to better account for the range of skin tones in our society. “We’ll be integrating the MST Scale into various Google products over the next few months and making it public so everyone can use it for research and product development,” said Molly McHugh-Johnson, Staff Writer at Google.

The MST Scale is an important step in the collective effort to improve how skin tone is represented in technology. For Google, it will help advance the company’s commitment to image equity and improved representation in its products. By making the MST Scale available to everyone, Google hopes others can do the same.

Considering skin color equity in technology is an interesting challenge for research because it is not just a technical issue, but also a social issue.

Progress requires the combined expertise of a wide range of people: from social science scholars who have spent years studying social inequality and skin color stratification, to users of products and technologies, who provide the necessary nuance and insight from their lived experiences, to ethicists and civil rights activists, who guide application frameworks to ensure that social nuances are preserved and honored.

Google teams have been contributing to this body of work for years now. Here’s a more in-depth look at how Google’s teams have thought about and worked on skin color representation efforts, particularly as they relate to the MST scale and what could come next.

Persistent inequalities exist globally due to prejudice or discrimination against people with darker skin, also known as colorism, says Dr. Courtney Heldreth, a social psychologist and user experience (UX) researcher in Google’s Responsible AI Human-Centered Technology UX (RAI-HCT UX) department, which is part of Google Research. Academic literature demonstrates that skin color plays an important role in how people are treated across a wide variety of outcomes, including health, wealth, well-being, and more.

Machine learning, a type of AI, is the basis of many products we use every day. Cameras use machine learning for security purposes, such as unlocking a phone or recognizing someone at the door. Machine learning can group photos by similar faces or adjust the brightness of an image.

To do this well, engineers and researchers need diverse training data to train models, and they need to thoroughly test the resulting models on a wide range of images. Importantly, for the datasets used to develop people-related technologies to be more inclusive, Google needs a scale that represents a wide range of skin tones.
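As a hypothetical illustration of why such testing matters, the sketch below (all names and data are invented) disaggregates a model’s accuracy by skin-tone bucket instead of reporting a single aggregate number; an aggregate score can look good while hiding poor performance for one group:

```python
from collections import defaultdict

def accuracy_by_skin_tone(predictions, labels, tones):
    """Compute accuracy separately for each skin-tone bucket.

    predictions, labels: model outputs and ground-truth labels
    tones: a skin-tone annotation (e.g. an MST value 1-10) per example
    """
    correct = defaultdict(int)
    total = defaultdict(int)
    for pred, label, tone in zip(predictions, labels, tones):
        total[tone] += 1
        if pred == label:
            correct[tone] += 1
    return {tone: correct[tone] / total[tone] for tone in total}

# Hypothetical example: overall accuracy is 4/6, which hides a gap.
preds  = ["cat", "cat", "dog", "dog", "cat", "dog"]
labels = ["cat", "cat", "dog", "cat", "dog", "dog"]
tones  = [1, 1, 1, 9, 9, 9]
print(accuracy_by_skin_tone(preds, labels, tones))
# tone 1: 3/3 correct; tone 9: only 1/3 correct
```

The point of the sketch: a scale that lumps most darker tones into one bucket would make this per-bucket breakdown impossible in the first place.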

“If you say, ‘I tested my fairness model to make sure it works well for dark skin tones,’ but you’re using a scale that doesn’t represent most people with those skin tones, you don’t know how well it actually works,” says Xango Eye, a product manager working on Responsible AI.

“If not developed with intent, the measurement of skin tone that we use to understand whether our models are accurate and representative can affect how products are experienced by users. Downstream, these decisions can have the greatest impact on those most vulnerable to unfair treatment: people with darker skin,” says Dr. Heldreth.

Eye and Dr. Heldreth are both core members of Google’s research efforts to incorporate greater skin color fairness into AI development, a group that includes an interdisciplinary mix of product managers, researchers and engineers specializing in computer vision and social psychology. The team also works with Google’s Image Fairness teams to build representation in products like cameras, photos, and emojis.

“We take a human-centered approach to understanding how AI can influence and help people around the world,” says Dr. Heldreth, focusing on improving inclusiveness in AI so that technology reflects and empowers globally and culturally diverse communities, especially those that are historically marginalized and underserved. A more inclusive skin tone scale is a central part of this effort.

The team operates with one overarching goal: to keep improving technology so it works well for more people. To do this, two major tasks had to be accomplished. “The first was to determine what was already being built and why it wasn’t working,” explains Eye. “And the second was figuring out what we should build instead.”

A socio-technical approach

Skin tone is one thing that changes the physical properties of images and affects people’s experiences, and both of these things can impact the performance of a piece of technology, says Dr. Susanna Ricco. Dr. Ricco, a software engineer on Google Research’s Perception team, leads a group that researches new ways to make sure Google’s computer vision systems work well for more users, regardless of their background or appearance.

“To make sure this technology works for all skin tones, we need to intentionally test and improve it across a diverse range. To do this, we need a scale that doesn’t ignore skin color and isn’t too general,” she explains.

There’s the physical side of things, namely how a sensor reacts to the color of a person’s skin, says Dr. Ricco. Then there’s the social side of things: we know there’s a correlation between skin color and life experiences, so we want to make sure we’re looking at equity from that perspective as well. In the end, what matters is: does it work for me? And not just me, the person who creates this technology, but anyone who encounters it.

Developing a scale for this is not just an AI or technology problem, but a socio-technical problem, adds Dr. Heldreth. It is important that we understand how skin-color inequity can manifest itself in the technology we use and, importantly, that we do our best to avoid replicating the colorism that already exists. Equity is contextual and uniquely experienced by each individual, so it is important to center this issue on the people who will ultimately be affected by the choices we make. To do it right, we need to take a human-centered approach, because this is a human problem.

The skin tone scale

Dr. Monk undertook this research partly in response to the most widely used skin tone scale, the Fitzpatrick scale. Created in 1975 and composed of six broad shades, it was originally intended as a medical categorization of skin types. The tech industry widely adopted it, applied it to skin tones, and it became the norm: it is what most artificial intelligence systems use to measure skin tone. By comparison, the MST Scale is made up of 10 shades, a number chosen to be neither too limiting nor too complex. It’s not just about a precise numerical value of skin tone; it’s about giving people something they can identify with.

Together, the team and Dr. Monk surveyed thousands of adults in the United States to find out whether people felt better represented by the MST Scale compared to other scales used in sectors such as health, machine learning, and beauty. Overall, people felt better represented by the MST Scale than by the Fitzpatrick scale, says Eye, and this was especially true for less-represented demographic groups.

What you’re looking for is that subjective moment where people can see their skin color on the scale, says Dr. Heldreth. Seeing the results of our research demonstrate that there are other measures of skin tone where more people see themselves better represented made us feel like we were taking steps in the right direction, that we really could make a difference.

Of course, a 10-point scale is not as complete as scales with 16, 40, or 110 shades. And for many use cases, like makeup, more is better. What was exciting about the MST Scale survey results was that, even with 10 shades, participants felt the scale was as representative as beauty-industry scales with far greater variety.

“They felt that the MST Scale was just as inclusive, even though it only had 10 points,” says Eye. A 10-point scale can also be used when annotating data, whereas evaluating skin tone in images on a 40-point scale would be an almost impossible task for assessors to perform reliably.
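That reliability point can be illustrated with a small simulation (the noise level and item counts are invented for illustration): two annotators judge the same underlying tone with a little perceptual noise, then snap their judgment to an n-point scale. Coarser scales absorb more of the noise, so exact agreement between annotators is higher:

```python
import random

def agreement_rate(ratings_a, ratings_b):
    """Fraction of items where two annotators give the same label."""
    matches = sum(a == b for a, b in zip(ratings_a, ratings_b))
    return matches / len(ratings_a)

def simulate(n_points, n_items=10_000, noise=0.03, seed=0):
    """Two raters see the same true tone (0..1) with Gaussian
    perceptual noise, then bin it onto an n-point scale."""
    rng = random.Random(seed)
    a, b = [], []
    for _ in range(n_items):
        true_tone = rng.random()
        for ratings in (a, b):
            seen = min(max(true_tone + rng.gauss(0, noise), 0.0), 1.0)
            ratings.append(min(int(seen * n_points), n_points - 1))
    return agreement_rate(a, b)

print(f"10-point scale agreement: {simulate(10):.2f}")
print(f"40-point scale agreement: {simulate(40):.2f}")
```

Under these assumptions, the 10-point scale yields substantially higher exact agreement than the 40-point one, matching the intuition in the quote above.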

What is particularly interesting about this work is that it continues to emphasize the importance of a socio-technical approach to creating more equitable tools and products. Skin tones are continuous and can be defined and categorized in many different ways, the simplest being to choose equidistant RGB values on a scale from light brown to dark brown.
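A minimal sketch of that naive, purely technical approach (the endpoint colors below are invented for illustration): linear interpolation of equidistant RGB values between a light brown and a dark brown:

```python
def equidistant_swatches(light, dark, n=10):
    """Linearly interpolate n RGB swatches between two endpoint colors.

    This spaces colors evenly in RGB space, with no regard for how
    skin tones are actually distributed across communities.
    """
    return [
        tuple(
            round(l + (d - l) * i / (n - 1))
            for l, d in zip(light, dark)
        )
        for i in range(n)
    ]

# Hypothetical endpoint colors for light brown and dark brown.
swatches = equidistant_swatches((224, 190, 160), (60, 40, 30), n=10)
print(swatches[0], swatches[-1])  # first and last swatches are the endpoints
```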

But such a technical approach ignores the nuance of how different communities have historically been affected by colorism. An effective scale for measuring and reducing inconsistent experiences for a larger number of people must adequately reflect a wide range of skin tones that represent a diversity of communities. This is where Dr. Monk’s expertise and research prove particularly valuable.

One of the first areas where this technology will be used is in Google’s image-based products. So far, Google has relied heavily on the Fitzpatrick Scale for photo AI. The MST scale is now integrated into products such as Google Photos and Image Search, and will be expanded even more widely in the coming months.

Source: Google

And you?

What is your opinion on the subject?

See also:

Google Shut Down My Account For Sharing Historical Records They Called ‘Terrorist Activity’, Editor Says It Risks Losing Years Of Research

Google wants to work with the Pentagon again, despite employee concerns. The company reportedly offered to be a military cloud provider

Google unveils Logica, a new open-source logic programming language that compiles SQL and can run on Google BigQuery

Google is being sued for secretly collecting data from Android users through hidden and unapproved transmissions to its servers
