The Wildcat Roar

The Student News Site of Novi High School

Image Equity – Is Your Skin Accurately Represented In Your Photos?

The Google Pixel’s camera uses improved tuning and algorithms to better represent individuals’ skin tones. (Courtesy of Google)

Cameras. We use them to capture moments of our lives and share them with millions online. We rely on our phone cameras to record precious images of occasions spent with friends and family. The question is: have you ever wondered whether your camera accurately portrays how you look?

What is Image Equity?

The saying “the camera cannot lie” is almost as old as photography itself, but it has never actually been true. People with darker skin tones know that many of their pictures don’t look like them at all. From a school ID photo that turns your skin orange to a portrait where you blend into a darker background, cameras do not always reflect what the eye sees. Image Equity is the term used to describe how accurately cameras capture your skin tone. Cameras have always had this skin tone bias, and there is a reason for that.

History of Camera Technology

The roots of this bias go back to the mid-1950s. Kodak, the most renowned film and camera company of the era, needed a film development kit that would consistently represent people’s skin tones. To calibrate the kit, photo labs needed reference pictures to match their prints against. Shirley Page, a Kodak employee, posed for the original “Shirley Cards,” which remained the standard until the 1990s, when more racially inclusive Shirley Cards finally became available. Sadly, the more inclusive cards came too late: new digital cameras in the 1990s were still tuned against the original Shirley Cards. Smartphones inherited this skin tone bias as well, until recently, when one company decided to solve the problem once and for all.

Google’s Real Tone Project

One company that is striving to fix skin tone bias is Google. Google is everywhere, from your browser to YouTube to the Google Pixel phone. According to Google, the Pixel is designed to “bring hardware and software together, with AI at the center, to help you be more creative and productive.” In a phone camera like the Pixel’s, AI plays a large role in how your photos are processed. The AI algorithm decides whether the picture needs to be brighter or darker, warmer or cooler, and more. Normally, if the AI detects a dark part of the picture, it brightens it up. This is where darker skin tones can lose their accuracy, and the only way to fix the problem is to teach the AI which edits are right and which are wrong.

The AI learns in much the same way Kodak’s film development kit did in the ’50s: from reference images. Google knew that to get the best and most consistent results from the AI, it would have to make new Shirley Cards, and it launched its Real Tone project for exactly that reason. Google says that “Real Tone is one effort to make photography more inclusive.” Google also knew it was important to get advice from people who know a lot about photographing darker skin tones, so it partnered with photographers who specialize in bringing out the rich colors of darker skin.
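To make the brightening problem concrete, here is a minimal sketch, in Python, of the difference between a naive auto-exposure step and a tone-aware one. This is not Google’s actual Real Tone pipeline; the function names, thresholds, and gain cap are illustrative assumptions only.

```python
# A toy illustration of the idea above: naive auto-exposure brightens any
# "dark" image toward a global target, which can wash out darker skin tones,
# while a tone-aware step caps the gain on detected skin so it stays close to
# its measured brightness. All values are made up; this is NOT Google's
# Real Tone algorithm.
import numpy as np


def naive_auto_exposure(image: np.ndarray, target_mean: float = 0.5) -> np.ndarray:
    """Scale the whole image so its average brightness hits target_mean."""
    gain = target_mean / max(image.mean(), 1e-6)
    return np.clip(image * gain, 0.0, 1.0)


def tone_aware_exposure(image: np.ndarray, face_mask: np.ndarray,
                        max_face_gain: float = 1.15) -> np.ndarray:
    """Limit brightening so the detected skin region keeps its captured tone."""
    global_gain = 0.5 / max(image.mean(), 1e-6)
    # Cap the gain on the face so it is not pushed far brighter than captured.
    face_gain = min(global_gain, max_face_gain)
    out = image * global_gain
    out[face_mask] = image[face_mask] * face_gain
    return np.clip(out, 0.0, 1.0)


# Toy example: a dim photo where the face (center patch) is darker than average.
rng = np.random.default_rng(0)
photo = rng.uniform(0.05, 0.35, size=(64, 64))
mask = np.zeros((64, 64), dtype=bool)
mask[24:40, 24:40] = True
photo[mask] = 0.22  # measured skin brightness

print("face after naive step:     ", round(float(naive_auto_exposure(photo)[mask].mean()), 3))
print("face after tone-aware step:", round(float(tone_aware_exposure(photo, mask)[mask].mean()), 3))
```

In this toy example the naive step more than doubles the face’s brightness, while the tone-aware step keeps it close to what the sensor actually measured.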

Along with this, Google has partnered with Harvard professor Dr. Ellis Monk. To improve the AI’s understanding of skin tones, Google open-sourced the Monk Skin Tone (MST) Scale, a ten-shade scale developed with Dr. Monk. The goal of the MST Scale is to move beyond the Fitzpatrick Scale, developed in 1975, which was biased toward lighter skin tones. This enables photographers and others who use the MST Scale to produce richer, more vibrant colors in images. The scale’s influence also shows up in the Google Photos app, as filters you can apply to your older, existing pictures, and in Google Search, where it gives searchers the option to filter results, such as makeup tutorials, across a more diverse set of skin tones.
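For a sense of how a ten-shade scale like MST can be put to work in software, here is a small hypothetical sketch that matches a measured skin color to its nearest swatch, the kind of label that could then drive a search filter or a tuning preset. The hex values, names, and matching method below are illustrative assumptions, not the official published MST data or Google’s code.

```python
# Hypothetical sketch: classify a measured skin color against a ten-shade
# palette by nearest RGB distance. The swatch values are placeholders for
# illustration only, not the official Monk Skin Tone swatches.
PLACEHOLDER_MST_SWATCHES = [
    "#f6ede4", "#f3e7db", "#f7ead0", "#eadaba", "#d7bd96",
    "#a07e56", "#825c43", "#604134", "#3a312a", "#292420",
]


def hex_to_rgb(hex_color: str) -> tuple:
    """Convert '#rrggbb' into an (r, g, b) tuple of ints."""
    h = hex_color.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in range(0, 6, 2))


def nearest_mst_tone(measured_hex: str) -> int:
    """Return the 1-based index of the closest placeholder swatch."""
    r, g, b = hex_to_rgb(measured_hex)
    distances = [
        (r - sr) ** 2 + (g - sg) ** 2 + (b - sb) ** 2
        for sr, sg, sb in map(hex_to_rgb, PLACEHOLDER_MST_SWATCHES)
    ]
    return distances.index(min(distances)) + 1


# Example: an average skin color sampled from a photo maps to one tone label,
# which a search filter could then use to group results.
print(nearest_mst_tone("#8d6a4f"))
```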

How Does This Affect Us?

Our daily lives are surrounded by technology that reflects us and how we are represented online. That representation is sometimes the first impression we make on people who haven’t met us. From work meetings on Zoom during COVID-19 to a family friend’s recent wedding photos, our pictures represent who we are. This is why it’s important to have technology that is diverse in its representation, enabling all of us to make our best first impression.

About the Contributor
Darsh Bhuva, Guest Writer
Darsh Bhuva is a junior. This is his first year as a guest writer and he will be publishing interesting articles on phones, cars, and the latest technology. He enjoys procrastinating by surfing the web for new cars and phones that have unique designs or features. He is striving to become a transportation designer in the future and enjoys drawing cars. If you have any topics that you want him to cover related to the things he likes, please reach out to him at [email protected].
