
Medha Sarkar with and without a Snapchat filter.

An Unfiltered Response to Colorism in Instagram Filters

I have a small addiction to Instagram filters.  I can and have spent too much time finding the craziest filters possible.  There are filters that make you look like cartoons, princesses, and even pirates.  My favorite one is a filter that tints the screen a deep pink and makes it look like glitter is dripping down your face.  But as I explore the vast jungle of filters, it is inevitable that there are some marshes… 

Those marshes come in the form of filters that vastly change your appearance.  I encountered one of those filters on a Wednesday afternoon when I was supposed to be doing homework.  

I was extraordinarily tired from a long day of school and I decided to take a break from the seemingly endless pile of homework by scrolling through some filters.  There were the normal ones, the ones that put strawberries on your cheeks or the ones that make it look like you have rainbow hair.  Then I stumbled upon a filter that made me freeze.  

I had this image in my mind of the creator of this filter sitting down with their phone, sipping a cup of coffee, and then thinking aloud, “How colorist can we be today?” 

The filtered image gave me pale white skin and red-tinted lips that would make Snow White jealous.  My nose was slimmed down and my jaw was narrowed.  As I stared in shock at the image on my screen, a thousand words rushed into my head.  I subconsciously reached for my computer, angrily typed “blogspot.com” into the search bar, and began to write this.

Medha Sarkar with an Instagram filter that lightens her skin and changes her nose.

Now some readers might be asking why an Instagram filter would make my blood boil.  Why didn’t I just scroll to the next filter and forget it ever existed?

Because that image was clearly meant to make me beautiful.  It was meant to make me meet that beauty standard, and that beauty standard is being white.  The pale skin?  White.  The red lips?  White.  The slim nose?  White.  This filter is telling me that in order to be portrayed as beautiful or pretty, I have to aspire to be a white person.  This isn’t entirely Instagram’s fault, though.  Society has decided that looking like white people is the goal.  And it isn’t limited to filters or even appearance.

I remember when I first moved to a majority-white town, I began to realize that to be a part of the community, you had to throw away all semblance of uniqueness.  Culture was one of the first things to go: to gain the community’s acceptance, you had to reject it.

One time in my third-grade class, I decided to show some friends the pirouettes I had learned from my Indian Kathak dance lessons.  As I spun around, one of them glanced at their friend and began to snicker.  When I asked them why, they said my turns looked weird.  When I would bring in food from home, the word “exotic” would be mentioned at least once.  When I would insist that they pronounce my name right, they would give up after two tries and continue to use the white version of my name.  I saw it happen with the other Indian kids at my school.  They would introduce themselves with the white version of their name, bring Lunchables to school instead of idlis or sambar, and pursue ballet or other “white” activities instead of Hindustani singing or Bharatnatyam.  All of our culture was swept under the rug for the sake of the community.

This is an issue far bigger than filters.  You have to plant a small seed in order to grow a tree.  That seed can be taking an extra few minutes to pronounce someone’s name right, or treating all food like food, no matter the look or smell.  You can appreciate the culture somebody comes from, because it is what makes them radiate.  And you can make that filter you are creating more inclusive by removing the skin whitening, the nose slimming, and the lip tint.  It would make all of our lives a little better.


Medha Sarkar is a student starting at Los Gatos High School in the fall.  She enjoys writing, music, and having a good laugh.



Force Facebook to Shut Down WhatsApp

Defects in the design of Facebook’s WhatsApp platform may have led to as many as two dozen people losing their lives in India. With its communications encrypted end-to-end, there is no way for anyone to moderate posts, so WhatsApp has become “an unfiltered platform for fake news and religious hatred,” according to a Washington Post report.

WhatsApp is not used as broadly in the U.S. as in countries such as India, where it has become the dominant mode of mobile communication. But imagine Facebook or Twitter without any filters or moderation — the Wild Wild West they were becoming during the heyday of Cambridge Analytica. Now imagine millions of people who have never been online before becoming dependent on these platforms and trusting everything they read there. That gives you a sense of what kind of damage the messaging platform can do in India and other countries.

Earlier this month, India’s Ministry of Electronics and Information Technology sent out a stern warning to WhatsApp, asking it to immediately stop the spread of “irresponsible and explosive messages filled with rumours and provocation.” The Ministry said the platform “cannot evade accountability and responsibility specially when good technological inventions are abused by some miscreants who resort to provocative messages which lead to spread of violence.”

WhatsApp’s response, according to The Wire, was to offer minor enhancements, public education campaigns, and “a new project to work with leading academic experts in India to learn more about the spread of misinformation, which will help inform additional product improvements going forward.” The platform defended its need to encrypt messages and argued that “many people (nearly 25 percent in India) are not in a group” — in other words, only 75 percent of the population is affected!

One of the minor enhancements WhatsApp offered was to put the word “Forwarded” at the top of such messages. But this gives no information about the source of the original message, and even highly educated users could be misled into thinking a source is credible when it isn’t.

WhatsApp owner Facebook is using the same tactics it used when the United Nations found it had played “a determining role” in the genocide against Rohingya refugees in Myanmar: pleading ignorance, offering sympathy and small concessions, and claiming it was unable to do anything about it.

Here is the real issue: Facebook’s business model relies on people’s dependence on its platforms for practically all of their communications and news consumption, setting itself up as their most important provider of factual information — yet it takes no responsibility for the accuracy of that information.

Facebook’s marketing strategy begins with creating an addiction to its platform using a technique that former Google ethicist Tristan Harris has been highlighting: intermittent variable rewards. Casinos use this technique to keep us pouring money into slot machines; Facebook and WhatsApp use it to keep us checking news feeds and messages.

When Facebook added news feeds to its social-media platform, its intention was to become a primary source of information. It began by curating news stories to suit our interests and presenting them in a feed that we would see on occasion. Then it required us to go through this news feed in order to get to anything else. Once it had us trained to accept this, Facebook started monetizing the news feed by selling targeted ads to anyone who would buy them.

It was bad enough that, after its acquisition by Facebook, WhatsApp began providing the parent company with all kinds of information about its users so that Facebook could track and target them. But in order to make WhatsApp as addictive as Facebook’s social-media platform, Facebook added chat and news features to it — something it was not designed to accommodate. WhatsApp started off as a private, secure messaging platform; it wasn’t designed to be a news source or a public forum.

WhatsApp’s group-messaging feature is particularly problematic because users can remain anonymous, identified only by a mobile number. A motivated user can create or join unlimited numbers of groups and share hate-filled messages and fake news. What’s worse is that message encryption prevents law-enforcement officials and even WhatsApp itself from viewing what is being said. No consideration was given in the design of the product to the supervision and moderation necessary in public forums.

Facebook needs to be held liable for the deaths that WhatsApp has already caused and be required to take its product off the market until its design flaws are fixed. It isn’t making its defective products available only to sophisticated users who know what they have signed up for; it is targeting people who are first-time technology users, ignorant about the ways of the tech world.

Only by facing penalties and being forced to do a product recall will Facebook be motivated to correct WhatsApp’s defects. The technology industry always finds a way of solving problems when profits are at stake.

Vivek Wadhwa is a Distinguished Fellow at Harvard Law School and Carnegie Mellon’s School of Engineering at Silicon Valley. This piece is partly derived from his new book, “Your Happiness Was Hacked: Why Tech Is Winning the Battle to Control Your Brain — and How to Fight Back”. It has been reprinted with his permission.