Who knew something as simple as an online translation from Google Translate could carry gender bias?
That’s exactly what Jihyun Kim, a senior from the Class of 2018, found out in her Politics of Code class at NYU Abu Dhabi.
For example, Turkish uses the gender-neutral third-person pronoun “o”, unlike English, which uses he or she. When you translate from Turkish into English, a sentence about an engineer comes back with he, while a sentence about a nurse comes back with she.
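Here is a minimal sketch of how one might probe that behavior programmatically. It assumes the unofficial googletrans Python package (version 3.x, where translate() is synchronous) rather than an official Google API, and the exact outputs may vary as Google updates its translation models:

    # Probe how Google Translate renders the gender-neutral Turkish pronoun "o".
    # Assumes the unofficial googletrans package (pip install googletrans==3.1.0a0);
    # results may differ as the underlying models change over time.
    from googletrans import Translator

    translator = Translator()

    # Both sentences use "o", which carries no gender information in Turkish.
    sentences = [
        "o bir mühendis",  # "o is an engineer"
        "o bir hemşire",   # "o is a nurse"
    ]

    for sentence in sentences:
        result = translator.translate(sentence, src="tr", dest="en")
        # Historically, the first often came back as "he is an engineer"
        # and the second as "she is a nurse".
        print(f"{sentence!r} -> {result.text!r}")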
All semester, Kim’s class questioned the neutrality of software systems like Google Translate and the effect social media has on the general public.
The class is taught by Professor Pierre Depaz, a lecturer of interactive media.
Applications in the Real World
The course assumes that software is political and can have an impact on politics and urban life.
To better understand this, students are introduced to media studies and political theory alongside workshops on JavaScript and Python, and are tasked with exploring solutions for building alternative software systems.
The course felt especially timely, Kim said, given the ways in which Facebook, which claimed to be a non-political software platform, has influenced world politics.
Kim was referring to news in March 2018 that Facebook data had been harvested to sway the political views of users during the 2016 United States presidential election. The scandal led to a congressional investigation and later prompted Facebook to notify users whose data had been compromised, as well as to make sweeping privacy policy changes.
Kim, a South Korean native, was quick to point out, however, that social media in the political realm is not all negative. After all, social media played a huge role in the 2017 impeachment of South Korean president Park Geun-hye by getting critical information to the public swiftly.
Impact of Human Biases in Software
The course allowed Kim to see computer science in a new light. It is not just about coding, she concluded, but about understanding how the design and implementation of software can have a powerful impact on politics, as well as urban living.
Human biases, like the gender bias seen in Google Translate, may seem harmless, but the class explores deeper questions, such as: what happens when racial biases are built into crime prediction or insurance pricing? Biases in such software could mean that people of a certain race are flagged as potential criminals at a higher rate, or that individuals incur higher insurance premiums due to such profiling. Software biases can have large consequences on an individual’s life, Kim said.
“Taking this class has definitely expanded the scope of my thinking to both understand biases embedded in software, and correct them where I can as a future software engineer,” Kim added.
After graduation, Kim heads to Singapore to work as a software engineer with an international investment bank, and she is excited to get started in the working world with a new perspective.