Gender Neutrality in Languages, and How Google Translate Failed to Realize It
December 16, 2022
I am no stranger to using online translators to look up words I didn't know or had forgotten for my essays back in high school, when I was learning English as a second language. It was, however, the unspoken truth at the time that translation machines couldn't be entirely trusted, as they often produced gibberish or hard-to-understand sentences. What if I told you that these translators can also be sexist in their results? Wild, right?
But before we get into this, let's get familiar with the term "gender neutrality" in the context of language. In some languages, like English and Thai, pronouns carry a gender distinction (he/she, เขา/เธอ, etc.). Other languages, like Turkish and Hungarian, have no gender distinction in pronouns (Turkish uses "o" for both the feminine and the masculine) (1).
Therefore, one of the most important things when implementing automated translators is to make sure they can carry the concept of grammatical gender across these two types of languages. However, the two posts below, from Facebook and Twitter, show problematic translations of gender-neutral sentences. From the posts, it seems that Google Translate simply associated Hungarian and Turkish gender-neutral pronouns with one of the English gender-specific pronouns; a simplified sketch of how that kind of choice can happen follows the tweet.
Hungarian is a gender neutral language, it has no gendered pronouns, so Google Translate automatically chooses the gender for you. Here is how everyday sexism is consistently encoded in 2021. Fuck you, Google. pic.twitter.com/EPqkEw5yEQ
— Dora Vargha @doravargha@mstdn.social (@DoraVargha) March 20, 2021
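To see how this kind of bias can get "consistently encoded," here is a heavily simplified, hypothetical sketch. The co-occurrence counts, function names, and example adjectives below are all invented for illustration; a real system learns statistics like these from its parallel training data, and this is in no way Google's actual pipeline.

```python
# Hypothetical, heavily simplified sketch of how a translator forced to
# output a gendered English pronoun can end up reproducing corpus stereotypes.
# The counts below are invented; a real system learns them from training data.
from collections import Counter

# Invented co-occurrence counts of (pronoun, adjective) pairs in training text.
CORPUS_COUNTS = Counter({
    ("he", "clever"): 900, ("she", "clever"): 300,
    ("he", "beautiful"): 200, ("she", "beautiful"): 800,
})

def pick_pronoun(adjective: str) -> str:
    """Pick whichever gendered pronoun co-occurs most often with the adjective."""
    he = CORPUS_COUNTS[("he", adjective)]
    she = CORPUS_COUNTS[("she", adjective)]
    return "he" if he >= she else "she"

def translate_neutral_sentence(adjective: str) -> str:
    """'Translate' a gender-neutral source sentence (e.g. Hungarian 'ő okos')."""
    return f"{pick_pronoun(adjective)} is {adjective}"

print(translate_neutral_sentence("clever"))     # he is clever
print(translate_neutral_sentence("beautiful"))  # she is beautiful
```

The source sentence never specified a gender, yet the output silently commits to one based purely on which pairing was more common in the data, which is exactly the pattern the tweet above calls out.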
Now (checked on December 16, 2022), if you try to recreate the texts from both posts, you will see Google's solution to this problem: it provides both alternative translations and displays a warning when a gender-neutral pronoun is detected. For example, if you type "o tembel" into Google Translate, where "tembel" means lazy, it now offers both "she is lazy" and "he is lazy".
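To make the fix concrete, here is a minimal sketch of the kind of post-processing step a translator could apply. Everything here is hypothetical, including the pronoun lists, the `translate_stub` helper, and the output format; it only illustrates the idea of returning both gendered renderings plus a note whenever the source sentence uses a gender-neutral third-person pronoun, and it is not Google's actual implementation.

```python
# Hypothetical sketch: surface both gendered renderings when the source
# language uses a gender-neutral third-person pronoun (e.g. Turkish "o",
# Hungarian "ő"). Not Google's real pipeline, just an illustration.

GENDER_NEUTRAL_PRONOUNS = {
    "tr": {"o"},   # Turkish third-person singular
    "hu": {"ő"},   # Hungarian third-person singular
}

def has_neutral_pronoun(sentence: str, source_lang: str) -> bool:
    """Return True if the sentence contains a gender-neutral pronoun."""
    tokens = sentence.lower().split()
    return any(tok in GENDER_NEUTRAL_PRONOUNS.get(source_lang, set()) for tok in tokens)

def translate_stub(sentence: str, source_lang: str, pronoun: str) -> str:
    """Placeholder for a real MT system asked to use a specific pronoun."""
    # A real implementation would call the underlying translation model here.
    return f"{pronoun} is lazy" if sentence == "o tembel" else sentence

def translate_with_alternatives(sentence: str, source_lang: str) -> dict:
    """Translate; if the source pronoun is neutral, return both alternatives plus a warning."""
    if has_neutral_pronoun(sentence, source_lang):
        return {
            "feminine": translate_stub(sentence, source_lang, "she"),
            "masculine": translate_stub(sentence, source_lang, "he"),
            "note": "Translations are gender-specific; the source pronoun is gender-neutral.",
        }
    return {"translation": translate_stub(sentence, source_lang, "")}

print(translate_with_alternatives("o tembel", "tr"))
# {'feminine': 'she is lazy', 'masculine': 'he is lazy', 'note': '...'}
```

The key design point, which the current Google Translate interface also reflects, is that the system refuses to silently pick a gender and instead exposes the ambiguity to the user.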
Even though this particular issue has been resolved, it is a great reminder for the developers behind automated translators to pay attention to how they implement their algorithms, so that no one has to be the victim of sexist translations again.
References
1. Farkas, A., & Németh, R. (2022). How to measure gender bias in machine translation: Real-world oriented machine translators, multiple reference points. Social Sciences & Humanities Open, 5(1), 100239. https://doi.org/10.1016/j.ssaho.2021.100239