Why Google Translate Is Not the Best Tool for E-learning Translations
Are you planning to use Google Translate to translate the content of your online course? Well, it's better you don't. Read on to find out why.
Jim is the product training manager at a multinational pharmaceutical company. Recently, his boss asked him to have the content of the online course on their latest antibiotic translated into 8 languages. Jim, who was running short of training dollars, used Google Translate to accomplish the task. The result was a complete disaster: the translations were riddled with lexical and grammatical errors.
Why is the output of Google Translate not very accurate?
To get the answer, we have to know how Google Translate works. The application uses a technique called Statistical Machine Translation (SMT). An SMT tool consults a corpus of pre-translated texts to render a piece of text from one language into another. The accuracy of the translation depends on the size and quality of this corpus and on the tool's ability to draw on it correctly. Let us look at an example to understand this better.
Suppose you need to translate the German sentence Ich fahre mit dem Bus into English. This sentence could mean one of two things: I travel by bus or I am travelling in a bus. When Google Translate finds both renderings in its corpus, it is “confused”, as it is not aware of the context in which the sentence is used. This can lead to a translation of poor quality.
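The idea above can be sketched in a few lines of code. This is a deliberately simplified toy, not how Google Translate is actually implemented: it just picks the candidate translation that appears most often in a (hypothetical) parallel corpus, which is why near-equal frequencies leave the tool guessing.

```python
# Toy illustration of statistical machine translation scoring.
# The counts below are hypothetical: they stand in for how often each
# English rendering of "Ich fahre mit dem Bus" appears in a corpus of
# pre-translated sentence pairs.
corpus_counts = {
    "I travel by bus": 52,
    "I am travelling in a bus": 48,
}

def most_likely_translation(counts):
    """Pick the candidate with the highest relative frequency."""
    total = sum(counts.values())
    best = max(counts, key=counts.get)
    return best, counts[best] / total

translation, probability = most_likely_translation(corpus_counts)
# With near-equal counts, the tool effectively "guesses" -- it has no
# notion of the context in which the sentence was actually used.
print(translation, probability)
```

With frequencies this close, the statistically "best" choice is barely better than a coin flip, which is exactly the situation the German example describes.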
Let us now look at some other limitations of Google Translate.
Inability to distinguish homographs
Homographs are words with the same spelling but different meanings, and the inability to distinguish between them can produce poor results. For instance, one of my friends tried to translate the sentence I will discount the bill with a bank into German using Google Translate. The application rendered the word bill as Rechnung, the term for the bill we usually receive from a shopkeeper after buying a product, not the financial instrument meant in this banking context. Surely, you wouldn’t want this sort of mistake to appear in the translation of your online course.
Use of an “intermediate” language in certain cases
Another important limitation of Google Translate (and, for that matter, of most SMT tools) is that it does not translate content directly for certain language pairs. It first converts the source text into English and then converts that English output into the target language. For instance, when I tried to translate an Arabic passage into Telugu, a South Asian language, Google Translate first rendered the Arabic passage in English and then translated that English content into Telugu. The result was a very poor translation, as the essence of the Arabic source was lost.
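The mechanics of this pivot step can be sketched as two chained dictionary lookups. All words and mappings below are hypothetical stand-ins; the point is only that when two distinct source words collapse into one English word at the pivot, the distinction can never be recovered in the target language.

```python
# Toy sketch of pivot ("intermediate" language) translation.
# Hypothetical example: many languages use separate words for a male
# and a female cousin, while English has only "cousin".
source_to_english = {
    "src_word_a": "cousin",  # hypothetical word for a male cousin
    "src_word_b": "cousin",  # hypothetical word for a female cousin
}
english_to_target = {
    "cousin": "tgt_word_generic",  # the target side sees only "cousin"
}

def pivot_translate(word):
    """Translate source -> English -> target; losses compound per hop."""
    english = source_to_english[word]
    return english_to_target[english]

# Both source words come out identical in the target language, even
# though the source language distinguished them.
print(pivot_translate("src_word_a") == pivot_translate("src_word_b"))
```

Each hop introduces its own errors, so a source-to-English mistake is baked into the English-to-target step, which is why pivoted translations tend to be worse than direct ones.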
Problems crop up when linguistic nuances have to be translated
Linguistic nuances are expressions specific to a particular language, and e-learning courses often contain nuances that are very hard to translate using Google Translate, because there is no equivalent word in the target language. If such an expression is not interpreted accurately, it can lead to confusion, as the learner cannot obtain the right information from the translation. A fine example to illustrate this point is the Japanese word “itadakimasu”. A literal translation would be I’ll have it, but the word is used in a much wider sense: to appreciate the person who cooked the food, the person who served it, and all the good things connected with eating. Subtleties such as these cannot be understood by Google Translate.
Google Translate is an SMT-based tool that relies on a corpus of pre-translated texts to render text from one language to another. It is better not to use it to translate e-learning content, as it cannot “understand” the context of the course. Moreover, Google Translate cannot distinguish homographs, uses an “intermediate” language for certain language pairs, and misses most linguistic nuances. Hope you liked this post.
How do you translate your online courses? We’d love to know.