Translation ex machina (Shanghai Translation Company)

Published: 2019/01/25 00:00:00  Views: 1231

Most people, at least once in their lives, have probably used a free, online translation tool to quickly find out what a word or phrase means, but not many people actually know what’s happening behind the scenes.

Google Translate is just one of many engines performing so-called machine translation, usually simply referred to as MT. As opposed to human translation, machine translation is produced without human input.

There are various types of MT system, with rule-based, statistical, and neural being the most common.

Rule-based machine translation

Rule-based MT engines (sometimes referred to as "the classical approach") were introduced in the early 1970s and are still used today. They rely on hand-written rules and dictionaries, which makes them time-consuming and costly to build, but on the other hand they give total control over the output and don't require bilingual texts as a basis, so they can even be used for language pairs that have few or no existing translations between them.
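To make the idea concrete, here is a minimal, purely illustrative sketch of how a rule-based engine combines a hand-written dictionary with a hand-written grammar rule. The vocabulary and the single reordering rule are invented examples (and real systems also handle morphology, gender agreement, and much more):

```python
# Toy rule-based translation: a hand-written dictionary plus one
# hand-written grammar rule. Everything here is a simplified example.

DICTIONARY = {"the": "le", "red": "rouge", "car": "voiture"}

def translate(sentence):
    words = sentence.lower().split()
    # Hand-written rule: in the target language, the adjective
    # follows the noun it modifies ("red car" -> "car red").
    if len(words) == 3 and words[1] == "red":
        words = [words[0], words[2], words[1]]
    # Dictionary lookup; unknown words are left untranslated.
    return " ".join(DICTIONARY.get(w, w) for w in words)

print(translate("The red car"))  # -> "le voiture rouge"
```

Every rule and dictionary entry has to be written and maintained by hand, which is exactly why this approach is so labour-intensive, but also why it offers such predictable control over the output.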

Statistical machine translation

The most popular system is statistical MT. The idea goes all the way back to 1949 and stems from information theory, but it wasn't seriously developed until the late 1980s. This system relies on bilingual data such as translation memories and glossaries: it analyses patterns in that data and then translates according to the statistical probability that a given sentence in the source language should be rendered a certain way in the target language.
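The core idea can be sketched in a few lines: translation candidates extracted from bilingual data carry probabilities, and the engine picks the most likely one. The phrase table and the numbers below are invented for illustration (a real statistical engine would also use a language model and surrounding context, not isolated words):

```python
# Toy statistical translation: each source word has candidate
# translations with probabilities estimated from bilingual data
# (the values here are made up), and we pick the most likely one.

PHRASE_TABLE = {
    "bank":  [("banque", 0.7), ("rive", 0.3)],      # financial vs. river bank
    "river": [("rivière", 0.8), ("fleuve", 0.2)],
}

def most_likely(word):
    # Unknown words fall back to themselves with probability 1.0.
    candidates = PHRASE_TABLE.get(word, [(word, 1.0)])
    return max(candidates, key=lambda pair: pair[1])[0]

print(" ".join(most_likely(w) for w in "river bank".split()))
# -> "rivière banque"
```

Note the weakness this exposes: picking each word's most probable translation in isolation gets "river bank" wrong, which is why real systems score whole phrases and sentences rather than single words.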

Neural machine translation

Last but not least, there is neural MT (NMT), which is based on a large artificial neural network (a computing system inspired by how the brain works).

This is the newest system, introduced in 2016, but you can already find it powering popular services such as Facebook, Amazon and Google.

The basis for NMT is vector representations of words that are mapped according to their meaning (similarly to how humans make associations between various ideas), allowing the neural network to learn relations between them. Though NMT requires more time to train than other systems before it can be used, the translations it produces are more fluent than those from other MT systems.
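A rough sketch of what "vector representations mapped according to meaning" means: each word becomes a list of numbers, and words with related meanings end up close together, which can be measured with cosine similarity. The 2-D vectors below are invented toy values; real NMT systems learn vectors with hundreds of dimensions from data:

```python
import math

# Toy word vectors (invented 2-D values): related words are close
# together, unrelated words are far apart.
VECTORS = {
    "king":  [0.9, 0.1],
    "queen": [0.85, 0.2],
    "apple": [0.1, 0.95],
}

def cosine(a, b):
    # Cosine similarity: 1.0 means identical direction, 0.0 unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine(VECTORS["king"], VECTORS["queen"]))  # close to 1.0
print(cosine(VECTORS["king"], VECTORS["apple"]))  # much smaller
```

It is these learned relations between vectors, rather than hand-written rules or word-level probabilities, that let a neural network produce the more fluent output described above.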

While the Star Trek Universal Translator or Douglas Adams' Babel fish are still just fantasies, we already have a lot of tools and technology that, if used properly, can help us meet the growing demand for instant translation.
