The Neural MT Weekly

Issue #29: Improving Robustness in Neural MT

Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

Despite the high performance of current Neural MT engines, robustness remains a significant issue when it comes to unexpected, noisy input. When the input is not clean, output quality drops drastically. In this issue, we will take a look at the impact of various types of 'noise' on...
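One common remedy discussed in the robustness literature is training on synthetically noised input. The newsletter does not specify a method, but a minimal sketch of character-level noise injection (simulating typos via adjacent swaps and deletions) might look like this; the function name and probabilities are illustrative assumptions:

```python
import random

def add_char_noise(sentence, noise_prob=0.1, seed=0):
    """Inject simple character-level noise (adjacent swaps and deletions)
    into a sentence, simulating the typos found in real user input.
    noise_prob is the per-word chance of perturbation (an assumed default)."""
    rng = random.Random(seed)
    noisy_words = []
    for word in sentence.split():
        chars = list(word)
        # only perturb words long enough to stay recognisable
        if len(chars) > 3 and rng.random() < noise_prob:
            i = rng.randrange(len(chars) - 1)
            if rng.random() < 0.5:
                # swap two adjacent characters
                chars[i], chars[i + 1] = chars[i + 1], chars[i]
            else:
                # drop one character
                del chars[i]
        noisy_words.append("".join(chars))
    return " ".join(noisy_words)
```

Pairing such noised source sentences with their clean references during training is one way to make an engine less brittle on imperfect input.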

Issue #28: Hybrid Unsupervised Machine Translation

Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

In Issue #11 of this series, we first looked directly at the topic of unsupervised machine translation - training an engine without any parallel data. Since then, it has gone from a promising concept to one that can produce effective systems performing close to the level of fully supervised engines (trained with parallel data). The...

Issue #26: Context and Copying in Neural MT

Author: Raj Patel, Machine Translation Scientist @ Iconic

When translating from one language to another, certain words and tokens need to be copied into the target sentence rather than translated per se. This includes things like proper nouns, names, numbers, and 'unknown' tokens. We want these to appear in the translation just as they were in the original text. Neural MT systems with...
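One long-standing practical approach to this copying problem, predating learned copy mechanisms, is placeholder masking: protect the tokens before translation and restore them afterwards. The sketch below is not from the newsletter; it shows the round-trip for numbers only, with an assumed `__NUMi__` placeholder format:

```python
import re

def protect(sentence):
    """Replace numbers with placeholder tokens so the MT engine
    passes them through verbatim; returns masked text and the mapping."""
    mapping = {}
    def repl(match):
        token = f"__NUM{len(mapping)}__"
        mapping[token] = match.group(0)
        return token
    return re.sub(r"\d+(?:\.\d+)?", repl, sentence), mapping

def restore(translation, mapping):
    """Put the original values back into the translated output."""
    for token, value in mapping.items():
        translation = translation.replace(token, value)
    return translation
```

For example, `protect("Flight 714 departs at 9.45")` yields `"Flight __NUM0__ departs at __NUM1__"`, and applying `restore` to the engine's output re-inserts `714` and `9.45` untouched. Copy mechanisms built into the network aim to achieve the same effect without this pre/post-processing.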

Issue #25: Improving Neural MT with Cross-Language Model Pretraining

Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

One of the reasons for the success of Neural MT, and deep learning techniques in general, is the more effective and efficient use of large amounts of training data without too much overhead in inference time or model size. This also opens the door to...

Issue #24: Exploring Language Models for Neural MT

Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

Monolingual language models were a critical part of Phrase-based Statistical Machine Translation systems. They are also used in unsupervised Neural MT systems (unsupervised meaning that no parallel data is available to supervise training; only monolingual data is used). However, they are not used in standard supervised Neural MT engines, and training language...

Issue #23: Unbiased Neural MT

Author: Raj Patel, Machine Translation Scientist @ Iconic

A recent topic of conversation and interest in the area of Neural MT - and Artificial Intelligence in general - is gender bias. Neural models are trained using large text corpora which inherently contain social biases and stereotypes, and as a consequence, translation models inherit these biases. In this article, we’ll try to understand how gender...
