Low-Resource Languages

Issue #28 – Hybrid Unsupervised Machine Translation

Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

In Issue #11 of this series, we first looked directly at the topic of unsupervised machine translation – training an engine without any parallel data. Since then, it has gone from a promising concept to one that can produce effective systems performing close to the level of fully supervised engines (trained with parallel data). The...

Read More
Issue #20 – Dynamic Vocabulary in Neural MT

Author: Dr. Raj Patel, Machine Translation Scientist @ Iconic

As has been covered a number of times in this series, Neural MT requires good data for training, and acquiring such data for new languages can be costly and not always feasible. One approach in the Neural MT literature for improving translation quality for low-resource languages is transfer learning. A common practice is to reuse the model...

Read More
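The transfer-learning practice mentioned in the Issue #20 teaser – reusing a model trained on a high-resource language pair while adapting its vocabulary to a new, low-resource pair – can be sketched in miniature. The function, dictionaries, and toy vectors below are purely illustrative assumptions, not code from the article; real systems would do this over the embedding matrices of a neural MT toolkit.

```python
import random

def transfer_embeddings(parent_emb, child_vocab, dim, seed=0):
    """Build a child embedding table for a low-resource language pair.

    Tokens shared with the high-resource parent model reuse the parent's
    trained vectors; tokens new to the child vocabulary are randomly
    initialised and would be learned during fine-tuning.
    """
    rng = random.Random(seed)
    child_emb = {}
    for token in child_vocab:
        if token in parent_emb:
            child_emb[token] = parent_emb[token]  # transferred from parent
        else:
            # New token: small random initialisation, to be trained.
            child_emb[token] = [rng.uniform(-0.1, 0.1) for _ in range(dim)]
    return child_emb

# Toy parent model trained on a high-resource pair (vectors are made up).
parent = {"the": [0.5, 0.1], "house": [0.2, 0.9]}
child = transfer_embeddings(parent, ["the", "etxea"], dim=2)
```

Here the shared token keeps its parent vector, while the unseen token gets a fresh vector of the same dimensionality, which is the essence of adapting the vocabulary without retraining from scratch.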