The Neural MT Weekly


Author: Dr. John Tinsley, CEO @ Iconic

A little over a year ago, Koehn and Knowles (2017) wrote an aptly titled paper, “Six Challenges in Neural Machine Translation” (in fact, there were seven, but only six were empirically tested). The paper set out a number of areas which, despite the field's rapid development, still needed to be addressed by researchers and developers of Neural MT...

Read More

Author: Raj Nath Patel, Machine Translation Scientist @ Iconic

Machine Translation typically operates with a fixed vocabulary, i.e. it knows how to translate a finite number of words. This is obviously an issue, because translation is an open vocabulary problem: we might want to translate any possible word! This is particularly problematic for Neural MT, where the vocabulary needs to be limited at the...

Read More
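
The open-vocabulary problem described in the excerpt above is commonly addressed with subword segmentation, most famously byte-pair encoding (BPE) as proposed for NMT by Sennrich et al. (2016): frequent character sequences are iteratively merged into subword units, so rare and unseen words can still be composed from known pieces. The full post may discuss other approaches; the toy learner below is only an illustrative sketch of the BPE merge loop, not Iconic's implementation.

```python
from collections import Counter

def get_pair_counts(vocab):
    # Count adjacent symbol pairs, weighted by word frequency.
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    # Replace every occurrence of the pair with the merged symbol.
    a, b = pair
    return {word.replace(f"{a} {b}", f"{a}{b}"): freq
            for word, freq in vocab.items()}

def learn_bpe(corpus_words, num_merges):
    # Start from character-level symbols with an end-of-word marker,
    # then greedily merge the most frequent adjacent pair.
    vocab = Counter(" ".join(w) + " </w>" for w in corpus_words)
    merges = []
    for _ in range(num_merges):
        pairs = get_pair_counts(vocab)
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        vocab = merge_pair(best, vocab)
        merges.append(best)
    return merges

merges = learn_bpe(["low", "lower", "lowest", "low"], num_merges=3)
```

On this tiny corpus the first merge joins "l" and "o", since that pair occurs in every word; applying the learned merges to a new word segments it into known subword units instead of falling back to an unknown-word token.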

Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

“Garbage in, Garbage out” - noisy data is a big problem for all machine learning tasks, and MT is no different. By noisy data, we mean bad alignments, poor translations, misspellings, and other inconsistencies in the data used to train the systems. Statistical MT systems are more robust and can cope with up to 10% noise in...

Read More
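
The kinds of noise the excerpt above mentions (bad alignments, poor translations, empty or copied segments) are often attacked first with cheap heuristic filters before training. The filter below is a generic sketch of such heuristics, with thresholds (`max_len`, `max_ratio`) chosen for illustration only; it is not the filtering pipeline the full post describes.

```python
def looks_clean(src, tgt, max_len=200, max_ratio=3.0):
    # Heuristic filters commonly applied to parallel corpora:
    # drop empty segments, overly long segments, pairs with badly
    # mismatched lengths (a sign of misalignment), and pairs where
    # source equals target (likely untranslated copies).
    if not src.strip() or not tgt.strip():
        return False
    ns, nt = len(src.split()), len(tgt.split())
    if ns > max_len or nt > max_len:
        return False
    if max(ns, nt) / max(1, min(ns, nt)) > max_ratio:
        return False
    if src.strip() == tgt.strip():
        return False
    return True

pairs = [
    ("the cat sat", "le chat est assis"),
    ("hello", ""),                 # empty target: dropped
    ("a b c d e f g h", "x"),      # length mismatch: dropped
]
clean = [(s, t) for s, t in pairs if looks_clean(s, t)]
```

Only the first pair survives. Neural MT is less forgiving of noise than statistical MT, so even simple filters like these can measurably improve a trained system.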

Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

Training a neural machine translation engine is a time-consuming task. It typically takes a number of days, or even weeks, even when running on powerful GPUs. Reducing this time is a priority for any neural MT developer. In this post we explore recent work (Ott et al., 2018) in which, without compromising translation quality, they...

Read More
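
One ingredient of the speed-ups in Ott et al. (2018) is training with very large effective batches by accumulating gradients over several smaller "micro-batches" before each parameter update. The sketch below illustrates only the core identity that makes this safe: the size-weighted average of per-micro-batch gradients equals the gradient over the full batch. It uses a toy squared-error loss for clarity and stands in for no particular toolkit.

```python
def grad(w, batch):
    # Gradient of the mean squared error (w*x - y)^2 over a batch.
    return sum(2 * x * (w * x - y) for x, y in batch) / len(batch)

def accumulated_grad(w, micro_batches):
    # Sum per-micro-batch gradients weighted by micro-batch size,
    # then normalize: mathematically identical to one big-batch
    # gradient, but each micro-batch fits in GPU memory on its own.
    total, n = 0.0, 0
    for mb in micro_batches:
        total += grad(w, mb) * len(mb)
        n += len(mb)
    return total / n

data = [(1.0, 2.0), (2.0, 3.0), (3.0, 5.0), (4.0, 9.0)]
w = 0.5
big = grad(w, data)                              # one full-batch gradient
accum = accumulated_grad(w, [data[:2], data[2:]])  # two micro-batches
```

The two values agree exactly, which is why accumulation lets you scale the effective batch size (and with it, the learning rate schedule) without needing hardware that holds the whole batch at once.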

The field of Machine Translation is moving at as fast a pace as we've ever seen. Month on month, the number of research papers being published grows, with the majority, naturally, focusing on Neural MT. As a company at the forefront of this technology, it's critically important that we at Iconic stay up to date. Our world-class team of MT research scientists...

Read More