The Neural MT Weekly

Issue #36 - Average Attention Network for Neural Machine Translation

Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic

In Issue #32, we covered the Transformer model for neural machine translation, which is the current state of the art in neural MT. In this post we explore a technique presented by Zhang et al. (2018) that modifies the Transformer model and speeds up the translation process by 4-7 times across a range of different engines...
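For readers curious about the mechanics, here is a minimal sketch of the operation at the heart of Zhang et al.'s Average Attention Network: the decoder's self-attention is replaced by a cumulative average over the positions seen so far, which is what makes decoding cheap. The function name, shapes, and use of NumPy below are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def average_attention(y):
    """Cumulative-average layer: g_j = (1/j) * sum_{k<=j} y_k.

    y: array of shape (seq_len, d_model), the decoder-side inputs.
    Returns an array of the same shape. The full AAN also applies a
    feed-forward network and a gating layer on top of this average,
    both omitted in this sketch.
    """
    cumulative = np.cumsum(y, axis=0)               # running sum over positions
    counts = np.arange(1, y.shape[0] + 1)[:, None]  # 1, 2, ..., seq_len
    return cumulative / counts
```

Because each average depends only on a running sum, a decoder can update it in constant time per generated token instead of attending over all previous positions.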

Read More
Issue #35 - Text Repair Model for Neural Machine Translation

Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic

Neural machine translation engines produce systematic errors that are not always easy to detect and correct in an end-to-end framework with millions of hidden parameters. One potential way to resolve these issues is to do so after the fact, correcting the errors by post-processing the output with an automatic post-editing (APE) step. This week we take...
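At a high level, APE wraps the translation engine in a second correction pass. The sketch below illustrates that two-step pipeline; `nmt_model` and `ape_model` are hypothetical stand-ins for an NMT engine and a trained post-editing model, not a real API.

```python
def translate_with_ape(source, nmt_model, ape_model):
    """Hypothetical two-step pipeline: translate, then repair.

    A typical APE model conditions on both the source sentence and
    the raw machine translation when proposing corrections.
    """
    draft = nmt_model.translate(source)       # first-pass NMT output
    return ape_model.correct(source, draft)   # post-edited translation
```

The appeal of this design is that the APE model can be trained and updated separately, targeting recurring error patterns without retraining the underlying translation engine.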

Read More