Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic I...
Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic Interlingua Neural MT (NMT) archit...
Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic The Transformer is a state-of-the-art Neural MT ...
Author: Raj Patel, Machine Translation Scientist @ Iconic In two of our earlier po...
Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic Back in Issue #15, we looked at the topic of...
Author: Dr. Marta R. Costa-jussà, a Ramón y Cajal Researcher, TALP Research Center, Universitat Politècnica de Catalunya, Barcelona This week, we have a guest post from Marta R. Costa-jussà, a Ramón y Cajal Researcher from the TALP Research Center at the Univ...
Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic Zero-shot machine translation - a...
Author: Dr. Rohit Gupta, Sr. Machine Translation Scientist @ Iconic In Issue #32, we covered the Transformer model for neural machine translation, which is the state of the art in neural MT. In this post we explore a technique presented by Zhang et al. (2018), which modifies the Transformer model and speeds up the translation process by 4-7 times across a range of different engines...
Author: Dr. Patrik Lambert, Machine Translation Scientist @ Iconic Neural machine translation engines produce systematic errors which are not always easy to detect and correct in an end-to-end framework with millions of hidden parameters. One potential way to resolve these issues is doing so after the fact - correcting the errors by post-processing the output with an automatic post-editing (APE) step. This week we take...