Publication | Open Access

Prior Knowledge Integration for Neural Machine Translation using Posterior Regularization

Year: 2017 · Citations: 63 · References: 24

Abstract

Although neural machine translation has made significant progress recently, how to integrate multiple overlapping, arbitrary prior knowledge sources remains a challenge. In this work, we propose to use posterior regularization to provide a general framework for integrating prior knowledge into neural machine translation. We represent prior knowledge sources as features in a log-linear model, which guides the learning process of the neural translation model. Experiments on Chinese-English translation show that our approach leads to significant improvements.
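The core idea described above can be sketched in a few lines: prior knowledge sources are scored by feature functions in a log-linear model, which defines a prior distribution over candidate translations; a KL-divergence term then pulls the neural model's distribution toward that prior. The feature functions, weights, and candidates below are toy assumptions for illustration, not the paper's actual features or training setup.

```python
import math

def softmax(scores):
    # Numerically stable softmax over a list of real-valued scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def features(src, cand):
    # Hypothetical feature functions phi(x, y): word-coverage and a
    # length-ratio feature, chosen only to make the sketch concrete.
    covered = sum(1 for w in src if w in cand) / len(src)
    length_ratio = min(len(cand), len(src)) / max(len(cand), len(src))
    return [covered, length_ratio]

def prior_distribution(src, candidates, gamma):
    # Log-linear prior q(y|x) proportional to exp(gamma . phi(x, y)),
    # normalized over a finite candidate set.
    scores = [sum(g * f for g, f in zip(gamma, features(src, y)))
              for y in candidates]
    return softmax(scores)

def kl_regularizer(q, p):
    # KL(q || p): penalizes model probabilities p that stray from the
    # knowledge-derived prior q; this term would be added to the NMT loss.
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

src = "the cat sat".split()
cands = [c.split() for c in ("le chat", "le chat était assis", "bonjour")]
q = prior_distribution(src, cands, gamma=[1.0, 0.5])
p = [0.5, 0.3, 0.2]  # toy model probabilities over the same candidates
reg = kl_regularizer(q, p)
```

In training, `reg` would be weighted and combined with the usual likelihood objective, so the neural model is nudged toward translations the prior-knowledge features favor without hard constraints.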
