A unigram orientation model for statistical machine translation
Christoph Tillmann
NAACL-HLT 2004
This paper presents a novel training algorithm for a linearly-scored block sequence translation model. The key component is a new procedure that directly optimizes the global scoring function used by an SMT decoder. In contrast to earlier work on SMT, no translation, language, or distortion model probabilities are used. Our method therefore employs less domain-specific knowledge and is both simpler and more extensible than previous approaches. Moreover, the training procedure treats the decoder as a black box, and thus can be used to optimize any decoding scheme. The training algorithm is evaluated on a standard Arabic-English translation task.
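To make the abstract's setup concrete, the following is a minimal sketch of a linearly-scored block sequence model with a perceptron-style discriminative update, where the decoder's output is treated as a black box. All names (feature dictionaries, `score_sequence`, `perceptron_update`, the learning rate) are illustrative assumptions, not the paper's actual implementation.

```python
from typing import Dict, List

# A block sequence is represented here as a list of per-block feature
# dictionaries; the global score is the dot product of a single weight
# vector with the summed block features (the "linearly-scored" model).
def score_sequence(weights: Dict[str, float],
                   block_features: List[Dict[str, float]]) -> float:
    total = 0.0
    for feats in block_features:
        for name, value in feats.items():
            total += weights.get(name, 0.0) * value
    return total

def perceptron_update(weights: Dict[str, float],
                      gold_feats: List[Dict[str, float]],
                      pred_feats: List[Dict[str, float]],
                      lr: float = 1.0) -> Dict[str, float]:
    """One discriminative update: move the weights toward the reference
    block sequence and away from the decoder's (black-box) prediction."""
    for feats, sign in ((gold_feats, 1.0), (pred_feats, -1.0)):
        for block in feats:
            for name, value in block.items():
                weights[name] = weights.get(name, 0.0) + sign * lr * value
    return weights
```

Because the update only needs the feature vectors of the gold and predicted sequences, any decoder that returns its best-scoring hypothesis can be plugged in unchanged, which is the sense in which the decoder is treated as a black box.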