Carnegie Mellon University

Bayesian Language Modelling of German Compounds

Journal contribution
Posted on 2012-12-01, authored by Jan Botha, Chris Dyer, Phil Blunsom

In this work we address the challenge of augmenting n-gram language models according to prior linguistic intuitions. We argue that the family of hierarchical Pitman-Yor language models is an attractive vehicle through which to address the problem, and demonstrate the approach by proposing a model for German compounds. In our empirical evaluation the model outperforms a modified Kneser-Ney n-gram model in test set perplexity. When used as part of a translation system, the proposed language model matches the baseline BLEU score for English→German while improving the precision with which compounds are output. We find that an approximate inference technique inspired by the Bayesian interpretation of Kneser-Ney smoothing (Teh, 2006) offers a way to drastically reduce model training time with negligible impact on translation quality.
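For readers unfamiliar with the baseline mentioned above, the following is a minimal sketch of interpolated Kneser-Ney smoothing at the bigram level. It is not the paper's hierarchical Pitman-Yor model (which generalizes this estimator with per-level discount and strength parameters); the function names and the fixed discount `d=0.75` are illustrative assumptions.

```python
from collections import Counter, defaultdict

def kneser_ney_bigram(corpus, d=0.75):
    """Build an interpolated Kneser-Ney bigram estimator from a
    list of token lists. `d` is the absolute discount (assumed fixed
    here; in practice it is estimated from count-of-count statistics)."""
    bigrams = Counter()
    history_counts = Counter()
    preceders = defaultdict(set)   # distinct histories seen before w
    followers = defaultdict(set)   # distinct words seen after h
    for sent in corpus:
        for h, w in zip(sent, sent[1:]):
            bigrams[(h, w)] += 1
            history_counts[h] += 1
            preceders[w].add(h)
            followers[h].add(w)
    total_bigram_types = len(bigrams)

    def p_continuation(w):
        # Lower-order "continuation" probability: proportional to the
        # number of distinct contexts w appears in, not its raw frequency.
        return len(preceders[w]) / total_bigram_types

    def prob(h, w):
        if history_counts[h] == 0:
            return p_continuation(w)   # unseen history: back off entirely
        discounted = max(bigrams[(h, w)] - d, 0) / history_counts[h]
        backoff_mass = d * len(followers[h]) / history_counts[h]
        return discounted + backoff_mass * p_continuation(w)

    return prob
```

The continuation-count trick (counting distinct preceding contexts rather than raw frequency) is exactly what Teh (2006) showed falls out of the posterior predictive of a hierarchical Pitman-Yor process, which is the connection the abstract's approximate inference technique exploits.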


Publisher Statement

Copyright 2012 The COLING 2012 Organizing Committee.