Carnegie Mellon University

Language Modeling with Power Low Rank Ensembles

Journal contribution
Posted on 2014-10-01, authored by Ankur P. Parikh, Avneesh Saluja, Chris Dyer, and Eric P. Xing

We present power low rank ensembles (PLRE), a flexible framework for n-gram language modeling where ensembles of low rank matrices and tensors are used to obtain smoothed probability estimates of words in context. Our method can be understood as a generalization of n-gram modeling to non-integer n, and includes standard techniques such as absolute discounting and Kneser-Ney smoothing as special cases. PLRE training is efficient, and our approach outperforms state-of-the-art modified Kneser-Ney baselines both in perplexity on large corpora and in BLEU score on a downstream machine translation task.
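The abstract notes that absolute discounting and Kneser-Ney smoothing arise as special cases of PLRE. For background only, the sketch below shows standard interpolated Kneser-Ney smoothing for bigrams, not the paper's PLRE implementation; the function name, the discount value of 0.75, and the toy corpus are illustrative assumptions.

```python
# Minimal sketch of interpolated Kneser-Ney bigram smoothing (background only,
# not the PLRE method described in the paper). Discount of 0.75 is a common default.
from collections import Counter, defaultdict

def kneser_ney_bigram(tokens, discount=0.75):
    """Return a function prob(word, context) giving the smoothed bigram probability."""
    bigrams = Counter(zip(tokens, tokens[1:]))
    context_counts = Counter(tokens[:-1])
    # Continuation counts: number of distinct contexts each word follows.
    continuation = Counter(w for (_, w) in bigrams)
    # Number of distinct words observed after each context.
    followers = defaultdict(int)
    for (c, _) in bigrams:
        followers[c] += 1
    total_bigram_types = len(bigrams)

    def prob(word, context):
        # Discounted higher-order (bigram) estimate.
        higher = max(bigrams[(context, word)] - discount, 0.0) / context_counts[context]
        # Interpolation weight: probability mass freed up by discounting.
        lam = discount * followers[context] / context_counts[context]
        # Lower-order estimate uses continuation counts rather than raw unigram counts.
        lower = continuation[word] / total_bigram_types
        return higher + lam * lower

    return prob

corpus = "the cat sat on the mat the cat ate".split()
p = kneser_ney_bigram(corpus)
print(p("cat", "the"))  # smoothed P(cat | the)
```

In this formulation, mass subtracted from observed bigrams by the discount is redistributed to the lower-order continuation distribution; PLRE, as described in the abstract, recovers this kind of estimator as a special case of its low rank ensemble framework.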

History

Date: 2014-10-01
