Carnegie Mellon University

Knowledge-Rich Morphological Priors for Bayesian Language Models

journal contribution
Posted on 2013-06-01. Authored by Victor Chahuneau, Noah A. Smith, Chris Dyer.

We present a morphology-aware nonparametric Bayesian model of language whose prior distribution uses manually constructed finite-state transducers to capture the word formation processes of particular languages. This relaxes the word independence assumption and enables sharing of statistical strength across, for example, stems or inflectional paradigms in different contexts. Our model can be used in virtually any scenario where multinomial distributions over words would be used. We obtain state-of-the-art results in language modeling, word alignment, and unsupervised morphological disambiguation for a variety of morphologically rich languages.
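To illustrate the central idea of sharing statistical strength across stems, here is a minimal sketch (not the paper's actual model or code): a Chinese-restaurant-process-smoothed unigram model whose base distribution factors each word into a stem and suffix. The toy suffix list, the greedy segmenter, and the add-one smoothing are all simplifying assumptions standing in for the paper's finite-state transducers; the point is only that observing "walk" raises the probability of the unseen inflection "talked" once "talk" has been seen.

```python
from collections import Counter

# Toy inflectional suffix list (assumption; the paper uses hand-built
# finite-state transducers instead of a fixed suffix list).
SUFFIXES = ["ed", "ing", "s", ""]

def segment(word):
    """Greedily split a word into (stem, suffix) using the toy suffix list."""
    for suf in SUFFIXES:
        if suf and word.endswith(suf) and len(word) > len(suf):
            return word[: len(word) - len(suf)], suf
    return word, ""

class MorphCRP:
    """CRP-smoothed unigram model whose base measure shares stem counts."""

    def __init__(self, alpha=1.0):
        self.alpha = alpha          # CRP concentration parameter
        self.word_counts = Counter()
        self.stem_counts = Counter()
        self.suffix_counts = Counter()
        self.n = 0                  # total observations

    def base_prob(self, word):
        # P0(word) = P(stem) * P(suffix), each add-one smoothed (assumption).
        stem, suf = segment(word)
        p_stem = (self.stem_counts[stem] + 1) / (self.n + len(self.stem_counts) + 1)
        p_suf = (self.suffix_counts[suf] + 1) / (self.n + len(SUFFIXES))
        return p_stem * p_suf

    def prob(self, word):
        # CRP predictive: cached whole-word count interpolated with the
        # morphological base distribution.
        return (self.word_counts[word] + self.alpha * self.base_prob(word)) / (
            self.n + self.alpha
        )

    def observe(self, word):
        stem, suf = segment(word)
        self.word_counts[word] += 1
        self.stem_counts[stem] += 1
        self.suffix_counts[suf] += 1
        self.n += 1

model = MorphCRP()
for w in ["walk", "walked", "walks", "talk"]:
    model.observe(w)

# "talked" is unseen, but its stem "talk" was observed, so it outscores
# an unseen word whose stem is also unseen.
assert model.prob("talked") > model.prob("jumped")
```

Because the base distribution is consulted whenever a word's own count is low, every observation of any inflection of a stem strengthens predictions for all of that stem's other forms, which is the "sharing across inflectional paradigms" the abstract describes.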

History

Publisher Statement

Copyright 2013 The Association for Computational Linguistics

Date

2013-06-01
