Carnegie Mellon University

Viterbi Training for PCFGs: Hardness Results and Competitiveness of Uniform Initialization

Journal contribution, posted on 2010-07-01, authored by Shay B. Cohen and Noah A. Smith

We consider the search for a maximum likelihood assignment of hidden derivations and grammar weights for a probabilistic context-free grammar, the problem approximately solved by “Viterbi training.” We show that solving, and even approximating, Viterbi training for PCFGs is NP-hard. We motivate the use of uniform-at-random initialization for Viterbi EM as an optimal initializer in the absence of further information about the correct model parameters, providing an approximate bound on the log-likelihood.
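To make the procedure concrete, the sketch below implements Viterbi EM (hard EM) for a toy PCFG in Chomsky normal form: the hard E-step finds each sentence's maximum-probability derivation with CKY, and the M-step re-estimates rule weights by counting and normalizing over those derivations, starting from a uniform initializer. The grammar, corpus, and function names are illustrative assumptions, not the authors' implementation.

import math
from collections import defaultdict

# Toy PCFG in Chomsky normal form (an assumption for illustration).
BINARY = [("S", "NP", "VP"), ("NP", "Det", "N"), ("VP", "V", "NP")]
LEXICAL = [("Det", "the"), ("N", "dog"), ("N", "cat"),
           ("V", "saw"), ("V", "chased")]

def uniform_init():
    # Uniform initialization: each nonterminal's rules share its mass equally.
    counts = defaultdict(int)
    for rule in BINARY + LEXICAL:
        counts[rule[0]] += 1
    return {rule: 1.0 / counts[rule[0]] for rule in BINARY + LEXICAL}

def viterbi_parse(sentence, weights):
    # CKY Viterbi: return the maximum-probability derivation as a rule list.
    n = len(sentence)
    chart = defaultdict(dict)  # chart[(i, j)][A] = (log prob, backpointer)
    for i, word in enumerate(sentence):
        for (A, w) in LEXICAL:
            if w == word and weights[(A, w)] > 0:
                chart[(i, i + 1)][A] = (math.log(weights[(A, w)]), ("lex", (A, w)))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (A, B, C) in BINARY:
                    if B in chart[(i, k)] and C in chart[(k, j)] and weights[(A, B, C)] > 0:
                        lp = (math.log(weights[(A, B, C)])
                              + chart[(i, k)][B][0] + chart[(k, j)][C][0])
                        if A not in chart[(i, j)] or lp > chart[(i, j)][A][0]:
                            chart[(i, j)][A] = (lp, ("bin", (A, B, C), k, B, C))
    if "S" not in chart[(0, n)]:
        return None
    def backtrace(i, j, A):
        entry = chart[(i, j)][A][1]
        if entry[0] == "lex":
            return [entry[1]]
        _, rule, k, B, C = entry
        return [rule] + backtrace(i, k, B) + backtrace(k, j, C)
    return backtrace(0, n, "S")

def viterbi_em(corpus, iterations=10):
    # Hard E-step: Viterbi-parse every sentence under the current weights.
    # M-step: re-estimate each rule's weight by relative frequency of its
    # occurrences in the best derivations, normalized per left-hand side.
    weights = uniform_init()
    for _ in range(iterations):
        rule_counts = defaultdict(float)
        for sentence in corpus:
            derivation = viterbi_parse(sentence, weights)
            if derivation:
                for rule in derivation:
                    rule_counts[rule] += 1.0
        totals = defaultdict(float)
        for rule, count in rule_counts.items():
            totals[rule[0]] += count
        for rule in weights:
            if totals[rule[0]] > 0:
                weights[rule] = rule_counts[rule] / totals[rule[0]]
    return weights

corpus = [["the", "dog", "saw", "the", "cat"],
          ["the", "cat", "chased", "the", "dog"]]
print(viterbi_em(corpus))

Each iteration can only increase the corpus log-likelihood of the single best derivations, which is the objective the paper shows is NP-hard to optimize or approximate in general.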

History

Publisher Statement

Copyright 2010 ACL

Date

2010-07-01
