Abstract
This paper provides an algorithmic framework for learning statistical models involving directed spanning trees, or equivalently non-projective dependency structures. We show how partition functions and marginals for directed spanning trees can be computed by an adaptation of Kirchhoff's Matrix-Tree Theorem. To demonstrate an application of the method, we report experiments that use the algorithm to train both log-linear and max-margin dependency parsers. The new training methods give improvements in accuracy over perceptron-trained models.
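As an illustration of the core computation the abstract describes (and not the paper's own code), the partition function over directed spanning trees can be obtained from a determinant via the directed Matrix-Tree Theorem: build a Laplacian whose diagonal holds each node's total incoming weight, delete the root's row and column, and take the determinant of the remaining minor. The function names below are hypothetical, and the naive determinant is only suitable for tiny examples:

```python
def det(m):
    # Determinant by Laplace expansion along the first row
    # (exponential time; fine for the tiny matrices used here).
    if len(m) == 1:
        return m[0][0]
    total = 0.0
    for j, entry in enumerate(m[0]):
        sub = [row[:j] + row[j + 1:] for row in m[1:]]
        total += (-1) ** j * entry * det(sub)
    return total


def partition_function(weights, root=0):
    """Sum, over all spanning arborescences rooted at `root`
    (edges directed away from the root), of the product of edge
    weights -- computed via the directed Matrix-Tree Theorem.

    weights[i][j] is the weight of edge i -> j (0 if absent).
    """
    n = len(weights)
    # Laplacian: L[j][j] = total weight entering j; L[i][j] = -w(i, j).
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                L[i][j] = -weights[i][j]
                L[j][j] += weights[i][j]
    # Delete the root's row and column, then take the determinant.
    minor = [[L[i][j] for j in range(n) if j != root]
             for i in range(n) if i != root]
    return det(minor)


# Sanity check: the complete directed graph on 3 nodes with unit weights
# has 3^(3-2) = 3 spanning arborescences rooted at node 0.
w = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(partition_function(w, root=0))  # → 3.0
```

In the paper's setting the weights would be exponentiated edge scores, so this determinant gives the normalizer of a log-linear distribution over non-projective dependency trees; edge marginals then follow from the gradient of its log, which involves the inverse of the same minor.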
Original language | English
---|---
Pages | 141-150
Number of pages | 10
State | Published - 2007
Externally published | Yes
Event | 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, EMNLP-CoNLL 2007 - Prague, Czech Republic
Duration | 28 Jun 2007 → 28 Jun 2007
Conference
Conference | 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, EMNLP-CoNLL 2007
---|---
Country/Territory | Czech Republic
City | Prague
Period | 28/06/07 → 28/06/07