Mixture-of-Parents Maximum Entropy Markov Models

Penn collection
Departmental Papers (CIS)
Subject
Computer Sciences
Author
Rosenberg, David
Klein, Dan
Taskar, Ben
Abstract

We present the mixture-of-parents maximum entropy Markov model (MoP-MEMM), a class of directed graphical models extending MEMMs. The MoP-MEMM allows tractable incorporation of long-range dependencies between nodes by restricting the conditional distribution of each node to be a mixture of distributions given the parents. We show how to efficiently compute the exact marginal posterior node distributions, regardless of the range of the dependencies. This enables us to model non-sequential correlations present within text documents, as well as between interconnected documents, such as hyperlinked web pages. We apply the MoP-MEMM to a named entity recognition task and a web page classification task. In each, our model shows significant improvement over the basic MEMM, and is competitive with other long-range sequence models that use approximate inference.
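To see why the marginals stay exact, note that when each node's conditional is a mixture of single-parent conditionals, P(y_i | y_pa(i), x) = sum_k w_ik P_k(y_i | y_jk, x), linearity lets the posterior marginal of node i be written as a weighted sum of one-step propagations of its parents' marginals, computable in topological order no matter how long-range the links are. The following is a minimal sketch of that recursion in our own notation, not the paper's: all names are hypothetical, and the parent-specific conditional tables are assumed to be precomputed (in the actual MoP-MEMM they would come from a maximum entropy model evaluated at the observed input x).

    import numpy as np

    def mop_memm_marginals(n_nodes, n_labels, parents, weights,
                           cond_tables, root_marginal):
        # Nodes are assumed indexed in topological order, with node 0 a root.
        # cond_tables[i][k][a, b] plays the role of P_k(y_i = b | y_parent = a),
        # and weights[i] are the node's mixture weights (nonnegative, sum to 1).
        mu = [None] * n_nodes
        mu[0] = np.asarray(root_marginal, dtype=float)
        for i in range(1, n_nodes):
            m = np.zeros(n_labels)
            for k, j in enumerate(parents[i]):
                # Each mixture component propagates its parent's marginal
                # independently; only single-node marginals are ever needed.
                m += weights[i][k] * (mu[j] @ cond_tables[i][k])
            mu[i] = m
        return mu

    # Toy chain of 3 nodes; node 2 mixes its immediate parent (node 1)
    # and a long-range parent (node 0) with equal weight.
    T = np.array([[0.9, 0.1], [0.2, 0.8]])
    mu = mop_memm_marginals(
        n_nodes=3, n_labels=2,
        parents={1: [0], 2: [1, 0]},
        weights={1: [1.0], 2: [0.5, 0.5]},
        cond_tables={1: [T], 2: [T, T]},
        root_marginal=[0.5, 0.5],
    )
    print(mu[2])  # exact posterior marginal of node 2

Under these assumptions the cost is linear in the total number of parent links, independent of how far back each link reaches, which is the tractability claim in the abstract.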

Publication date
2007-07-01
Comments
Mixture-of-Parents Maximum Entropy Markov Models (http://www.cis.upenn.edu/%7Etaskar/pubs/mop-memm.pdf), D. Rosenberg, D. Klein (http://www.cs.berkeley.edu/%7Eklein) and B. Taskar. Uncertainty in Artificial Intelligence (UAI) (http://www.cs.duke.edu/uai07/), Vancouver, BC, July 2007. Files licensed under a Creative Commons Attribution License (http://creativecommons.org/licenses/by/3.0/us/)