
Departmental Papers (CIS)
Date of this Version
7-2010
Document Type
Journal Article
Recommended Citation
Kuzman Ganchev, João Graça, Jennifer Gillenwater, and Ben Taskar, "Posterior Regularization for Structured Latent Variable Models", July 2010.
Abstract
We present posterior regularization, a probabilistic framework for structured, weakly supervised learning. Our framework efficiently incorporates indirect supervision via constraints on posterior distributions of probabilistic models with latent variables. Posterior regularization separates model complexity from the complexity of the structural constraints it is desired to satisfy. By directly imposing decomposable regularization on the posterior moments of latent variables during learning, we retain the computational efficiency of the unconstrained model while ensuring desired constraints hold in expectation. We present an efficient algorithm for learning with posterior regularization and illustrate its versatility on a diverse set of structural constraints such as bijectivity, symmetry, and group sparsity in several large-scale experiments, including multi-view learning, cross-lingual dependency grammar induction, unsupervised part-of-speech induction, and bitext word alignment.
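As a rough sketch of the idea summarized in the abstract (the notation $q$, $\mathcal{Q}_x$, $\boldsymbol{\phi}$, $\mathbf{b}$ here is illustrative rather than quoted from the paper): constraints are imposed on posteriors only through expectations of constraint features,
\[
  \mathcal{Q}_x \;=\; \bigl\{\, q(\mathbf{z}) \;:\; \mathbb{E}_q[\boldsymbol{\phi}(\mathbf{x},\mathbf{z})] \le \mathbf{b} \,\bigr\},
\]
and learning maximizes the marginal log-likelihood penalized by the divergence of the model posterior from this constraint set,
\[
  J_{\mathcal{Q}}(\theta) \;=\; \log p_\theta(\mathbf{x}) \;-\; \min_{q \in \mathcal{Q}_x} \mathrm{KL}\bigl(q(\mathbf{z}) \,\|\, p_\theta(\mathbf{z} \mid \mathbf{x})\bigr).
\]
This can be optimized by an EM-style alternation: a projection step computes the KL projection of $p_\theta(\mathbf{z} \mid \mathbf{x})$ onto $\mathcal{Q}_x$, and a standard M-step then updates $\theta$ using the projected distribution in place of the exact posterior, which is what keeps the per-iteration cost close to that of the unconstrained model.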
Date Posted: 16 July 2012
This document has been peer reviewed.
Comments
"Posterior Regularization for Structured Latent Variable Models", K. Ganchev, J. Graça, J. Gillenwater, and B. Taskar, Journal of Machine Learning Research (JMLR), July 2010.
Copyright held by the authors.