A Spectral Algorithm for Learning Hidden Markov Models

Penn collection
Statistics Papers
Discipline
Applied Statistics
Theory and Algorithms
Subject
hidden Markov models
latent variable models
observable operator models
time series
spectral algorithm
singular value decomposition
learning probability distributions
unsupervised learning
Author
Hsu, Daniel
Kakade, Sham M
Zhang, Tong
Abstract

Hidden Markov Models (HMMs) are one of the most fundamental and widely used statistical tools for modeling discrete time series. In general, learning HMMs from data is computationally hard (under cryptographic assumptions), and practitioners typically resort to search heuristics, which suffer from the usual local-optima issues. We prove that under a natural separation condition (bounds on the smallest singular value of the HMM parameters), there is an efficient and provably correct algorithm for learning HMMs. The sample complexity of the algorithm does not explicitly depend on the number of distinct (discrete) observations; it depends on this quantity only implicitly, through spectral properties of the underlying HMM. This makes the algorithm particularly applicable to settings with a large number of observations, such as natural language processing, where the observation space is sometimes the set of words in a language. The algorithm is also simple, employing only a singular value decomposition and matrix multiplications.
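The SVD-plus-matrix-multiplication recipe the abstract alludes to can be sketched in the observable-operator form. The following is a minimal illustration, not the paper's full learning algorithm: the tiny HMM, all variable names, and the use of exact population moments (in place of empirical estimates from sampled observation triples) are assumptions made for the sake of a self-contained example.

```python
import numpy as np

# Hypothetical small HMM (values chosen for illustration only):
# m = 2 hidden states, n = 3 observations.
# T[i, j] = Pr[h_{t+1} = i | h_t = j], O[i, j] = Pr[x_t = i | h_t = j].
pi = np.array([0.6, 0.4])
T = np.array([[0.7, 0.2],
              [0.3, 0.8]])
O = np.array([[0.5, 0.1],
              [0.3, 0.2],
              [0.2, 0.7]])
m, n = T.shape[0], O.shape[0]

# Low-order moments of the first three observations (here computed exactly;
# the learning algorithm would estimate them from sampled triples).
P1 = O @ pi                              # P1[i]    = Pr[x1 = i]
P21 = O @ T @ np.diag(pi) @ O.T          # P21[i,j] = Pr[x2 = i, x1 = j]
P3x1 = [O @ T @ np.diag(O[x]) @ T @ np.diag(pi) @ O.T  # Pr[x3=i, x2=x, x1=j]
        for x in range(n)]

# Spectral step: top-m left singular vectors of P21.
U, _, _ = np.linalg.svd(P21)
U = U[:, :m]

# Observable-operator parameterization built from U and the moments.
b1 = U.T @ P1
binf = np.linalg.pinv(P21.T @ U) @ P1
B = [U.T @ P3x1[x] @ np.linalg.pinv(U.T @ P21) for x in range(n)]

def spectral_prob(seq):
    """Pr[x_1, ..., x_t] via repeated application of the operators B[x]."""
    b = b1
    for x in seq:
        b = B[x] @ b
    return float(binf @ b)

def forward_prob(seq):
    """Reference: standard forward algorithm on the true HMM."""
    alpha = pi * O[seq[0]]
    for x in seq[1:]:
        alpha = O[x] * (T @ alpha)
    return float(alpha.sum())
```

With exact moments and the rank (singular-value) condition satisfied, `spectral_prob` reproduces the true joint probabilities computed by `forward_prob`; with empirical moments it becomes an estimator, and its accuracy hinges on the smallest-singular-value condition described in the abstract.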

Publication date
2012-09-01
Journal title
Journal of Computer and System Sciences