Multiplicative Updates for Nonnegative Quadratic Programming
Penn collection
General Robotics, Automation, Sensing and Perception Laboratory
Subject
Medicine and Health Sciences
Abstract
Many problems in neural computation and statistical learning involve optimizations with nonnegativity constraints. In this article, we study convex problems in quadratic programming where the optimization is confined to an axis-aligned region in the nonnegative orthant. For these problems, we derive multiplicative updates that improve the value of the objective function at each iteration and converge monotonically to the global minimum. The updates have a simple closed form and do not involve any heuristics or free parameters that must be tuned to ensure convergence. Despite their simplicity, they differ strikingly in form from other multiplicative updates used in machine learning. We provide complete proofs of convergence for these updates and describe their application to problems in signal processing and pattern recognition.
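To make the kind of update described in the abstract concrete, the sketch below implements one standard multiplicative update for a nonnegative quadratic program, minimizing F(v) = (1/2) v^T A v + b^T v subject to v >= 0. The decomposition of A into elementwise nonnegative parts, the specific update factor, the function name nqp_multiplicative_update, and the numerical safeguard in the denominator are illustrative assumptions about the common form of such updates, not a verbatim reproduction of the article's derivation or proofs.

import numpy as np

def nqp_multiplicative_update(A, b, v, num_iters=200):
    """Sketch: minimize F(v) = 0.5 v^T A v + b^T v subject to v >= 0
    by elementwise multiplicative updates (illustrative form)."""
    # Split A into nonnegative parts: A = A_plus - A_minus, both >= 0 elementwise.
    A_plus = np.maximum(A, 0.0)
    A_minus = np.maximum(-A, 0.0)
    for _ in range(num_iters):
        a = A_plus @ v                      # (A+ v)_i >= 0
        c = A_minus @ v                     # (A- v)_i >= 0
        # Each v_i is rescaled by a nonnegative factor, so the iterate stays in
        # the nonnegative orthant without any projection step.  The factor equals
        # 1 exactly where the unconstrained gradient (A v + b)_i vanishes.
        v = v * (-b + np.sqrt(b * b + 4.0 * a * c)) / (2.0 * a + 1e-12)
        # For an axis-aligned box l <= v <= u inside the nonnegative orthant,
        # the iterate would additionally be clipped to [l, u] at this point.
    return v

# Toy example: a strictly convex QP with a random positive-definite A.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 0.1 * np.eye(5)
b = rng.standard_normal(5)
v = nqp_multiplicative_update(A, b, np.ones(5))
print("solution:", v)
print("gradient at active coordinates:", (A @ v + b)[v > 1e-8])

Because the update is purely multiplicative, coordinates driven toward zero stay nonnegative automatically, which is what distinguishes this family of updates from gradient-projection methods for the same problem.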