Learning a kernel matrix for nonlinear dimensionality reduction

Penn collection
Departmental Papers (CIS)
Discipline
Computer Sciences
Subject
kernels
machine learning
Abstract

We investigate how to learn a kernel matrix for high dimensional data that lies on or near a low dimensional manifold. Noting that the kernel matrix implicitly maps the data into a nonlinear feature space, we show how to discover a mapping that unfolds the underlying manifold from which the data was sampled. The kernel matrix is constructed by maximizing the variance in feature space subject to local constraints that preserve the angles and distances between nearest neighbors. The main optimization involves an instance of semidefinite programming---a fundamentally different computation than previous algorithms for manifold learning, such as Isomap and locally linear embedding. The optimized kernels perform better than polynomial and Gaussian kernels for problems in manifold learning, but worse for problems in large margin classification. We explain these results in terms of the geometric properties of different kernels and comment on various interpretations of other manifold learning algorithms as kernel methods.
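
The optimization described above can be sketched compactly as a semidefinite program: maximize the trace of a centered, positive semidefinite kernel matrix subject to constraints that preserve local distances. The snippet below is a minimal illustration using cvxpy and scikit-learn, not the authors' Matlab implementation; the function name mvu_embedding and the parameter defaults are illustrative assumptions. For brevity it constrains only point-to-neighbor distances, whereas the paper also preserves angles by fixing distances between neighbors of a common point.

```python
import numpy as np
import cvxpy as cp
from sklearn.neighbors import NearestNeighbors

def mvu_embedding(X, n_neighbors=4, n_components=2):
    """Sketch of the kernel-learning SDP: maximize trace(K), the variance
    in feature space, subject to centering and local distance constraints,
    then eigendecompose K to read off the unfolded coordinates."""
    n = X.shape[0]
    # k-nearest-neighbor graph decides which pairwise distances to preserve
    # (+1 because each point is returned as its own nearest neighbor)
    nbrs = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X)
    _, idx = nbrs.kneighbors(X)

    K = cp.Variable((n, n), PSD=True)   # learned kernel matrix, K >= 0
    constraints = [cp.sum(K) == 0]      # center the embedding at the origin
    for i in range(n):
        for j in idx[i, 1:]:            # skip the point itself
            d2 = float(np.sum((X[i] - X[j]) ** 2))
            # local isometry: ||phi(x_i) - phi(x_j)||^2 = ||x_i - x_j||^2
            constraints.append(K[i, i] - 2 * K[i, j] + K[j, j] == d2)

    cp.Problem(cp.Maximize(cp.trace(K)), constraints).solve()

    # top eigenvectors of the learned kernel give the embedding
    w, V = np.linalg.eigh(K.value)
    top = np.argsort(w)[::-1][:n_components]
    return V[:, top] * np.sqrt(np.clip(w[top], 0, None))
```

Because the semidefinite variable has n^2 entries, a general-purpose solver like this will only handle modest numbers of points; it is meant to make the structure of the program concrete, not to reproduce the experiments.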

Date of presentation
2004-07-04
Conference name
Twenty-First International Conference on Machine Learning (ICML 2004)
Conference dates
July 4-8, 2004
Conference location
Banff, Alberta, Canada
Comments
Proceedings of the Twenty-First International Conference on Machine Learning (ICML 2004), held 4-8 July 2004 in Banff, Alberta, Canada. A Matlab supplement is available at http://www.seas.upenn.edu/~kilianw/sde/download.htm.