Departmental Papers (ESE)

Document Type

Conference Paper

Date of this Version

July 2008

Comments

Reprinted from the Proceedings of the 25th International Conference on Machine Learning (ICML 2008), July 2008.
URL: http://icml2008.cs.helsinki.fi/index.shtml

Abstract

In this paper we propose a discriminant learning framework for problems in which data consist of linear subspaces instead of vectors. By treating subspaces as basic elements, we can make learning algorithms adapt naturally to problems with linear invariant structures. We propose a unifying view of subspace-based learning methods by formulating the problems on the Grassmann manifold, which is the set of fixed-dimensional linear subspaces of a Euclidean space. Previous methods on the problem typically adopt an inconsistent strategy: feature extraction is performed in the Euclidean space while non-Euclidean distances are used. In our approach, we treat each subspace as a point in the Grassmann space, and perform feature extraction and classification in the same space. We show the feasibility of the approach by using Grassmann kernel functions such as the Projection kernel and the Binet-Cauchy kernel. Experiments with real image databases show that the proposed method performs well compared with state-of-the-art algorithms.
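As an illustration of the two Grassmann kernels named in the abstract, the following sketch computes them for subspaces represented by orthonormal basis matrices. The function names and the use of NumPy are our own choices, not from the paper; the kernel formulas (Projection: squared Frobenius norm of Y1ᵀY2; Binet-Cauchy: squared determinant of Y1ᵀY2) follow the standard definitions.

```python
import numpy as np

def orthonormal_basis(X):
    """Orthonormalize the columns of X via QR, giving a basis of span(X)."""
    Q, _ = np.linalg.qr(X)
    return Q

def projection_kernel(Y1, Y2):
    """Projection kernel: k_P(Y1, Y2) = ||Y1^T Y2||_F^2.
    Y1, Y2 must be D x m matrices with orthonormal columns."""
    return np.linalg.norm(Y1.T @ Y2, "fro") ** 2

def binet_cauchy_kernel(Y1, Y2):
    """Binet-Cauchy kernel: k_BC(Y1, Y2) = det(Y1^T Y2)^2."""
    return np.linalg.det(Y1.T @ Y2) ** 2

# Example: a random 2-dimensional subspace of R^5 compared with itself
# yields k_P = m (here 2) and k_BC = 1; mutually orthogonal subspaces
# yield 0 under both kernels.
rng = np.random.default_rng(0)
Y = orthonormal_basis(rng.standard_normal((5, 2)))
print(projection_kernel(Y, Y))   # 2.0 (subspace dimension)
print(binet_cauchy_kernel(Y, Y)) # 1.0
```

Because both functions are positive-definite kernels on the Grassmann manifold, they can be plugged directly into kernel methods such as kernel discriminant analysis or an SVM.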


Date Posted: 12 December 2008