Carnegie Mellon University

Oriented Discriminant Analysis (ODA)

journal contribution
posted on 2004-01-01, authored by Fernando de la Torre and Takeo Kanade
Linear discriminant analysis (LDA) has been an active topic of research over the last century. However, existing algorithms have several limitations when applied to visual data: LDA is optimal only for Gaussian-distributed classes with equal covariance matrices, and it can extract at most c-1 features (where c is the number of classes). Moreover, LDA does not scale well to high-dimensional data (over-fitting), and it does not necessarily minimize the classification error. In this paper, we introduce Oriented Discriminant Analysis (ODA), an extension of LDA that overcomes these drawbacks. Three main novelties are proposed:

- An optimal dimensionality reduction that maximizes the Kullback-Leibler divergence between classes. This allows us to model class covariances and to extract more than c-1 features.
- Several covariance approximations to improve classification in the small-sample case.
- A linear-time iterative majorization method to find a locally optimal solution.

Several synthetic and real experiments on face recognition are reported.
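As a minimal illustration of the objective ODA builds on, the sketch below (assuming NumPy; the function name `gaussian_kl` is ours, not from the paper) evaluates the closed-form Kullback-Leibler divergence between two Gaussian class models with unequal covariances, the setting in which LDA's equal-covariance assumption breaks down:

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL(N(mu0, cov0) || N(mu1, cov1)) for multivariate Gaussians."""
    k = mu0.shape[0]                      # dimensionality
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)      # trace term
                  + diff @ cov1_inv @ diff       # Mahalanobis term
                  - k                            # dimension offset
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

# Two 2-D classes with different covariances: LDA's equal-covariance
# assumption fails here, while ODA models each class covariance explicitly.
mu_a, cov_a = np.zeros(2), np.eye(2)
mu_b, cov_b = np.array([2.0, 0.0]), np.diag([4.0, 0.5])
print(gaussian_kl(mu_a, cov_a, mu_b, cov_b))   # ≈ 0.97
```

ODA seeks a low-dimensional projection that keeps divergences like this large between classes; this snippet only shows the divergence being maximized, not the optimization itself.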
