Oriented Discriminant Analysis (ODA)
2004-01-01T00:00:00Z (GMT)
Linear discriminant analysis (LDA) has been an active research topic over the last century. However, existing algorithms have several limitations when applied to visual data: LDA is optimal only for Gaussian-distributed classes with equal covariance matrices, and it can extract at most c-1 features, where c is the number of classes. Moreover, LDA does not scale well to high-dimensional data (it over-fits), and it does not necessarily minimize the classification error. In this paper, we introduce Oriented Discriminant Analysis (ODA), an extension of LDA that overcomes these drawbacks. Three main novelties are proposed:
• An optimal dimensionality reduction that maximizes the Kullback-Leibler divergence between classes. This allows us to model class covariances and to extract more than c-1 features.
• Several covariance approximations that improve classification in the small-sample case.
• A linear-time iterative majorization method for finding a locally optimal solution.
Several synthetic and real experiments on face recognition are reported.
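To make the objective concrete, the sketch below evaluates the closed-form Kullback-Leibler divergence between two Gaussian classes, both in the original space and after projecting onto a basis B, which is the quantity a KL-maximizing dimensionality reduction would optimize. This is an illustration under my own assumptions, not the paper's implementation; the function names are invented for this example.

```python
import numpy as np

def gaussian_kl(mu0, S0, mu1, S1):
    """Closed-form KL divergence KL(N(mu0, S0) || N(mu1, S1)).

    KL = 0.5 * [ tr(S1^-1 S0) + (mu1-mu0)^T S1^-1 (mu1-mu0)
                 - d + ln(det S1 / det S0) ]
    """
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(S1_inv @ S0)
        + diff @ S1_inv @ diff
        - d
        + np.log(np.linalg.det(S1) / np.linalg.det(S0))
    )

def projected_kl(B, mu0, S0, mu1, S1):
    """KL divergence between the two classes after projection onto
    the columns of B (d x k); a Gaussian stays Gaussian under a
    linear map, so the closed form applies in the subspace too."""
    return gaussian_kl(B.T @ mu0, B.T @ S0 @ B,
                       B.T @ mu1, B.T @ S1 @ B)

# Two 2-D Gaussian classes separated along the first axis.
mu0, S0 = np.array([0.0, 0.0]), np.eye(2)
mu1, S1 = np.array([2.0, 0.0]), np.eye(2)

# Projecting onto the separating direction preserves the divergence here.
B = np.array([[1.0], [0.0]])
print(gaussian_kl(mu0, S0, mu1, S1))   # 2.0
print(projected_kl(B, mu0, S0, mu1, S1))
```

Unlike LDA's between/within scatter ratio, this criterion keeps each class's own covariance in play, which is why more than c-1 discriminative directions can carry information.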