A Comprehensive Comparative Performance Analysis of Eigenfaces, Laplacianfaces and Orthogonal Laplacianfaces for Face Recognition
H. Arora1, K. Tayagi1, P. Sharma1 and K. Jain2
Face analysis and recognition is one of the most widely studied information-processing tasks. Face recognition requires some form of dimensionality reduction of the face dataset, and the many face representation and recognition techniques vary in complexity and efficiency. In this study, we present a comparative analysis of Eigenfaces, based on Principal Component Analysis (PCA); Laplacianfaces, based on Locality Preserving Projection (LPP); and Orthogonal Laplacianfaces, based on Orthogonal LPP (OLPP), on a standard face database, YALE. PCA is an eigenvector method that aims to preserve the global Euclidean structure of the face space; its goal is to find a set of mutually orthogonal basis functions that capture the directions of maximum variance in the data. LPP finds an embedding that preserves local information and obtains a face space that explicitly considers the manifold structure: the algorithm seeks a linear approximation to the eigenfunctions of the Laplace-Beltrami operator on the face manifold, which reflects the intrinsic structure of that manifold. OLPP builds on LPP; because LPP is non-orthogonal, reconstructing the data from its projections is difficult, whereas OLPP produces orthogonal basis functions and can have greater locality-preserving power than LPP. Nearest-neighbor classification in the reduced subspaces is used to obtain the error rates.
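To make the Eigenfaces pipeline concrete, the following is a minimal sketch (not the authors' implementation) of the PCA step and the nearest-neighbor classification described above, using only NumPy. The function names (`eigenfaces`, `project`, `nn_classify`) and the use of SVD to obtain the principal axes are illustrative choices, not taken from the paper.

```python
import numpy as np

def eigenfaces(X, k):
    """PCA on flattened face images.

    X: (n_samples, n_pixels) matrix, one flattened face per row.
    Returns the mean face and a (n_pixels, k) orthonormal basis
    ("eigenfaces") spanning the k directions of maximum variance.
    """
    mean = X.mean(axis=0)
    Xc = X - mean
    # Right singular vectors of the centred data are the principal axes.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:k].T  # mutually orthogonal basis functions
    return mean, W

def project(X, mean, W):
    """Map faces into the k-dimensional PCA subspace."""
    return (X - mean) @ W

def nn_classify(train_feats, train_labels, test_feats):
    """1-nearest-neighbor classification in the reduced space,
    using Euclidean distance."""
    dists = np.linalg.norm(
        test_feats[:, None, :] - train_feats[None, :, :], axis=2)
    return train_labels[dists.argmin(axis=1)]
```

A typical use would be to fit `eigenfaces` on the training split of a face database, project both splits with `project`, and compare `nn_classify` predictions to the true labels to compute an error rate; the LPP and OLPP variants would replace only the basis-finding step while keeping the same classifier.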