Separable linear discriminant analysis

Published: 01 December 2012

Abstract

Linear discriminant analysis (LDA) is a popular technique for supervised dimension reduction. Because LDA suffers from the curse of dimensionality when applied to 2D data, several two-dimensional LDA (2DLDA) methods have been proposed in recent years. Among these, the Y2DLDA method introduced by Ye et al. (2005) is an important development: the idea is to exploit the underlying 2D data structure to seek an optimal bilinear transformation. However, its iterative algorithm is not guaranteed to converge. In this paper, we show that using a bilinear transformation for 2D data is equivalent to modeling the covariance matrix of the 2D data as a separable covariance matrix. Based on this result, we propose a novel 2DLDA method called separable LDA (SLDA). The main contributions of SLDA are that (1) it provides interesting theoretical relationships between LDA and some 2DLDA methods; (2) it provides a building block for mixture extensions; and (3) unlike Y2DLDA, it admits a neat analytical solution, as in LDA. Empirical results show that the proposed SLDA achieves better recognition performance than Y2DLDA while being computationally much more efficient.
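
As a quick illustration of the separability idea in the abstract (a minimal sketch, not taken from the paper; NumPy and the array names L, R, X below are illustrative assumptions), the following Python snippet checks the standard Kronecker identity vec(L'XR) = (R' kron L') vec(X). This identity is why a bilinear (two-sided) transformation of a 2D observation X acts on vec(X) as a single Kronecker-structured linear map, which is how a separable covariance model enters the picture.

import numpy as np

rng = np.random.default_rng(0)
r, c = 5, 4           # rows and columns of a 2D observation X
k1, k2 = 2, 3         # reduced row and column dimensions
X = rng.standard_normal((r, c))
L = rng.standard_normal((r, k1))   # left (row-side) transformation
R = rng.standard_normal((c, k2))   # right (column-side) transformation

def vec(M):
    # column-stacking vectorization vec(M)
    return M.flatten(order="F")

lhs = vec(L.T @ X @ R)              # vectorized bilinear transform of X
rhs = np.kron(R.T, L.T) @ vec(X)    # one linear transform of vec(X)
print(np.allclose(lhs, rhs))        # True: the two views coincide

Under a matrix-normal (separable) model, Cov(vec(X)) likewise factors as a Kronecker product of a column covariance and a row covariance, which is the structural assumption the abstract refers to.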

References

[1]
Eigenfaces vs. Fisherfaces: recognition using class specific linear projection. IEEE Trans. Pattern Anal. Mach. Intell. v19 i7. 711-720.
[2]
Estimating stationary dipoles from MEG/EEG data contaminated with spatially and temporally correlated background noise. IEEE Trans. Signal Process. v50 i7. 1565-1572.
[3]
Ding, C., Ye, J., 2005. 2-dimensional singular value decomposition for 2D maps and images. In: Proceedings of SIAM International Conference on Data Mining, SDM 2005.
[4]
Factored principal components analysis, with applications to face recognition. Stat. Comput. v19 i3. 229-238.
[5]
Comparison of discrimination methods for the classification of tumors using gene expression data. J. Amer. Statist. Assoc. v97 i457. 77-87.
[6]
Regularized discriminant analysis. J. Amer. Statist. Assoc. v84 i405. 165-175.
[7]
Introduction to Statistical Pattern Recognition. 1990. Academic Press.
[8]
Comments on "On image matrix based feature extraction algorithms". IEEE Trans. Syst. Man Cybern. B. v37 i5. 1373-1374.
[9]
Gu, Q.Q., Zhou, J., 2009. Two dimensional maximum margin criterion. In: Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2009, vols. 1-8, pp. 1621-1624.
[10]
Matrix Variate Distributions. Chapman and Hall-CRC.
[11]
Discriminant analysis by Gaussian mixtures. J. R. Stat. Soc. Ser. B Stat. Methodol. v58 i1. 155-176.
[12]
The Elements of Statistical Learning: Data Mining, Inference, and Prediction. second ed. Springer.
[13]
Inoue, K., Urahama, K., 2006. Non-iterative two-dimensional linear discriminant analysis. In: 18th International Conference on Pattern Recognition, ICPR 18, vol. 2, pp. 540-543.
[14]
Generalized linear discriminant analysis: a unified framework and efficient model selection. IEEE Trans. Neural Netw. v19 i10. 1768-1782.
[15]
2D-LDA: a statistical linear discriminant analysis for image matrix. Pattern Recognit. Lett. v26 i5. 527-532.
[16]
Algebraic feature extraction for image recognition based on an optimal discriminant criterion. Pattern Recognit. v26 i6. 903-911.
[17]
Luo, D., Ding, C., Huang, H., 2009. Symmetric two dimensional linear discriminant analysis. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 2820-2827.
[18]
Spatial-temporal analysis of multivariate environmental monitoring data. In: Multivariate Environmental Statistics, Elsevier, Amsterdam, The Netherlands. pp. 347-386.
[19]
Maximum likelihood estimation via the ECM algorithm: a general framework. Biometrika. v80 i2. 267-278.
[20]
(2D)^2 LDA: an efficient approach for face recognition. Pattern Recognit. v39 i7. 1396-1400.
[21]
The FERET evaluation methodology for face recognition algorithms. IEEE Trans. Pattern Anal. Mach. Intell. v22 i10. 1090-1104.
[22]
Serrano, Á., Martín de Diego, I., Conde, C., Cabello, E., Shen, L., Bai, L., 2007. Influence of wavelet frequency and orientation in an SVM-based parallel Gabor PCA face verification system. In: Proceedings of the 8th International Conference on Intelligent Data Engineering and Automated Learning, IDEAL 2007, pp. 219-228.
[23]
Asymptotic Statistics. first ed. Cambridge University Press.
[24]
Finite mixtures of matrix normal distributions for classifying three-way data. Stat. Comput. v21 i4. 511-522.
[25]
On image matrix based feature extraction algorithms. IEEE Trans. Syst. Man Cybern. B. v36 i1. 194-197.
[26]
Estimating MIMO channel covariances from training data under the Kronecker model. Signal Process. v89 i1. 1-13.
[27]
Two-dimensional FLD for face recognition. Pattern Recognit. v38 i7. 1121-1124.
[28]
Multilinear discriminant analysis for face recognition. IEEE Trans. Image Process. v16 i1. 212-220.
[29]
Two-dimensional discriminant transform for face recognition. Pattern Recognit. v38 i7. 1125-1129.
[30]
Generalized low rank approximations of matrices. Mach. Learn. v61. 167-191.
[31]
Ye, J., Janardan, R., Li, Q., 2005. Two-dimensional linear discriminant analysis. In: The Eighteenth Annual Conference on Neural Information Processing Systems, NIPS 2004, pp. 1569-1576.
[32]
(2D)^2 PCA: two-directional two-dimensional PCA for efficient face representation and recognition. Neurocomputing. v69 i1-3. 224-231.
[33]
1D-LDA vs. 2D-LDA: when is vector-based linear discriminant analysis better than matrix-based?. Pattern Recognit. v41 i7. 2156-2172.

Cited By

  • (2024) Matrix-based vs. vector-based linear discriminant analysis. Information Sciences: an International Journal 654:C. doi:10.1016/j.ins.2023.119872. Online publication date: 1-Jan-2024.
  • (2021) EEG Signal Classification Using Manifold Learning and Matrix-Variate Gaussian Model. Computational Intelligence and Neuroscience 2021. doi:10.1155/2021/6668859. Online publication date: 1-Jan-2021.
  • (2015) 2DLDA as matrix-variate formulation of a separable 1DLDA. Pattern Recognition Letters 68:P1, 169-175. doi:10.1016/j.patrec.2015.09.013. Online publication date: 15-Dec-2015.

    Published In

    Computational Statistics & Data Analysis, Volume 56, Issue 12
    December, 2012
    682 pages

    Publisher

    Elsevier Science Publishers B. V.

    Netherlands


    Author Tags

    1. Face recognition
    2. Linear discriminant analysis
    3. Separable
    4. Two-dimensional data

    Qualifiers

    • Article
