International Journal of Theoretical and Applied Mathematics


A Review of Constrained Principal Component Analysis (CPCA) with Application on Bootstrap

Received: 10 August 2019    Accepted: 26 August 2019    Published: 10 September 2019

Abstract

The linear model (LM) marked a major advance in regression analysis and is considered one of the important statistical developments of the last fifty years, followed by the general linear model (GLM), principal component analysis (PCA), and constrained principal component analysis (CPCA) over the last thirty years. This paper introduces a series of papers prepared within the framework of an international workshop. First, the LM and GLM are discussed. Next, an overview of PCA is presented, followed by constrained principal component analysis. Several of its special cases are noted, including PCA, canonical correlation analysis (CANO), redundancy analysis (RA), correspondence analysis (CA), growth curve models (GCM), extended growth curve models (ExGCM), canonical discriminant analysis (CDA), constrained correspondence analysis, non-symmetric correspondence analysis, multiple-set CANO, multiple correspondence analysis, vector preference models, seemingly unrelated regression (SUR), weighted low-rank approximations, two-way canonical decomposition with linear constraints, and multilevel RA. Related methods are reviewed, and the ordinary least squares (OLS) estimator is introduced as a special case of CPCA. Finally, an example is presented to illustrate the importance of CPCA and the difference between PCA and CPCA. CPCA is a method for the structural analysis of multivariate data that combines features of regression analysis and principal component analysis: the original data are first decomposed into several components according to external information, and the components are then subjected to principal component analysis to explore the structures within them.
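To make the two-step procedure described above concrete, the following is a minimal sketch of a CPCA-style analysis in Python/NumPy, assuming only row-side external information. The data matrix Z, the external-information matrix G, and the helper function pca are hypothetical names introduced for this illustration; the full method also accommodates column-side constraints and metric (weight) matrices, which this sketch omits.

import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 100 cases, 6 variables, and 2 external variables on the cases.
n, p, q = 100, 6, 2
G = rng.normal(size=(n, q))                      # external information on the rows (cases)
Z = G @ rng.normal(size=(q, p)) + 0.5 * rng.normal(size=(n, p))
Z -= Z.mean(axis=0)                              # column-centred data matrix

# External analysis: split Z into the part explained by G and the residual part,
# using the orthogonal projector onto the column space of G (an OLS fit).
P_G = G @ np.linalg.solve(G.T @ G, G.T)
Z_explained = P_G @ Z
Z_residual = Z - Z_explained

def pca(X, k=2):
    # Internal analysis: PCA via the singular value decomposition of X.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :k] * s[:k], Vt[:k].T            # component scores and loadings

scores_con, loadings_con = pca(Z_explained)      # structure within the constrained part
scores_res, loadings_res = pca(Z_residual)       # structure within the residual part
scores_all, loadings_all = pca(Z)                # ordinary PCA of Z, for comparison

print("share of variance accounted for by G:",
      np.trace(Z_explained.T @ Z_explained) / np.trace(Z.T @ Z))

Resampling the rows of Z (together with the corresponding rows of G) and repeating these two steps would give a bootstrap assessment of the stability of the CPCA loadings, in the spirit of the application named in the title.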

DOI 10.11648/j.ijtam.20190502.11
Published in International Journal of Theoretical and Applied Mathematics (Volume 5, Issue 2, April 2019)
Page(s) 21-30
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2019. Published by Science Publishing Group

Keywords

General Linear Model, Principal Component Analysis, Constrained Principal Component Analysis, Bootstrap

Author Information
  • Alaa Ahmed Abd Elmegaly, Department of Advanced Management Sciences, Higher Institute of Advanced Management Sciences and Computers, Al-Buhayrah, Egypt

Cite This Article
  • APA Style

    Alaa Ahmed Abd Elmegaly. (2019). A Review of Constrained Principal Component Analysis (CPCA) with Application on Bootstrap. International Journal of Theoretical and Applied Mathematics, 5(2), 21-30. https://doi.org/10.11648/j.ijtam.20190502.11
