| Peer-Reviewed

Regularized Nonlinear Least Trimmed Squares Estimator in the Presence of Multicollinearity and Outliers

Received: 17 April 2018    Accepted: 29 May 2018    Published: 29 June 2018
Abstract

This study proposes a regularized robust Nonlinear Least Trimmed Squares (NLTS) estimator that relies on an Elastic net penalty in nonlinear regression. The regularization parameter was selected using a robust cross-validation criterion, and the optimal model coefficients were estimated through a Newton-Raphson iteration algorithm. A Monte Carlo simulation was conducted to verify the theoretical properties outlined in the methodology under scenarios with and without multicollinearity and in the presence of outliers. The proposed procedure performed well compared with the NLS and NLTS estimators, yielding relatively lower values of MSE and bias. Furthermore, a real data analysis demonstrated satisfactory performance of the suggested technique.
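The abstract describes the estimator only at a high level. The following is a minimal, hypothetical Python sketch of how a trimmed, elastic-net-penalized nonlinear fit of this general kind could be set up. The exponential model f, the trimming fraction, the smooth surrogate for the L1 term, and the Gauss-Newton style update used in place of a full Newton-Raphson step are all illustrative assumptions, not the authors' exact algorithm; the regularization parameter is likewise fixed here rather than chosen by robust cross-validation.

```python
import numpy as np

def f(x, beta):
    # Illustrative nonlinear model: an exponential growth curve
    # (an assumption, not the model used in the paper).
    return beta[0] * np.exp(beta[1] * x)

def jacobian(x, beta):
    # Analytic Jacobian of f with respect to beta = (beta0, beta1).
    e = np.exp(beta[1] * x)
    return np.column_stack([e, beta[0] * x * e])

def reg_nlts(x, y, beta0, trim=0.75, lam=0.1, alpha=0.5, n_iter=50, eps=1e-6):
    """Trimmed Gauss-Newton iterations with an elastic net penalty.

    trim  : fraction of observations retained (the h smallest squared residuals).
    lam   : regularization strength (in the paper this would come from robust CV).
    alpha : elastic net mixing weight (1 = pure L1, 0 = pure ridge).
    """
    beta = np.asarray(beta0, dtype=float)
    h = int(np.floor(trim * len(y)))
    for _ in range(n_iter):
        r = y - f(x, beta)
        keep = np.argsort(r ** 2)[:h]       # trimming: keep the h best-fitting points
        J = jacobian(x[keep], beta)
        rk = r[keep]
        # Smooth surrogate |b| ~ sqrt(b^2 + eps) so the L1 part has a usable
        # gradient and curvature; w holds the resulting weights.
        w = 1.0 / np.sqrt(beta ** 2 + eps)
        H = J.T @ J + lam * (alpha * np.diag(w) + (1.0 - alpha) * np.eye(beta.size))
        g = -J.T @ rk + lam * (alpha * w * beta + (1.0 - alpha) * beta)
        step = np.linalg.solve(H, g)
        beta = beta - step
        if np.linalg.norm(step) < 1e-8:     # converged
            break
    return beta

# Toy usage: clean exponential data with a few gross outliers injected.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0, 100)
y = 2.0 * np.exp(0.8 * x) + rng.normal(0.0, 0.2, size=x.size)
y[:5] += 10.0                               # outliers the trimming should discard
print(reg_nlts(x, y, beta0=[1.0, 1.0]))     # estimates should land near (2.0, 0.8)
```

In this sketch the trimmed subset is refreshed at every iteration, so gross outliers are excluded from both the gradient and the approximate Hessian once the fit is reasonably close, while the ridge component of the penalty keeps the linear system well conditioned when predictors are nearly collinear.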

Published in American Journal of Theoretical and Applied Statistics (Volume 7, Issue 4)
DOI 10.11648/j.ajtas.20180704.14
Page(s) 156-162
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2018. Published by Science Publishing Group

Keywords

Elastic Net, Multicollinearity, Regularization, Nonlinear Least Trimmed Squares, Outliers

References
[1] Ando, T., Konishi, S., and Imoto, S. (2008). Nonlinear regression modeling via regularized radial basis function networks. Journal of Statistical Planning and Inference, 138(11): 3616-3633.
[2] Batterham, A. M., Tolfrey, K., and George, K. P. (1997). Nevill's explanation of Kleiber's 0.75 mass exponent: an artifact of collinearity problems in least squares models? Journal of Applied Physiology, 82: 693-697.
[3] Cizek, P. (2001). Nonlinear least trimmed squares. SFB Discussion Paper 25/2001, Humboldt University of Berlin.
[4] Farnoosh, R., Ghasemian, J., and Fard, O. S. (2012). Computational and Applied Mathematics, 31(2): 323-338.
[5] Hashem, H. (2014). Regularized and robust regression methods for high-dimensional data. Department of Mathematical Sciences, Brunel University.
[6] Hang, R., Liu, Q., Song, H., Zhu, F., and Pei, H. (2017). Graph regularized nonlinear ridge regression for remote sensing data analysis. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 10: 277-285.
[7] Jiang, X., Jiang, J., and Song, X. (2012). Oracle model selection for nonlinear models based on weighted composite quantile regression. Statistica Sinica, 22:1479-1506.
[8] Kamal, D. and Ali, B. H. (2015). Robust linear regression using L1-penalized MM-estimation for high dimensional data. American Journal of Theoretical and Applied Statistics, 4(3): 77-84.
[9] Khademzadeh, A., D, P. C. P., D, M. S. P., and Anagnostopoulos, G. C. (2013). Large-scale non-linear regression within the MapReduce framework.
[10] Lim, C. (2015). Robust ridge regression estimators for nonlinear models with applications to high throughput screening assay data. Statistics in Medicine.
[11] Ohlsson, H. (2010). Regularization for sparseness and smoothness: applications in system identification and signal processing. Linköping University Electronic Press.
[12] Park, H. (2013). Robust regression modelling via L1-type regularization. Department of Mathematics, Chuo University.
[13] Ramalho, E. and Ramalho, J. (2014). Moment-based estimation of nonlinear regression models with boundary outcomes and endogeneity, with applications to nonnegative and fractional responses. CEFAGE-UE Working Paper 2014/09.
[14] Sima, D. (2006). Regularization techniques in modeling and parameter estimation. PhD thesis.
[15] Tabatabai, M. A., Kengwoung-Keumo, J. J., Eby, W. M., Bae, S., and Manne, U. (2014). A new robust method for nonlinear regression. Journal of Biometrics & Biostatistics, 5.
[16] Tateishi, S., Matsui, H., and Konishi, S. (2009). Nonlinear regression via the lasso-type regularization. Journal of Statistical Planning and Inference.
[17] Tikhonov, A. N. (1943). On the stability of inverse problems. Dokl. Akad. Nauk SSSR, 39:176-179.
[18] Zou, H. and Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society, Series B, 67: 301-320.
[19] Zucker, D. M., Gorfine, M., Li, Y., Tadesse, M., and Spiegelman, D. (2013). A regularization corrected score method for nonlinear regression models with covariate error. Biometrics, 69(1): 80-90.
Cite This Article
  • APA Style

    George Kemboi Kirui Keitany, Ananda Omutokoh Kube, Joseph Mutua Mutisya, Fundi Daniel Muriithi. (2018). Regularized Nonlinear Least Trimmed Squares Estimator in the Presence of Multicollinearity and Outliers. American Journal of Theoretical and Applied Statistics, 7(4), 156-162. https://doi.org/10.11648/j.ajtas.20180704.14


    ACS Style

    George Kemboi Kirui Keitany; Ananda Omutokoh Kube; Joseph Mutua Mutisya; Fundi Daniel Muriithi. Regularized Nonlinear Least Trimmed Squares Estimator in the Presence of Multicollinearity and Outliers. Am. J. Theor. Appl. Stat. 2018, 7(4), 156-162. doi: 10.11648/j.ajtas.20180704.14


    AMA Style

    George Kemboi Kirui Keitany, Ananda Omutokoh Kube, Joseph Mutua Mutisya, Fundi Daniel Muriithi. Regularized Nonlinear Least Trimmed Squares Estimator in the Presence of Multicollinearity and Outliers. Am J Theor Appl Stat. 2018;7(4):156-162. doi: 10.11648/j.ajtas.20180704.14


  • @article{10.11648/j.ajtas.20180704.14,
      author = {George Kemboi Kirui Keitany and Ananda Omutokoh Kube and Joseph Mutua Mutisya and Fundi Daniel Muriithi},
      title = {Regularized Nonlinear Least Trimmed Squares Estimator in the Presence of Multicollinearity and Outliers},
      journal = {American Journal of Theoretical and Applied Statistics},
      volume = {7},
      number = {4},
      pages = {156-162},
      doi = {10.11648/j.ajtas.20180704.14},
      url = {https://doi.org/10.11648/j.ajtas.20180704.14},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ajtas.20180704.14},
      abstract = {This study proposes a regularized robust Nonlinear Least Trimmed Squares (NLTS) estimator that relies on an Elastic net penalty in nonlinear regression. The regularization parameter was selected using a robust cross-validation criterion, and the optimal model coefficients were estimated through a Newton-Raphson iteration algorithm. A Monte Carlo simulation was conducted to verify the theoretical properties outlined in the methodology under scenarios with and without multicollinearity and in the presence of outliers. The proposed procedure performed well compared with the NLS and NLTS estimators, yielding relatively lower values of MSE and bias. Furthermore, a real data analysis demonstrated satisfactory performance of the suggested technique.},
     year = {2018}
    }
    


  • TY  - JOUR
    T1  - Regularized Nonlinear Least Trimmed Squares Estimator in the Presence of Multicollinearity and Outliers
    AU  - George Kemboi Kirui Keitany
    AU  - Ananda Omutokoh Kube
    AU  - Joseph Mutua Mutisya
    AU  - Fundi Daniel Muriithi
    Y1  - 2018/06/29
    PY  - 2018
    N1  - https://doi.org/10.11648/j.ajtas.20180704.14
    DO  - 10.11648/j.ajtas.20180704.14
    T2  - American Journal of Theoretical and Applied Statistics
    JF  - American Journal of Theoretical and Applied Statistics
    JO  - American Journal of Theoretical and Applied Statistics
    SP  - 156
    EP  - 162
    PB  - Science Publishing Group
    SN  - 2326-9006
    UR  - https://doi.org/10.11648/j.ajtas.20180704.14
    AB  - This study proposes a regularized robust Nonlinear Least Trimmed Squares (NLTS) estimator that relies on an Elastic net penalty in nonlinear regression. The regularization parameter was selected using a robust cross-validation criterion, and the optimal model coefficients were estimated through a Newton-Raphson iteration algorithm. A Monte Carlo simulation was conducted to verify the theoretical properties outlined in the methodology under scenarios with and without multicollinearity and in the presence of outliers. The proposed procedure performed well compared with the NLS and NLTS estimators, yielding relatively lower values of MSE and bias. Furthermore, a real data analysis demonstrated satisfactory performance of the suggested technique.
    VL  - 7
    IS  - 4
    ER  - 


Author Information
  • George Kemboi Kirui Keitany, Department of Statistics and Actuarial Science, Kenyatta University (KU), Nairobi, Kenya

  • Ananda Omutokoh Kube, Department of Statistics and Actuarial Science, Kenyatta University (KU), Nairobi, Kenya

  • Joseph Mutua Mutisya, Department of Statistics and Actuarial Science, Kenyatta University (KU), Nairobi, Kenya

  • Fundi Daniel Muriithi, Department of Statistics and Actuarial Science, Kenyatta University (KU), Nairobi, Kenya
