Peer-Reviewed

Effect of Correlation Between Abilities Under Between-Item Dimensionality

Received: 16 June 2022     Accepted: 22 July 2022     Published: 29 July 2022
Abstract

Item Response Theory (IRT) models the relationship between people's abilities and test items, and it includes both unidimensional and multidimensional models. A key assumption of the unidimensional IRT model is that the test measures only one dimension of ability. However, because abilities are latent, many datasets fitted with the unidimensional IRT model in fact reflect abilities from more than one dimension. To identify the consequences of fitting the unidimensional IRT model to correlated abilities, this research focuses on when correlated abilities can be treated as a single ability, on the possible pattern of misfit, and on whether the misfit is reduced as the correlation between abilities increases. The misfit is evaluated by fitting the unidimensional 2-parameter logistic (2PL) IRT model to datasets simulated with items measuring two correlated abilities. Dimensionality is examined for abilities correlated to different degrees, and the misfit of the unidimensional IRT model is assessed by comparing the item difficulties and item discriminations from the fitted model against the true parameters. The results show that when the correlation between the abilities is higher than 0.95, the unidimensional model can be fitted without bias. For all simulated datasets with correlations below 0.95, however, the item parameters estimated with the unidimensional model are biased, and the bias is not reduced by increasing the correlation as long as multiple factors are identified for the abilities.
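The data-generating design described in the abstract can be sketched in code. The following is a minimal simulation of between-item dimensionality under the 2PL model, where each item loads on exactly one of two correlated abilities; the function name, sample sizes, and parameter ranges are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def simulate_between_item_2pl(n_persons=1000, n_items_per_dim=10, rho=0.8, seed=0):
    """Simulate dichotomous responses under between-item dimensionality:
    each item measures exactly one of two abilities correlated at rho."""
    rng = np.random.default_rng(seed)
    # Two abilities drawn from a bivariate normal with correlation rho
    cov = np.array([[1.0, rho], [rho, 1.0]])
    theta = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n_persons)
    n_items = 2 * n_items_per_dim
    a = rng.uniform(0.8, 2.0, size=n_items)    # true item discriminations
    b = rng.uniform(-2.0, 2.0, size=n_items)   # true item difficulties
    dim = np.repeat([0, 1], n_items_per_dim)   # which ability each item measures
    # 2PL success probability uses only the item's own ability dimension
    logits = a * (theta[:, dim] - b)
    p = 1.0 / (1.0 + np.exp(-logits))
    responses = (rng.uniform(size=p.shape) < p).astype(int)
    return responses, theta, a, b, dim
```

In the study's design, a unidimensional 2PL model would then be fitted to `responses` (e.g., with an IRT package) and the recovered discriminations and difficulties compared against the true `a` and `b` to quantify bias at each value of `rho`.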

Published in American Journal of Theoretical and Applied Statistics (Volume 11, Issue 4)
DOI 10.11648/j.ajtas.20221104.11
Page(s) 109-113
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2022. Published by Science Publishing Group

Keywords

Item Response Theory, Ability, IRT Model, Item Dimensionality, Probability and Statistics, Factor Analysis

Cite This Article
  • APA Style

    Xiong Rao. (2022). Effect of Correlation Between Abilities Under Between-Item Dimensionality. American Journal of Theoretical and Applied Statistics, 11(4), 109-113. https://doi.org/10.11648/j.ajtas.20221104.11



Author Information
  • Department of Statistics and Data Science, University of Arizona, Tucson, United States
