Peer-Reviewed

An Application of Multi-label Linear Discriminant Analysis and Binary Relevance K-Nearest Neighbor in Multi-label Classification of Annotated Images

Received: 4 March 2022    Accepted: 24 March 2022    Published: 31 March 2022
Abstract

Although Binary Relevance (BR) is an adaptive and conceptually simple multi-label learning technique, its inability to exploit label dependencies, together with other problems inherent in multi-label examples, makes it difficult for BR to generalize well when classifying real-world multi-label examples such as annotated images. To strengthen the generalization ability of Binary Relevance, this study used Multi-label Linear Discriminant Analysis (MLDA) as a preprocessing technique to address the label dependencies, the curse of dimensionality, and the label over-counting inherent in multi-labeled images. Binary Relevance with K-Nearest Neighbor as the base learner was then fitted, and its classification performance was evaluated on 1,000 randomly selected images, with a label cardinality of 2.149, covering the five most frequent categories in the Microsoft Common Objects in Context 2017 (MS COCO 2017) dataset, namely "person", "chair", "bottle", "dining table" and "cup". Experimental results showed that the micro-averaged precision, recall, and F1-score of Multi-label Linear Discriminant Analysis followed by Binary Relevance K-Nearest Neighbor (MLDA-BRKNN) were more than 30% higher on the 1,000 annotated images than the micro-averaged precision, recall, and F1-score of Binary Relevance K-Nearest Neighbor (BRKNN), which served as the reference classifier in this study.
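
The article itself does not include code, so the following is a minimal Python sketch of the pipeline the abstract describes, not the authors' implementation. It assumes X is a matrix of image feature vectors and Y a binary label-indicator matrix for the five categories; the MLDA step is a simplified, label-correlation-weighted variant of the projection of Wang, Ding and Huang, Binary Relevance is implemented by fitting one scikit-learn KNeighborsClassifier per label, and the feature dimensionality, neighborhood size, and placeholder data are illustrative assumptions rather than values from the study.

```python
# Minimal sketch of the MLDA-BRKNN pipeline described in the abstract.
# X: (n_samples, n_features) image feature vectors; Y: (n_samples, n_labels)
# binary label-indicator matrix. All data below are random placeholders.
import numpy as np
from scipy.linalg import eigh
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import precision_score, recall_score, f1_score


def mlda_transform(X, Y, n_components, reg=1e-6):
    """Simplified multi-label LDA (after Wang, Ding & Huang): label-correlation
    weighted scatter matrices followed by a generalized eigenproblem."""
    n, d = X.shape
    k = Y.shape[1]
    # Label-correlation weights soften the over-counting of multi-labeled samples.
    norms = np.linalg.norm(Y, axis=0)
    C = (Y.T @ Y) / (np.outer(norms, norms) + 1e-12)
    Z = Y @ C                                        # soft class weights, shape (n, k)
    mu = (Z.sum(axis=1) @ X) / (Z.sum() + 1e-12)     # weighted global mean
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in range(k):
        w = Z[:, c]
        m_c = (w @ X) / (w.sum() + 1e-12)            # weighted class mean
        diff = (m_c - mu)[:, None]
        Sb += w.sum() * (diff @ diff.T)              # between-class scatter
        Xc = X - m_c
        Sw += (Xc * w[:, None]).T @ Xc               # within-class scatter
    # Generalized eigenproblem Sb v = lambda (Sw + reg I) v; keep the top directions.
    evals, evecs = eigh(Sb, Sw + reg * np.eye(d))
    W = evecs[:, np.argsort(evals)[::-1][:n_components]]
    return X @ W, W


def fit_brknn(X_train, Y_train, n_neighbors=10):
    """Binary Relevance: one independent KNN classifier per label."""
    return [KNeighborsClassifier(n_neighbors=n_neighbors).fit(X_train, Y_train[:, j])
            for j in range(Y_train.shape[1])]


def predict_brknn(models, X_test):
    return np.column_stack([m.predict(X_test) for m in models])


# Illustrative run on placeholder data (1000 samples, 5 labels, mirroring the study's setup).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 128))                     # stand-in image features
Y = (rng.random((1000, 5)) < 0.4).astype(int)        # stand-in binary labels

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.3, random_state=0)
X_tr_r, W = mlda_transform(X_tr, Y_tr, n_components=4)
X_te_r = X_te @ W                                    # project the test set with the learned W
models = fit_brknn(X_tr_r, Y_tr)
Y_hat = predict_brknn(models, X_te_r)
print("micro precision:", precision_score(Y_te, Y_hat, average="micro", zero_division=0))
print("micro recall:   ", recall_score(Y_te, Y_hat, average="micro", zero_division=0))
print("micro F1:       ", f1_score(Y_te, Y_hat, average="micro", zero_division=0))
print("label cardinality:", Y.sum(axis=1).mean())    # analogous to the 2.149 reported above
```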

Published in International Journal of Data Science and Analysis (Volume 8, Issue 2)
DOI 10.11648/j.ijdsa.20220802.13
Page(s) 30-37
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2022. Published by Science Publishing Group

Keywords

Binary Relevance, K-Nearest Neighbor, Binary Relevance K-Nearest Neighbor (BRKNN), Multi-label Linear Discriminant Analysis (MLDA)

Cite This Article
  • APA Style

    Festus Malombe Mwinzi, Thomas Mageto, Victor Muthama. (2022). An Application of Multi-label Linear Discriminant Analysis and Binary Relevance K-Nearest Neighbor in Multi-label Classification of Annotated Images. International Journal of Data Science and Analysis, 8(2), 30-37. https://doi.org/10.11648/j.ijdsa.20220802.13

    ACS Style

    Festus Malombe Mwinzi; Thomas Mageto; Victor Muthama. An Application of Multi-label Linear Discriminant Analysis and Binary Relevance K-Nearest Neighbor in Multi-label Classification of Annotated Images. Int. J. Data Sci. Anal. 2022, 8(2), 30-37. doi: 10.11648/j.ijdsa.20220802.13

    AMA Style

    Festus Malombe Mwinzi, Thomas Mageto, Victor Muthama. An Application of Multi-label Linear Discriminant Analysis and Binary Relevance K-Nearest Neighbor in Multi-label Classification of Annotated Images. Int J Data Sci Anal. 2022;8(2):30-37. doi: 10.11648/j.ijdsa.20220802.13

  • @article{10.11648/j.ijdsa.20220802.13,
      author = {Festus Malombe Mwinzi and Thomas Mageto and Victor Muthama},
      title = {An Application of Multi-label Linear Discriminant Analysis and Binary Relevance K-Nearest Neighbor in Multi-label Classification of Annotated Images},
      journal = {International Journal of Data Science and Analysis},
      volume = {8},
      number = {2},
      pages = {30-37},
      doi = {10.11648/j.ijdsa.20220802.13},
      url = {https://doi.org/10.11648/j.ijdsa.20220802.13},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ijdsa.20220802.13},
      abstract = {Although Binary Relevance (BR) is an adaptive and conceptually simple multi-label learning technique, its inability to exploit label dependencies and other inherent problems in multi-label examples makes it difficult to generalize well in the classification of real-world multi-label examples like annotated images. Thus, to strengthen the generalization ability of Binary Relevance, this study used Multi-label Linear Discriminant Analysis (MLDA) as a preprocessing technique to take care of the label dependencies, the curse of dimensionality, and label over counting inherent in multi-labeled images. After that, Binary Relevance with K Nearest Neighbor as the base learner was fitted and its classification performance was evaluated on randomly selected 1000 images with a label cardinality of 2.149 of the five most frequent categories, namely; "person", "chair", "bottle", "dining table" and "cup" in the Microsoft Common Objects in Context 2017 (MS COCO 2017) dataset. Experimental results showed that micro averages of precision, recall, and f1-score of Multi-label Linear Discriminant Analysis followed by Binary Relevance K Nearest Neighbor (MLDA-BRKNN) achieved a more than 30% improvement in classification of the 1000 annotated images in the dataset when compared with the micro averages of precision, recall, and f1-score of Binary Relevance K Nearest Neighbor (BRKNN), which was used as the reference classifier method in this study.},
     year = {2022}
    }
    

  • TY  - JOUR
    T1  - An Application of Multi-label Linear Discriminant Analysis and Binary Relevance K-Nearest Neighbor in Multi-label Classification of Annotated Images
    AU  - Festus Malombe Mwinzi
    AU  - Thomas Mageto
    AU  - Victor Muthama
    Y1  - 2022/03/31
    PY  - 2022
    N1  - https://doi.org/10.11648/j.ijdsa.20220802.13
    DO  - 10.11648/j.ijdsa.20220802.13
    T2  - International Journal of Data Science and Analysis
    JF  - International Journal of Data Science and Analysis
    JO  - International Journal of Data Science and Analysis
    SP  - 30
    EP  - 37
    PB  - Science Publishing Group
    SN  - 2575-1891
    UR  - https://doi.org/10.11648/j.ijdsa.20220802.13
    AB  - Although Binary Relevance (BR) is an adaptive and conceptually simple multi-label learning technique, its inability to exploit label dependencies and other inherent problems in multi-label examples makes it difficult to generalize well in the classification of real-world multi-label examples like annotated images. Thus, to strengthen the generalization ability of Binary Relevance, this study used Multi-label Linear Discriminant Analysis (MLDA) as a preprocessing technique to take care of the label dependencies, the curse of dimensionality, and label over counting inherent in multi-labeled images. After that, Binary Relevance with K Nearest Neighbor as the base learner was fitted and its classification performance was evaluated on randomly selected 1000 images with a label cardinality of 2.149 of the five most frequent categories, namely; "person", "chair", "bottle", "dining table" and "cup" in the Microsoft Common Objects in Context 2017 (MS COCO 2017) dataset. Experimental results showed that micro averages of precision, recall, and f1-score of Multi-label Linear Discriminant Analysis followed by Binary Relevance K Nearest Neighbor (MLDA-BRKNN) achieved a more than 30% improvement in classification of the 1000 annotated images in the dataset when compared with the micro averages of precision, recall, and f1-score of Binary Relevance K Nearest Neighbor (BRKNN), which was used as the reference classifier method in this study.
    VL  - 8
    IS  - 2
    ER  - 

Author Information
  • Festus Malombe Mwinzi, Department of Statistics and Actuarial Sciences, Jomo Kenyatta University of Agriculture and Technology, Nairobi, Kenya

  • Thomas Mageto, Department of Statistics and Actuarial Sciences, Jomo Kenyatta University of Agriculture and Technology, Nairobi, Kenya

  • Victor Muthama, School of Pure and Applied Sciences, Kirinyaga University, Kirinyaga, Kenya
