Peer-Reviewed

Optimizing Back-Propagation Gradient for Classification by an Artificial Neural Network

Received: 13 July 2014    Accepted: 28 July 2014    Published: 10 August 2014
Abstract

In a complex and changing remote sensing environment that demands fast, well-informed decisions, connectionist methods have proven highly valuable, in particular for the reduction and classification of spectral data. In this context, this paper studies the parameters that optimize the results of a multilayer-perceptron-based artificial neural network (ANN) for the classification of chemical agents in multi-spectral images. The mean squared error cost function remains one of the key factors governing the network's convergence during the learning phase, and the challenge addressed by our approach is to improve gradient descent using the conjugate gradient method, which proves fast and efficient.
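
As a rough illustration of the approach described above, the sketch below (a minimal example, not the authors' code; the network size, data, learning rate and iteration counts are assumptions chosen for the demonstration) trains a small multilayer perceptron on a toy two-class problem by minimizing the mean squared error, once with plain gradient descent and once with a nonlinear conjugate-gradient optimizer, so the two convergence behaviours can be compared.

```python
# Minimal sketch: gradient descent vs. conjugate gradient for MLP training.
# Everything here (data, layer sizes, learning rate) is illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy two-class "spectra": 200 samples, 6 bands, labels from a simple rule.
X = rng.normal(size=(200, 6))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

n_in, n_hid, n_out = 6, 8, 1

def unpack(w):
    """Split a flat parameter vector into the two weight matrices and biases."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                    # hidden layer
    o = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output
    return h, o

def mse(w):
    _, o = forward(w, X)
    return 0.5 * np.mean((o - y) ** 2)          # mean squared error cost

def grad(w):
    """Back-propagation of the MSE gradient through the two layers."""
    W1, b1, W2, b2 = unpack(w)
    h, o = forward(w, X)
    n = X.shape[0]
    do = (o - y) * o * (1 - o) / n              # output-layer delta
    dW2 = h.T @ do
    db2 = do.sum(axis=0)
    dh = (do @ W2.T) * (1 - h ** 2)             # hidden-layer delta (tanh')
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)
    return np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2])

w0 = rng.normal(scale=0.1, size=n_in * n_hid + n_hid + n_hid * n_out + n_out)

# Plain gradient descent with a fixed learning rate (assumed value 0.5).
w_gd = w0.copy()
for _ in range(500):
    w_gd -= 0.5 * grad(w_gd)

# Nonlinear conjugate-gradient minimization of the same cost (SciPy's CG).
res = minimize(mse, w0, jac=grad, method="CG", options={"maxiter": 500})

print("MSE after gradient descent:   ", mse(w_gd))
print("MSE after conjugate gradient: ", mse(res.x))
```

On small problems of this kind the conjugate-gradient run typically reaches a lower cost in fewer iterations than fixed-step gradient descent, which is the behaviour the paper exploits.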

Published in American Journal of Physics and Applications (Volume 2, Issue 4)
DOI 10.11648/j.ajpa.20140204.11
Page(s) 88-94
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2014. Published by Science Publishing Group

Keywords

Optimizing, Artificial Neural Networks, Classification, Identification, Conjugate Gradient, Multi-Layer Perceptron, Back Propagation of the Gradient

References
[1] M. Janati Idriss et al., Reducing the number of channels of multi-spectral images by a connectionist approach, Signal Processing 17, 2000, pp. 491-500.
[2] A. Guerin, J. H. Crasy, Reconfigurable computing architecture for simulating networks of neurons, Signal Processing Review 5(3), 1988, pp. 178-186.
[3] J. Proriol, MLP: a multi-layer neural network program, Journal of Modulad, 1996, pp. 24-28.
[4] S. Lahmiri, A comparative study of back-propagation algorithms in financial prediction, International Journal of Computer Science, Engineering and Applications (IJCSEA), Vol. 1, No. 4, 2011, pp. 15-21.
[5] L. N. M. Tawfiq, Improving Gradient Descent Method for Training Feed Forward Neural Networks, International Journal of Modern Computer Science & Engineering, 2(1), 2013, pp. 12-24.
[6] M. F. Møller, A Scaled Conjugate Gradient Algorithm for Fast Supervised Learning, Neural Networks, Vol. 6, 1993, pp. 525-533.
[7] B. M. Wilamowski and H. Yu, Improved Computation for Levenberg–Marquardt Training, IEEE Transactions on Neural Networks, Vol. 21, No. 6, 2010, pp. 930-937.
[8] N. Qian, On the momentum term in gradient descent learning algorithms, Neural Networks, 1999, pp. 145-151.
[9] R. S. Ransing and N. M. Nawi, An improved conjugate gradient based learning algorithm for back propagation neural networks, World Academy of Science, Engineering and Technology, Vol. 2, 2008, pp. 06-26.
[10] E. P. van Someren, L. F. A. Wessels, E. Backer, M. J. T. Reinders, Multi-criterion optimization for genetic network modeling, Signal Processing 83, 2003, pp. 763-775.
Cite This Article
  • APA Style

    Said El Yamani, Samir Zeriouh, Mustapha Boutahri, Ahmed Roukhe. (2014). Optimizing Back-Propagation Gradient for Classification by an Artificial Neural Network. American Journal of Physics and Applications, 2(4), 88-94. https://doi.org/10.11648/j.ajpa.20140204.11

  • ACS Style

    Said El Yamani; Samir Zeriouh; Mustapha Boutahri; Ahmed Roukhe. Optimizing Back-Propagation Gradient for Classification by an Artificial Neural Network. Am. J. Phys. Appl. 2014, 2(4), 88-94. doi: 10.11648/j.ajpa.20140204.11

  • AMA Style

    Said El Yamani, Samir Zeriouh, Mustapha Boutahri, Ahmed Roukhe. Optimizing Back-Propagation Gradient for Classification by an Artificial Neural Network. Am J Phys Appl. 2014;2(4):88-94. doi: 10.11648/j.ajpa.20140204.11

  • BibTeX
    @article{10.11648/j.ajpa.20140204.11,
      author = {Said El Yamani and Samir Zeriouh and Mustapha Boutahri and Ahmed Roukhe},
      title = {Optimizing Back-Propagation Gradient for Classification by an Artificial Neural Network},
      journal = {American Journal of Physics and Applications},
      volume = {2},
      number = {4},
      pages = {88-94},
      doi = {10.11648/j.ajpa.20140204.11},
      url = {https://doi.org/10.11648/j.ajpa.20140204.11},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ajpa.20140204.11},
      abstract = {In a complex and changing remote sensing environment that demands fast, well-informed decisions, connectionist methods have proven highly valuable, in particular for the reduction and classification of spectral data. In this context, this paper studies the parameters that optimize the results of a multilayer-perceptron-based artificial neural network (ANN) for the classification of chemical agents in multi-spectral images. The mean squared error cost function remains one of the key factors governing the network's convergence during the learning phase, and the challenge addressed by our approach is to improve gradient descent using the conjugate gradient method, which proves fast and efficient.},
     year = {2014}
    }
    

  • RIS
    TY  - JOUR
    T1  - Optimizing Back-Propagation Gradient for Classification by an Artificial Neural Network
    AU  - Said El Yamani
    AU  - Samir Zeriouh
    AU  - Mustapha Boutahri
    AU  - Ahmed Roukhe
    Y1  - 2014/08/10
    PY  - 2014
    N1  - https://doi.org/10.11648/j.ajpa.20140204.11
    DO  - 10.11648/j.ajpa.20140204.11
    T2  - American Journal of Physics and Applications
    JF  - American Journal of Physics and Applications
    JO  - American Journal of Physics and Applications
    SP  - 88
    EP  - 94
    PB  - Science Publishing Group
    SN  - 2330-4308
    UR  - https://doi.org/10.11648/j.ajpa.20140204.11
    AB  - In a complex and changing remote sensing environment that demands fast, well-informed decisions, connectionist methods have proven highly valuable, in particular for the reduction and classification of spectral data. In this context, this paper studies the parameters that optimize the results of a multilayer-perceptron-based artificial neural network (ANN) for the classification of chemical agents in multi-spectral images. The mean squared error cost function remains one of the key factors governing the network's convergence during the learning phase, and the challenge addressed by our approach is to improve gradient descent using the conjugate gradient method, which proves fast and efficient.
    VL  - 2
    IS  - 4
    ER  - 

Author Information
  • Said El Yamani, Samir Zeriouh, Mustapha Boutahri, Ahmed Roukhe: Optronic and Information Treatment Team, Atomic, Mechanical, Photonic and Energy Laboratory, Faculty of Science, Moulay Ismail University, B. P. 11201 Zitoune, Meknès, Morocco
