Peer-Reviewed

Use of a Hysteresis Loop Activation Function to Enable an Analog Perceptron to Gain Memory

Received: 17 May 2018     Accepted: 14 June 2018     Published: 10 July 2018
Abstract

With the advent of memristors, analog artificial neural networks are closer than ever, and neural computing is a growing topic of research. In the context of analog artificial neural networks, the purpose of this research is to verify that a perceptron can gain a discrete memory by implementing a hysteresis loop in the activation function. The discrete memory is represented by the different paths the hysteresis activation function takes between logic 1 and logic 0. To write to the memory, the input to the hysteresis loop must exceed one of its thresholds; to read the stored value, the input must lie between the thresholds of the hysteresis function. To verify the perceptron’s memory, a network with manually chosen weights that acts as a shift register is constructed. The components of this network are assembled in a circuit simulation program. Functionally, the network receives two inputs: a data signal and an enable signal. The output of the network is a time-shifted version of previous input signals, and a system whose output is a time-shifted version of its previous inputs is considered to have memory.
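The write/read behavior described in the abstract can be sketched in a few lines of Python. This is a minimal illustrative model only: the class name and threshold values are assumptions for the sketch, not values taken from the article, which implements the function as an analog circuit.

```python
class HysteresisActivation:
    """Sketch of a hysteresis-loop activation function with discrete memory.

    Thresholds are illustrative assumptions, not the article's values.
    """

    def __init__(self, low=0.3, high=0.7):
        self.low = low      # falling threshold: inputs at or below this write logic 0
        self.high = high    # rising threshold: inputs at or above this write logic 1
        self.state = 0      # retained output level -- the "discrete memory"

    def __call__(self, x):
        if x >= self.high:       # write: drive the output to logic 1
            self.state = 1
        elif x <= self.low:      # write: drive the output to logic 0
            self.state = 0
        # low < x < high: read -- the output keeps its previously stored value
        return self.state


act = HysteresisActivation()
print(act(0.9))   # above the upper threshold: writes and outputs 1
print(act(0.5))   # between thresholds: reads back the stored 1
print(act(0.1))   # below the lower threshold: writes and outputs 0
print(act(0.5))   # between thresholds: reads back the stored 0
```

Because the output for mid-range inputs depends on the input history rather than the current input alone, a neuron using this activation retains one bit of state, which is the property the shift-register network exploits.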

Published in Science Journal of Circuits, Systems and Signal Processing (Volume 7, Issue 2)
DOI 10.11648/j.cssp.20180702.14
Page(s) 68-73
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2018. Published by Science Publishing Group

Keywords

Artificial Neural Network, Recurrent Neural Network, Memristors, Hysteresis Loop Activation Function, Analog Computing, Neural Computing, Long Short-Term Memory

Cite This Article
  • APA Style

    William Brickner, Muhammad Sana Ullah. (2018). Use of a Hysteresis Loop Activation Function to Enable an Analog Perceptron to Gain Memory. Science Journal of Circuits, Systems and Signal Processing, 7(2), 68-73. https://doi.org/10.11648/j.cssp.20180702.14


Author Information
  • Department of Electrical and Computer Engineering, Florida Polytechnic University, Lakeland, USA

  • Department of Electrical and Computer Engineering, Florida Polytechnic University, Lakeland, USA
