Adjusting Learning Rate of Memristor-Based Multilayer Neural Networks via Fuzzy Method (Academic Article)


  • Back propagation (BP) based on stochastic gradient descent is the prevailing method for training multilayer neural networks (MNNs) with hidden layers. However, the physical separation between memory arrays and the arithmetic module makes implementing BP in conventional digital hardware inefficient and ineffective. Although CMOS may alleviate some problems of the hardware implementation of MNNs, CMOS-based synapses cost too much power and area in very large scale integrated circuits. As a novel device, the memristor shows promise for overcoming this shortcoming due to its ability to closely integrate processing and memory. This paper proposes a novel circuit that implements a synapse with a memristor and two MOSFET transistors (p-type and n-type). Compared with a CMOS-only circuit, the proposed one reduces area consumption by 92%-98%. In addition, we develop a fuzzy method for adjusting the learning rates of MNNs, which increases learning accuracy by 2%-3% compared with a constant learning rate. Moreover, the fuzzy adjustment method is robust and insensitive to parameter changes because it relies on approximate reasoning. Furthermore, the proposed methods can be extended to memristor-based multilayer convolutional neural networks for complex tasks. The novel architecture behaves in a human-like thinking process.
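The fuzzy learning-rate adjustment described above can be illustrated with a minimal sketch. The membership functions, rule base, and scale factors below are illustrative assumptions for a generic Mamdani-style controller, not the paper's actual design: the controller maps the recent change in training error to a multiplicative scale on the learning rate, and defuzzifies by a weighted average of the rule outputs.

```python
# Hypothetical sketch of fuzzy learning-rate adjustment.
# The rule base and constants are assumptions, not the paper's design.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_lr_scale(err_change):
    """Map the change in training error (normalized to [-1, 1]) to a
    learning-rate scale factor via three fuzzy rules:
      error falling -> increase lr (scale 1.1)
      error flat    -> keep lr     (scale 1.0)
      error rising  -> decrease lr (scale 0.7)
    Defuzzified by a membership-weighted average of the rule outputs."""
    falling = tri(err_change, -2.0, -1.0, 0.0)
    flat    = tri(err_change, -0.5,  0.0, 0.5)
    rising  = tri(err_change,  0.0,  1.0, 2.0)
    weight = falling + flat + rising
    if weight == 0.0:
        return 1.0  # no rule fires: leave the learning rate unchanged
    return (1.1 * falling + 1.0 * flat + 0.7 * rising) / weight
```

In use, the scale would multiply the current learning rate each epoch, e.g. `lr *= fuzzy_lr_scale(err_change)`. Because the rules overlap and the output varies smoothly with the input, small changes in the membership parameters shift the scale only slightly, which is one way to read the paper's claim that approximate reasoning makes the adjustment insensitive to parameter changes.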

author list (cited authors)

  • Wen, S., Xiao, S., Yang, Y., Yan, Z., Zeng, Z., & Huang, T.

citation count

  • 110

complete list of authors

  • Wen, Shiping; Xiao, Shuixin; Yang, Yin; Yan, Zheng; Zeng, Zhigang; Huang, Tingwen

publication date

  • June 2019