Acceleration of back propagation through initial weight pre-training with delta rule
Conference Paper
Abstract
A new training strategy for Back Propagation (BP) neural networks, named Delta Pre-Training (DPT), is proposed. The core of the strategy is to pre-train the initial weights of a BP network using the Delta rule, instead of starting from random values. After pre-training, the normal BP training procedure is carried out to complete network training. With the DPT, the convergence rate of BP training can be significantly improved. With regard to on-chip learning in VLSI implementations, the pre-training phase of the DPT requires only a small amount of additional circuitry. Simulation results show the superiority of the proposed method over previous ones.
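The two-phase idea described in the abstract can be illustrated with a minimal sketch. The paper's exact pre-training procedure is not given here, so the code below makes an assumption: the output layer is first trained alone with the Delta rule (with the random input-to-hidden weights held fixed), and then standard back-propagation updates all layers. The network size, dataset (XOR), and learning rate are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR (illustrative, not from the paper)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small 2-4-1 network with random initial weights
W1 = rng.normal(0.0, 0.5, (2, 4))   # input -> hidden
W2 = rng.normal(0.0, 0.5, (4, 1))   # hidden -> output
eta = 0.5

# --- Phase 1: Delta-rule pre-training (assumed form) ---
# Only W2 is updated, treating the output layer as a
# single-layer network fed by the fixed random hidden layer.
for _ in range(200):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    delta = (T - Y) * Y * (1 - Y)       # Delta-rule error term
    W2 += eta * H.T @ delta

# --- Phase 2: standard back-propagation on all layers ---
for _ in range(10000):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    d2 = (T - Y) * Y * (1 - Y)          # output-layer deltas
    d1 = (d2 @ W2.T) * H * (1 - H)      # back-propagated hidden deltas
    W2 += eta * H.T @ d2
    W1 += eta * X.T @ d1

print(np.round(Y.ravel(), 2))
```

The intended benefit is that Phase 1 starts BP from weights already adapted to the task rather than from purely random values, which is what the paper credits for the faster convergence.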