Low power/minimum transistor building blocks for the implementation of back-propagation algorithms
Conference Paper
Abstract
Several building blocks intended for on-chip learning neural networks are proposed. The current-based elements are adders, multipliers, activation functions, and their derivatives. The design prioritizes both minimum power consumption and minimum silicon area. Simulation results for two networks are reported.
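To make the role of these building blocks concrete, the following is a minimal software sketch (not the paper's analog circuits) of how an activation function and its derivative combine with multiply and add operations in one back-propagation weight update. The logistic (sigmoid) activation and squared-error loss are assumptions chosen for illustration.

```python
import math

def sigmoid(x):
    """Logistic activation function (illustrative choice)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the logistic function, expressed via its output."""
    s = sigmoid(x)
    return s * (1.0 - s)

def backprop_step(w, x, target, lr=0.1):
    """One gradient step on a single weight for a squared-error loss."""
    z = w * x                                   # multiplier block
    y = sigmoid(z)                              # activation-function block
    error = y - target
    grad = error * sigmoid_derivative(z) * x    # chain rule: derivative block
    return w - lr * grad                        # adder block updates the weight
```

In a hardware realization, each of these operations (multiply, add, activation, derivative) corresponds to one of the current-mode circuit blocks the abstract describes.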