How much power does neural propagation need?
Academic Article
Abstract
Two well-known, biologically inspired non-dynamical models of stochastic resonance, the threshold-crossing model and the fluctuating rate model, are analyzed in terms of channel information capacity and the energy dissipated in small-signal transduction. By analogy with spike propagation in neurons, we postulate the average output pulse rate as a measure of dissipation. The dissipation increases monotonically with the input noise. We find that, for small dissipation, both models give a nearly linear dependence of the channel information capacity on dissipation. In both models the channel information capacity, viewed as a function of dissipation, reaches its maximum at an input noise amplitude different from that in the standard signal-to-noise ratio versus input noise plot. Although a direct comparison is not straightforward, for small signals the threshold model delivers appreciably more information per unit of dissipation than the exponential fluctuating rate model. We show that formally introducing cooperativity into the fluctuating rate model allows it to imitate the response function of the threshold model and enhances its performance. This finding may be directly relevant to real neural spike generation, where, owing to strong positive feedback, the ion-channel currents add up in a synchronized way.
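To make the threshold-crossing picture concrete, the following is a minimal simulation sketch, not the paper's actual model or parameters: a subthreshold sinusoidal signal plus discrete-time Gaussian white noise is compared against a fixed threshold, each upward crossing emits a pulse, and the average pulse rate stands in for dissipation as postulated above. The signal amplitude, frequency, threshold, and noise levels are illustrative assumptions chosen only to show that the pulse rate grows monotonically with the input noise.

```python
import numpy as np

# Sketch of a threshold-crossing model of stochastic resonance.
# A weak (subthreshold) periodic signal plus Gaussian noise is compared
# against a fixed threshold; every upward crossing emits one output pulse.
# The average output pulse rate is used as a proxy for dissipation.
# All parameter values are illustrative, not taken from the paper.

rng = np.random.default_rng(0)

dt = 1e-3                 # time step [s]
T = 50.0                  # total simulated time [s]
t = np.arange(0.0, T, dt)

signal_amp = 0.3          # subthreshold amplitude (threshold below is 1.0)
signal_freq = 5.0         # signal frequency [Hz]
threshold = 1.0

signal = signal_amp * np.sin(2.0 * np.pi * signal_freq * t)

for noise_sigma in (0.2, 0.4, 0.8, 1.6):
    # Noise is sampled independently at each time step (discrete-time
    # white noise), a simplification of the continuous-time problem.
    noise = noise_sigma * rng.standard_normal(t.size)
    x = signal + noise

    # Count upward threshold crossings: below the threshold at step i,
    # at or above it at step i + 1.
    above = x >= threshold
    crossings = np.count_nonzero(~above[:-1] & above[1:])

    pulse_rate = crossings / T    # average output pulse rate [pulses/s]
    print(f"noise sigma = {noise_sigma:.1f}  ->  pulse rate = {pulse_rate:7.1f} /s")
```

Running the sketch shows the pulse rate, and hence the dissipation proxy, rising with the noise amplitude; estimating the channel information capacity from the same pulse train would require additionally measuring how reliably the output encodes the input signal, which is beyond this illustration.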