Neural backpropagation


Neural backpropagation is the phenomenon in which the action potential of a neuron creates a voltage spike both at the end of the axon (normal propagation) and back through the dendritic arbor, from which much of the original input current originated. It has been shown that this simple process can be used in a manner similar to the backpropagation algorithm employed in multilayer perceptrons, a type of artificial neural network.

History

Since the 1950s, evidence has existed that neurons in the central nervous system generate an action potential, or voltage spike, that travels both forward through the axon to signal the next neuron and backward through the dendrites, sending a retrograde signal to the presynaptic neurons. This current decays significantly with travel length along the dendrites, so effects are predicted to be more significant for neurons whose synapses are near the postsynaptic cell body, with the magnitude depending mainly on sodium-channel density in the dendrite. It also depends on the shape of the dendritic tree and, more importantly, on the rate of signal currents to the neuron. On average, a backpropagating spike loses about half its voltage after traveling nearly 500 microns.
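The quoted attenuation figure can be illustrated with a simple exponential-decay model. This is a minimal sketch, not a biophysical simulation: the exponential form and the length constant (chosen here so that amplitude halves at 500 microns, matching the figure above) are assumptions for illustration only.

```python
import math

# Assumption: model backpropagating-spike attenuation as exponential decay,
# V(x) = V0 * exp(-x / LAM_UM). The length constant is chosen so the spike
# loses about half its amplitude by x = 500 microns, as stated in the text.
HALF_DISTANCE_UM = 500.0
LAM_UM = HALF_DISTANCE_UM / math.log(2)  # ~721 microns

def spike_amplitude(v0_mv, distance_um):
    """Amplitude (mV) of a backpropagating spike after travelling distance_um."""
    return v0_mv * math.exp(-distance_um / LAM_UM)

# A 100 mV somatic spike measured at several dendritic distances:
for d in (0, 250, 500, 1000):
    print(f"{d:>5} um: {spike_amplitude(100.0, d):.1f} mV")
```

Under this toy model, a synapse one millimeter out along a dendrite would see only about a quarter of the original spike amplitude, consistent with the prediction that proximal synapses are affected most.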

Backpropagation occurs actively in the neocortex, hippocampus, substantia nigra, and spinal cord, while in the cerebellum it occurs relatively passively. This is consistent with observations that synaptic plasticity is much more apparent in areas like the hippocampus, which controls memory, than the cerebellum, which controls more unconscious and vegetative functions.

The backpropagating current also causes a voltage change that increases the concentration of Ca2+ in the dendrites, an event which coincides with certain models of synaptic plasticity. This change also affects the future integration of signals, leading to at least a short-term response difference between the presynaptic signals and the postsynaptic spike.[1]

Algorithm

While a backpropagating action potential can presumably cause changes in the weights of the presynaptic connections, there is no simple mechanism for an error signal to propagate through multiple layers of neurons, as in the computer backpropagation algorithm. However, studies of simple linear topologies have shown that effective computation is possible through signal backpropagation in this biological sense.[2]
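For comparison, the computer backpropagation algorithm mentioned above can be sketched in a few lines. The following is an illustrative toy implementation (network size, learning rate, and the XOR task are arbitrary choices, not taken from the cited work): the error signal computed at the output is propagated backward through the hidden layer's weights, which is exactly the multi-layer mechanism that has no simple biological counterpart.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Toy network: 2 inputs -> H hidden units -> 1 output, sigmoid activations.
H = 4
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

# XOR: a task a single-layer perceptron cannot learn.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    h = [sigmoid(sum(w1[j][i] * x[i] for i in range(2)) + b1[j]) for j in range(H)]
    y = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, y

def train_epoch(lr=0.5):
    """One pass of gradient descent over the data; returns summed squared error."""
    global b2
    total = 0.0
    for x, t in data:
        h, y = forward(x)
        total += 0.5 * (y - t) ** 2
        dy = (y - t) * y * (1 - y)           # error signal at the output
        for j in range(H):
            dh = dy * w2[j] * h[j] * (1 - h[j])  # error backpropagated to hidden unit j
            w2[j] -= lr * dy * h[j]
            for i in range(2):
                w1[j][i] -= lr * dh * x[i]
            b1[j] -= lr * dh
        b2 -= lr * dy
    return total

initial_loss = train_epoch()
for _ in range(2000):
    final_loss = train_epoch()
print(f"loss: {initial_loss:.4f} -> {final_loss:.4f}")
```

Note that computing `dh` requires each hidden unit to know the downstream weight `w2[j]` and the output error `dy`; it is precisely this layer-by-layer transport of error information that biological backpropagating spikes are not known to implement.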

References

  1. Stuart, Greg (1997). "Action potential initiation and backpropagation in neurons of the mammalian CNS". TINS. 20 (3). Retrieved 2007-11-13.
  2. Bogacz, Rafal (2000). "Frequency-based Error Back-propagation in a Cortical Network" (PDF). Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, Como (Italy). 2: 211–216. ISBN 0-7695-0619-4. Retrieved 2007-11-18.
