Coherence
Synapses are the critical events in which information is created
or transformed as it moves between neurons. In our synthetic model
of the neuron, the synchronized transfer of messages occurs within
specific time windows. These communicating automata inside the
computer can move messages in a timed broadcast to groups
of automata waiting to service the request, or in a synchronous
point-to-point fashion from one automaton to the next.
Real neurons receive "broadcast" messages from chemical releases.
For example, the release of adrenalin in response to stress is like a
broadcast message transmitted to a wide class of nerve cells
in the heart, lungs, and legs as well as the brain. But when neurons need
to really "compute", they talk directly to each other in a synchronous
point-to-point fashion, passing messages or synapsing in a relay chain.
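The two communication styles described above can be sketched in a few lines of code. This is a minimal illustration, not the actual model; the class and function names are hypothetical.

```python
# Contrast the two message-passing styles: a broadcast delivered to a
# whole group of automata at once, versus a synchronous point-to-point
# relay down a chain.

class Automaton:
    def __init__(self, name):
        self.name = name
        self.inbox = []

    def receive(self, message):
        self.inbox.append(message)
        return f"{self.name} got {message!r}"

def broadcast(message, group):
    """Deliver one message to every automaton in the group, like a
    chemical release reaching a wide class of cells."""
    return [a.receive(message) for a in group]

def relay(message, chain):
    """Pass the message synchronously from one automaton to the next,
    like neurons synapsing in a relay chain."""
    log = []
    for a in chain:
        log.append(a.receive(message))
    return log

cells = [Automaton(f"cell{i}") for i in range(3)]
print(broadcast("adrenalin", cells))  # all three receive it at once
print(relay("spike", cells))          # received one after another
```

The broadcast reaches every member of the group in one step, while the relay visits the chain in order, which is the distinction the neuron analogy turns on.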
Synchronized Synapsing Neurons
Neurons communicate in synchronicity, or coherently, to increase the rate
of information flow; that is, they fire or synapse in step with each
other. The signals in these synapses are propagated from one neuron to
the next in a point-to-point fashion. I'll first make a hardware analogy
between the computer and the human brain, then postulate why neurons
synapse together in groups in time, like an orchestra playing.
Using a simplistic analogy, software objects in a computer program can
communicate with each other in an orderly sequential fashion, or
synchronously. The programmer can control when and what objects send
and receive messages. The timing of a message sent in a computer
can be controlled to an accuracy of a microsecond.
The biological neurons in the brain, which emit chemical messages and
synaptic electric currents, cannot transfer information very quickly.
If the physical frequency of information transfer in the
brain were increased, the energy required just to "cool" the brain would
increase exponentially. Keeping the frequency of the neural circuits
low allows less energy to be expended in the system as a whole. This
might explain Nature's strategy of having large sets of neurons synapse
coherently [1] to increase the rate of information transfer.
The effort Nature would have to expend to build a fast serial circuit
would probably not be economical. Building a few fast serial processors
would still not provide the computational resources to manage the nervous
system. It's the same problem computer chip makers face. So Nature
builds massively parallel processors, which I assume on faith are
relatively simple units.
Nature discovered by trial and error, probably starting from primitive
cell groups, that bursts of electro-chemical signals between cells
could transfer relatively large quantities of information. The greater
the number of synchronized synapses, the greater the rate of information
transfer. The "engineering" constraints of brain building, from a
biological viewpoint, determined the most efficient architecture for
transferring bulk information. We still don't know for sure
exactly how information transfer works in real neurons.
Synchronizing Synapses in Time
The synapses occurring in the dendritic tree are associated with the signals
in the axonal tree if the synapses follow each other in time. The
information moving from the dendrites to the axon of the neuron is
filtered or transformed in the cell nucleus, or soma. The synaptic
firing delay of the signal moving between neurons is controlled by the soma.
This delay causes the signals in the neural chain to be synchronized
along the propagation paths, or channels, of the neural circuits. An analogy
can be made between the soma's propagation delay and the refractive index
of a wave moving through a dense medium. The refractive index is a
measure of the speed of wave propagation, and also determines the
wave's path.
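The delay idea above can be made concrete with a toy calculation. This is an illustrative sketch, not a biological model; the function name and the delay values are invented for the example.

```python
# Each soma along a chain adds a fixed propagation delay, analogous to
# a refractive index slowing a wave. Tuning the delays on two different
# paths so their totals match makes the signals arrive together, i.e.
# the synapses at the far end are synchronized.

def arrival_time(start_time, soma_delays):
    """Time at which a signal that entered at start_time emerges from a
    chain of neurons, each contributing its soma's delay."""
    return start_time + sum(soma_delays)

# Two hypothetical paths through a circuit with matched total delay.
path_a = [1.0, 2.0, 1.0]  # three somas
path_b = [2.0, 2.0]       # two slower somas
print(arrival_time(0.0, path_a))  # 4.0
print(arrival_time(0.0, path_b))  # 4.0 -> coincident arrival
```

The point of the sketch is only that total path delay, not the number of hops, determines whether signals on different channels stay in step.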
The association or correlation of synaptic signals, or messages, may
be due in part to the physical interference of the waveforms
of these messages. The physical interference of the waveforms is caused
primarily by propagation delays in the nerve cell soma.
Synchronizing Synapses in Space
The classical neural networks are based largely on the spatial model. This
is in part because of the initial model's success at visual pattern
recognition. When a large set of neurons, an ensemble, all get input
signals (i_0, i_1, ..., i_n-1) arriving at each neuron's cell body
simultaneously, then we say that the neurons are synapsing coherently.
(I really still don't understand exactly how neurons resonate with
each other so that their firing times become so coherent. (2))
Each input signal, i, has an associated weight, w, which is a
measure of the strength of the signal. However, I gave up development
of this spatially oriented model in favor of a temporal (resonance)
model long ago. I feel that the parts of the
spatial model will have only a second-order effect when used in circuits;
one which is small, and does not fit cleanly or simply with the temporal
one (3).
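The spatial model sketched above, with simultaneous inputs i_k and weights w_k, is essentially the classical weighted-sum neuron. Here is a minimal version; the threshold value and signal values are illustrative, not taken from the text.

```python
# Classical spatial neuron: sum the simultaneously arriving inputs,
# each scaled by its weight, and fire if the total crosses a threshold.

def spatial_neuron(inputs, weights, threshold=1.0):
    """Return 1 (fire) if the weighted sum of simultaneous inputs
    reaches the threshold, else 0 (silent)."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

signals = [0.5, 0.8, 0.2]    # i_0, i_1, i_2 arriving together
strengths = [1.0, 0.5, 2.0]  # w_0, w_1, w_2
print(spatial_neuron(signals, strengths))  # weighted sum 1.3 -> fires: 1
```

Note that this model has no notion of arrival time at all, which is exactly the limitation that motivates preferring a temporal (resonance) model.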
Footnotes
1.
Remember that coherence means order. Order usually implies a system
in which the number of states has decreased, making the
system more certain. An ordered system has less entropy. Intuitively,
reducing the number of states in a system increases its order.
An extreme example is superconductivity, in which the number of states
in the system decreases. The same is true for coherent laser light.
In terms of synchronization, coherence means how well information can
be transferred in a system or between systems. That is, the rate of
information transfer is high in a synchronized system.
[Image: Jackson Pollock, Lucifer, 1947; photo: www.maws-gallery.co.uk]
Coherence, for me, can also be related to aesthetics, which has been
defined as the ratio of simplicity to complexity. With respect to an
entropic or informational definition of coherence, a simple system
would appear as if the rate of information being passed to the observer
is low, and a complex system would be the opposite. So using this
definition, a coherence factor near 1 is aesthetic, and a small (near 0)
or large factor is not pleasing. An aesthetic system depends on the
rate at which the observer can process incoming information.
(Of course there are many other factors that make art beautiful and
significant. The sense of the aesthetic with respect to coherence
is a narrow one. This is a Jackson Pollock-like view of art.)
2.
The chaotic resonance [1] neurodynamics of Walter Freeman is where
most of my ideas on neural coherence come from. There are so many great
thinkers in this field. Bernard Widrow is the person I remember studying
when I started programming neural networks.
3.
A temporal system in which the entropy decreases is one approaching
an idealized monochromatic wave. (According to Heisenberg's uncertainty
principle, could you say that an idealized monochromatic plane wave
really exists? This wave would have an indefinitely large period, and
paradoxically would not be a wave.) The frequency states would approach
one, which would mean perfect order (an idealization).
In physics lab, we used interferometers, which use the temporal coherence
of waves, to measure the speed of light. It's amazing how the
constructive interference of waves can lead to such high accuracy in
the measurement process. When you see, or "measure", the interference
pattern, you are measuring the "bell curve" distribution of the
intensity of the interference pattern.
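The interference intensity mentioned here has a simple closed form for two equal-intensity coherent beams, I(δ) = 2 I0 (1 + cos δ); this is the standard two-beam formula, not something derived in the footnote.

```python
# Intensity of two superposed equal-intensity coherent waves with
# phase difference delta: I = 2*I0*(1 + cos(delta)). It peaks at
# constructive interference (delta = 0) and vanishes at destructive
# interference (delta = pi).

import math

def interference_intensity(i0, delta):
    """Two-beam interference intensity for equal beam intensity i0."""
    return 2 * i0 * (1 + math.cos(delta))

print(interference_intensity(1.0, 0.0))      # constructive: 4.0
print(interference_intensity(1.0, math.pi))  # destructive: ~0.0
```

Sweeping delta traces out the fringe pattern an interferometer records, which is the distribution of intensity the footnote describes measuring.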
References
[1] Christine Skarda and Walter Freeman, "How Brains Make Chaos in
Order to Make Sense of the World," Behavioral and Brain Sciences
10 (1987): 161-195.