November 22, 2010
Revised: December 17, 2013
Notes on Programming Neural Systems
Nataraja, dispeller of Darkness
Fill my mind with your veil of Light
A Personal Note
I've been blessed to have learned physics from Satosi Watanabe.
I heard he had studied under Werner Heisenberg, doing
research on nuclear statistical mechanics. I was fortunate enough
to have attended his classes on statistical mechanics and quantum
theory. He used the textbook on quantum theory written by David Bohm
in his class. On the first day of his lectures on quantum theory,
he spoke about wave packets and pilot waves.
I remember one sunny day when he was sitting on the lawn on campus.
I was 18 years old, filled with my own ideas on particle physics,
so I told him all about it. He always smiled, and graciously
told me to write it up. I never wrote anything, and I no longer
remember anything about the theory I told him about.
But I always remember his smiles.
In 1985, he published a book [1] on pattern recognition in which he
wrote:
... emphasize the fact that the interval between two spikes
is a continuous variable, hence the all-or-nothing theory
(i.e., the characterization of a neuron by a binary variable)
is inadequate.
Nature is too efficient to waste information in the spikes produced
by the neurons. Every bit of matter or energy created by the
neuron will be used intelligently for the purposes of relaying
information to other neurons. This is a statement of faith. So how
do we analyze this? How far down do we go?
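One way I like to make Watanabe's point concrete for myself is a small
numerical sketch. It is only my own illustration, not anything from his
book: the exponential interval distribution, the 20 ms mean, and the
read-out resolutions are made-up values. It compares the single bit of
the all-or-nothing picture with the information available in the
inter-spike interval as it is resolved more and more finely.

import numpy as np

# Simulated inter-spike intervals from a Poisson-like neuron
# (exponentially distributed intervals with an assumed 20 ms mean).
rng = np.random.default_rng(0)
intervals_ms = rng.exponential(scale=20.0, size=100_000)

def entropy_bits(samples, num_bins):
    """Shannon entropy, in bits, of samples quantized into num_bins equal-width bins."""
    counts, _ = np.histogram(samples, bins=num_bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# The all-or-nothing view: each spike is one binary event, at most 1 bit.
print("all-or-nothing spike/no-spike: at most 1.00 bits per spike")

# The continuous-interval view: the finer the interval can be resolved,
# the more bits each interval can carry, eventually far more than one.
for bins in (4, 16, 64, 256):
    print(f"interval resolved into {bins:3d} levels: "
          f"{entropy_bits(intervals_ms, bins):5.2f} bits per interval")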
I remember Satosi Watanabe's lectures on entropy, where he
explained that entropy is essentially a measure of information.
Statistical mechanics rules the universe. So information is
embedded in energy. Abstracted physical energy has an infinite
(a word humans can't really understand) array of properties,
but fortunately the idea (or the mess) can be simplified in a
nice little equation. Note that information is not the same as
energy. Information exists inside the energy's distinct states.
An energy system with only a few states means its information
must be dispersed among them.
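For the record, the little equation I have in mind is the standard one
for entropy; in Boltzmann's and Shannon's forms it reads

$$ S = k_B \ln W, \qquad H = -\sum_i p_i \log_2 p_i , $$

where W is the number of states accessible to the system and p_i is the
probability of finding the system in state i.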
The idea is that dynamical energy, in the unfolding of physical
processes, is constrained by an exponential suppression in the
number of physical states available to a particular system in
the instant of time before it makes its state transitions.
The energy states of a physical system, viewed statistically, will tend
to settle into fewer states in time. The information will be dispersed
among these fewer states. Entropy also gives a direction to
the flow of time. It's in the statistical laws of the universe to
diminish the complexity or number of "information states" in localized
matter by dispersing it. I do not know why this is so [2], but this
is one of the basic laws of the universe.
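One way to read that exponential suppression (my own gloss, not a
derivation) is through the Boltzmann factor, which weights the
probability of a state with energy E_i at temperature T:

$$ p_i \propto e^{-E_i / k_B T} . $$

States of high energy are exponentially unlikely, so only a small
fraction of the conceivable states are effectively available to the
system at any moment.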
But Nature is really divine. Although the entropy equation says the
rate of change of order is limited (in a localized, closed system, micro-entropy
always increases), fundamental forces in Nature cause order to increase
(driven by forces, in systems open to energy, global macro-entropy
decreases over a long period of time). Why this is so seems like a paradox; subtle and
beautiful, only what the seemingly divine would do.
Physical systems are forced into higher complexity states all the time,
that is, entropy decreases [3], and information is congealed. Forces in
Nature like gravitational energy reshape atoms, planets and galaxies.
Einstein would say that gravitational forces are caused by space-time
itself. Atoms like silver and gold are created inside the cores of
stars because Nature (as described by the law of entropy) has eliminated
certain states where energy could propagate. The law of entropy works
inside of stars and black holes. It is this seemingly divine
paradox that leads to complexity.
Inside the core of stars, the energy density is extremely high, but
information states must be moderated by entropy. A rise in energy
does not mean a proportional rise in information. Although the energy
is great inside of stars, the law of entropy says that the number
of states cannot be proportionally as high. The creation of states
is suppressed by the law of entropy. So inside of stars there are
essentially only states of high-energy matter. The complex atoms
like gold that are created in stars are a consequence of this.
The electron is small in mass or energy compared to the proton;
both are attributes of atoms. I think of the elementary particles inside
the atom as attributes of the atom, that is, an atom is a whole
unit. The electron is much harder to isolate than the proton because
it has so little energy. If you're trying to measure the electron
around the atom, it would be dispersed in a relatively wide area
around the center occupied by the proton. A stationary outer electron
of the atom (in quantum mechanics this means a coherent electron with
nearly zero energy) could be dispersed over an enormously wide area
orders of magnitude away from the center (in the atomic world it's
better to give really rough descriptions if you cannot make an exact
measurement). Whenever you move away from the center of energy,
complexity falls and uncertainty increases. This is another consequence
of entropy, information and the uncertainty principle.
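The uncertainty principle invoked here is the usual Heisenberg relation,

$$ \Delta x \, \Delta p \ge \frac{\hbar}{2} , $$

so an electron with nearly zero, and therefore nearly definite, momentum
must be spread over a correspondingly wide region of space.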
For a while, I carried around a book [4] written by David Bohm. In the
beginning of this book, chapter 1 on Fragmentation and Wholeness, he
wrote:
The new form of insight can perhaps best be called
Undivided Wholeness in Flowing Movement. This
view implies that flow is, in some sense, prior to that
of the 'things' that can be seen to form and dissolve
in this flow. One can perhaps illustrate what is meant
here by considering the 'stream of consciousness'. This
flux of awareness is not precisely definable, and yet
it is evidently prior to the definable forms of thoughts
and ideas which can be seen to form and dissolve in the
flux, like ripples, waves and vortices in a flowing
stream.
I think that what David Bohm is talking about is the current of energy
flowing in our neural circuits. It's the energy which gives rise to our
thoughts and creates our consciousness. The source of this order
from which this flowing movement arises is what the ancient Hindus
would call Dharma. David Bohm's "flowing stream" is what the
Hindu sage Ramana Maharshi calls "The Self".
So we end here focused on our thoughts. It's the only possibility.
If you wanted to stretch this, like I do, then you could say
that the Hindu ascetics were right, and that nature's gift to
each of us is our consciousness. If there were no dynamic movement,
there would be no thoughts. My first impression on reading
"Wholeness and the Implicate Order" was of Bohm's words about
movement as the essence of being. "Movement" is transformation
and metamorphosis, and a constant theme throughout his book.
In his introduction David Bohm wrote:
Such harmony is seen to be possible only if the world view
itself takes part in an unending process of development,
evolution, and unfoldment, which fits as part of the
universal process that is the ground of all existence.
This is my personal note. It's these ideas that have kept me doing this,
and, as Satosi Watanabe used to say, always with a smile: "It is a quest!"
Kali Ma, goddess of change and time
Mother of the universe
I fear the stillness, and the cold
I fear the chaos, and the fire
Move me along, when I'm beyond time.
References
1. Satosi Watanabe, Pattern Recognition: Human and Mechanical, 1985, p. 470.
2. Understanding the language of statistical mechanics takes a while.
Entropy is a measure of the order in a physical system. It is
the only general law we have of Nature that explicitly constrains
the propagation or transition of a system's energy states as a function
of time. Entropy exists to put very broad limitations on what is
possible. But it does this in a subtle way which requires deep
thinking. For example, the law of entropy does not tell us if
there is a limit to the amount of order which is being created or
destroyed. We essentially know from the mathematical formulas for
entropy that it describes the rate of change in the order of things,
but not absolute quantities.
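In symbols, the second law for an isolated system says only that

$$ \frac{dS}{dt} \ge 0 , $$

a constraint on the direction of change, not on the absolute amount of
order present at any moment.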
3. Think of entropy increase as a rise in disorder, or a decrease in
complexity or information. But if you transfer energy into a
system, then entropy will decrease, and the complexity or
information will rise. Erwin Schrödinger spoke of negative
entropy when referring to biological systems, by which he meant that
order or complexity in a system was increasing. Some people called
this "negative entropy." It's negative because positive entropy
originally meant an increase in disorder. Very confusing at first;
in fact, it still confuses me sometimes.
4. David Bohm, Wholeness and the Implicate Order, 1980.
