November 19, 2010
Notes on Programming Neural Systems
Om Namo Shiva Nataraja
Cosmic Dancer
The Light of my Mind
The Limits of Information
I've always naively, or intuitively, thought that information
has its basis in physical matter or energy; that it cannot
exist without energy. Information is an attribute embedded in
physical energy. The ancient thinkers would have called an exposition
like this metaphysics or religion. But thinking like this is
required to build a model.
As I've integrated my studies of entropy into my world view, I've
personified it with a Hindu kind of mysticism. Entropy, which
is the physical law of time-ordered limitations, is also about
the unending potential unfolding in Nature.
Placing constraints or limits on physical systems for abstract
analysis helps us learn about the system, but is sometimes
not appropriate. For example, I think that the concept of infinity [1]
or zero, applied in certain instances to describe Nature, may not be
correct all the time. In certain cases in statistical mechanical theory,
we cannot consider a system as a "closed" system isolated from the rest
of the world, because the law breaks down in the more general cases [2].
In building models of our world, we necessarily have to put limitations
on their behavior, but we need to try to understand these limitations
carefully. Just as using the concept of infinity is a limitation,
the process of breaking down a system by analysis is setting up a
series of limiting conditions.
The measure of static information is a bit. But in real physical systems,
no object remains at a constant level of information as a function of
time. The molecules and atoms are changing, through radioactive decay
for example. Nature is dynamic, and information is not static.
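As a side note (my own sketch, not something from the original post), the bit as a measure of information comes from Shannon's entropy, H = -Σ p·log2(p), which counts the bits carried by one draw from a probability distribution:

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))     # → 1.0  (a fair coin flip carries one bit)
print(entropy_bits([0.25] * 4))     # → 2.0  (four equally likely outcomes)
```

A certain outcome (probability 1) carries zero bits, which matches the intuition above that a frozen, unchanging system holds almost no information.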
Imagine extracting all the energy out of a localized physical system. The
movement in this system will approach zero. The system will get cold
and rigid, and contain almost no information, but the law of entropy tries
its hardest to prevent this extreme condition. Physical systems get very
cold, but never as cold as the mathematical limit of zero.
The measure of instantaneous dynamic information is relative, depending
on the temporal resolution of the detector. The resolution of the
detector is limited by the wave fields of physical matter, expressed
in the uncertainty relationship between energy and time. In this case
there is a fundamental finite limit to the "smallness" of matter, of which
Planck's constant is an attribute. Unlike in theoretical mathematics,
there are no physically uncountable or infinite fields as in Cantor's set
theory. The use of infinite limits in numerical equations is a construct
of mathematics, and it sometimes blurs the real constraints imposed
by Nature, such as the limitations in processing information.
I think of the mind as an information generator. But its capacity
to create information is limited by the physical law of entropy. The
brain's ability to create information is constrained by the dynamical
flow of order. A good analogy is to compare the brain to a
steam engine. The steam engine transfers the energy in coal to water
to produce steam. The brain converts the energy in the food we eat
and the air we breathe into synaptic waves of electro-chemical currents to
produce thoughts.
The fundamental limitations in measuring physical variables can be
distilled into a formula which summarizes the uncertainty principle:
for energy and time, ΔE · Δt ≥ ħ/2. With regard to information, this
formula has its roots in the wave packet Fourier transformations used
in describing waves.
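The Fourier side of this limit can be checked numerically. As a sketch of my own (not from the original post, and with arbitrary grid and width parameters), a Gaussian wave packet is the minimum-uncertainty case: the spread of its intensity in time, multiplied by the spread of its spectrum in angular frequency, comes out to the Fourier limit of 1/2.

```python
import numpy as np

# Gaussian wave packet on a time grid (parameters are arbitrary choices).
n = 4096
t = np.linspace(-50.0, 50.0, n)
sigma = 2.0
g = np.exp(-t**2 / (2 * sigma**2))          # packet amplitude

def spread(x, density):
    """Standard deviation of x weighted by the normalized density."""
    p = density / density.sum()
    mean = (x * p).sum()
    return np.sqrt(((x - mean) ** 2 * p).sum())

dt = spread(t, np.abs(g) ** 2)              # temporal width of the packet

G = np.fft.fft(g)                           # spectrum of the packet
w = 2 * np.pi * np.fft.fftfreq(n, d=t[1] - t[0])   # angular frequencies
dw = spread(w, np.abs(G) ** 2)              # spectral width of the packet

print(dt * dw)   # close to the Fourier limit of 0.5
```

Multiplying the time-frequency relation Δt · Δω ≥ 1/2 through by ħ, with E = ħω, gives back the energy-time form ΔE · Δt ≥ ħ/2.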
1.
There is a fifth dimension beyond that which is known to man.
It is a dimension as vast as space and as timeless as infinity.
It is the middle ground between light and shadow, between science
and superstition, and it lies between the pit of man's fears
and the summit of his knowledge. This is the dimension of imagination.
Rod Serling's opening narration to season 1 of the TV series The Twilight Zone (1959).
2.
I'm talking about the perspective of viewing objects as a part
of a larger whole, a Bohmian idea. It's confusing at first,
but makes thinking about physical systems easier after a while.
