Concepts
   Neuron
   Automata
   Entropy
   Waves
   Thoughts

Computation
   Synapses
   Coherence
   Neuralwaves

Sequences
   Circuits
   Clusters
   Code

Software
   Prism
   Kali


October 31, 1999
Updated September 3, 2001
Updated October 27, 2011
Updated December 26, 2019


Entropy and Information

The unit of information used in neural systems should explicitly be a function of time, just as the hertz is. For example, the information in the time-independent sequence of letters S = {a, b, c} is unique for the object it represents for the entire duration that it exists. In a dynamical system, however, the potential information in a sequence can be smaller or larger depending on the forces acting on the system. The unfolding of a sequence depends on the environment or system that the sequence represents.


Temporal Sequences and the Binary Interval

In the real physical world, as compared with a mathematical description of it, information is essentially contained in the states of matter or energy. Entropy is a descriptive attribute of those states of energy (or matter). In this very general context, information exists in, or emerges from, the states of energy.

Information can also be described mathematically. As an abstraction, information can be spoken and written of using a function describing the unfolding of a binary sequence, the logarithm of base 2:

   H = log2(N)

where N is the number of possible events or outcomes of the system. Entropy, described mathematically, is confined within this general equation.
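
Here is a small sketch of that equation in code (the equiprobable case; the function name is just for this example):

    import math

    def bits_for_outcomes(n_outcomes):
        """Information, in bits, needed to pick one of N equally likely outcomes: log2(N)."""
        return math.log2(n_outcomes)

    print(bits_for_outcomes(2))   # 1.0 bit    - a single binary choice
    print(bits_for_outcomes(8))   # 3.0 bits   - one of 8 equally likely letters
    print(bits_for_outcomes(3))   # ~1.585 bits - one of the letters in S = {a, b, c}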

Mathematically speaking, the unit of information is the bit: either 0 or 1. The bit represents a time-independent unit of information. But information can be defined as an elemental binary unit which also changes as a function of time: f({0,1}; t).
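
A minimal sketch of such a time-dependent bit, assuming, only for illustration, that the flips happen at a fixed rate:

    def bit_at(t, rate_hz=1.0):
        """A bit as a function of time, f({0,1}; t): 0 in the first half of each cycle, 1 in the second."""
        phase = (t * rate_hz) % 1.0          # position within the current cycle
        return 0 if phase < 0.5 else 1

    # sampled every tenth of a second over two seconds
    print([bit_at(t / 10.0) for t in range(20)])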

Within the past five years, the possibility of the quantum computer has changed the notion of a bit. It has pushed forward the more general idea that the "unconditional" absolute duality of just two "binary" states is insufficient to describe the quantum world. This is because quanta are not isolated objects, but more like an organism - a field. Electrons, the quanta we observe in the field, for example, do not exist independently of each other. They are, as physicists say, entangled. This pushes our notion of what probabilities are further into the forefront in a rather spectacular way. "There is no such thing as an unconditional probability." [1]

If you apply the notion of a qubit to a bit, you'd likely have to generalize your world view on how probabilities apply to reality. Pushing this kind of logic further, you'd also have to consider how fundamentally the vibratory nature of energy - at the level of how it is propagated, transferred, or moved - must affect your world view. That is, it would be natural to think of the universe as having a probabilistic or statistical nature if you view movement in the universe as the movement of energy "waves". [2]

It's fascinating to me that since I started these notes on entropy 20 years ago, in 1999, quantum physics has changed the idea of what a bit should be. The physicist Richard Feynman came up with ideas for using the qubit. [6] I think that in the field of neural networks, temporal information is defined by the information in spike trains. This information, contained in superimposing and resonating waves, is temporal in nature. I called the measure of this temporal information, kind of in fun, the bint, which stands for "binary interval".
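
The bint isn't formally defined in these notes, so the sketch below is only one reading of it: chop time into short intervals and record, for each interval, whether the superimposed waves crossed a threshold. The wave mix, threshold, and interval width are all made up for the example.

    import math

    def superimposed(t):
        """Two resonating waves added together (frequencies chosen arbitrarily)."""
        return math.sin(2 * math.pi * 5 * t) + 0.5 * math.sin(2 * math.pi * 8 * t)

    interval = 0.05                 # width of one binary interval, in seconds
    threshold = 0.8
    bints = []
    for i in range(20):             # one second of activity
        t0 = i * interval
        # did the summed wave exceed the threshold anywhere in this interval?
        crossed = any(superimposed(t0 + k * 0.005) > threshold for k in range(10))
        bints.append(1 if crossed else 0)
    print(bints)                    # one bit per interval: a temporal bit stream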

We could also measure information change more naturally as the rate of change of the information, or bytes, in a system. The frequency of a system, measured in hertz, is the rate of change of the oscillations in the system. A system vibrating at 1 hertz means a bit is changing from 0 to 1 and back again in 1 second. Visualize the movement of 1 cycle of a sine wave. Now a sequence of notes on a sheet of music represents objects whose units are in hertz. But observe that a note on a music sheet represents a frequency that is constant in time. The representation of notes is a subclass of the type of dynamic sequence described above.
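
To make the 1 hertz picture concrete, here is a small sketch (with an arbitrary sampling step) that thresholds one cycle of a sine wave into a bit:

    import math

    freq_hz = 1.0
    for i in range(8):                                # eight samples across one 1-second cycle
        t = i * 0.125
        wave = math.sin(2 * math.pi * freq_hz * t)    # one full cycle of a sine wave
        bit = 1 if wave >= 0 else 0                   # positive half-cycle -> 1, negative -> 0
        print(f"t={t:.3f}s  sine={wave:+.2f}  bit={bit}")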


Thoughts Are Continuous

Our thinking processes are the result of real physical or biological events occurring as a function of time. Our thinking processes exist as a real dynamical energy system. They are not mathematical abstractions, which are not real in the sense of having to exist in the physical world. We can think about mathematical processes, but these abstractions themselves do not have a real physical existence. This is in contrast to our thoughts, which are created by the real, energetic neural synapses in our brains.

Our thoughts are created by real physical processes that are subject to the constraints of the Heisenberg uncertainty principle, which limits the creation of our thoughts as a function of time. However, what we think is unlimited, because what we think is not physically real.

Temporal sequences are intrinsically multi-dimensional in nature [3]. Information in the mind is created dynamically as a temporal sequence. The thoughts in our mind seem fleeting because they are generated or created every time our neural waves move across parts of the brain. The action of thinking means that our brain creates a temporal sequence of energy patterns that only lasts for a very short duration. In order to hold a thought, our brain needs to continually recreate these sequences of energy patterns or synapses.

In trying to create a model for the software tools, I had to try to rid myself of the old way of thinking about information. When the mind generates information, it only lasts on the order of a fraction of a second. Then it needs to be recreated. I'm going on faith from past physiological studies that the physical structures which produce these energy patterns are the coherent, synapsing neurons. In the software model, I've tried to separate the local micro-ensemble effects of neurons from the global clustering effects in synchronous neural circuits, and then tried to put the pieces back together.
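
The actual structure of the software isn't described here, so the following is only a rough sketch, with invented names, of what separating the local and global pieces and then recombining them might look like:

    import random

    def local_ensemble(size):
        """Short-lived activity pattern of one micro-ensemble of neurons."""
        return [random.random() for _ in range(size)]

    def global_cluster(ensembles):
        """Global effect: one clustering level per synchronous ensemble (here, its mean activity)."""
        return [sum(e) / len(e) for e in ensembles]

    def recombine(ensembles, levels):
        """Put the pieces back together: scale each local pattern by its global level."""
        return [[x * level for x in e] for e, level in zip(ensembles, levels)]

    ensembles = [local_ensemble(4) for _ in range(3)]        # three micro-ensembles
    pattern = recombine(ensembles, global_cluster(ensembles))
    # the pattern lasts only for this moment; to "hold the thought" it must be regenerated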


Footnotes and References

[1] Pattern Matching, Satosi Watanabe, 1985. Appendices 3 and 4.

Appendix 3 contains an enormously interesting discussion about how Charles Peirce thought about "predicates" as the term is used in predicate or maybe "distributed Boolean" logic.

Logic is a set of relations among propositions.
versus
Logic is a set of relations among predicates.

When you put the emphasis on action or verbs, i.e., predicates, versus the subject, you enable the objects of the relationship to take on more values because predicates are innately embedded in the object you're talking about. This is exactly what generalizing thought processes do. [4]

More Notes:
The traditional way to view probability is from the classical view of physics. This view of probability arose from what we observed with our human eyes. It's a good and noble view. The view of probability from modern physics was formed by observations with instruments, because our eyes couldn't see the atomic world of really small objects. Within the last 100 years most of our new knowledge about ourselves has come from using instruments to observe and measure events in nature and in our own bodies. That's why physicists can think of information as being so multi-dimensional. It's because the energy states of matter are multi-dimensional. (For example, when they compute the eigenvalues of coupled-state electronic atoms, atoms that are coupled by electronic or molecular forces, they can come up with large ranges of eigenvalues. This is an unexpected kind of rise in complexity or order, and decrease in entropy. A kind of complexity that I call "divine.") I'm surprised physicists have found a way to build real quantum computers. It seemed like an impossibility twenty years ago. But that's Nature, isn't it - "divine".
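
A toy numerical example of that eigenvalue spreading (my own construction, not taken from any reference): two identical states of energy E coupled with strength g split into E - g and E + g, and a chain of many coupled states fans out into a whole band of eigenvalues.

    import numpy as np

    E, g = 1.0, 0.3
    pair = np.array([[E, g],
                     [g, E]])              # two coupled states
    print(np.linalg.eigvalsh(pair))        # [0.7, 1.3]

    n = 8                                  # a chain of n coupled states
    off = g * np.eye(n, k=1)               # nearest-neighbour coupling
    chain = E * np.eye(n) + off + off.T
    print(np.linalg.eigvalsh(chain))       # eigenvalues spread out in a band around E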

[2] I apologize for being mystical. I'm hopelessly this way.

[3] Dennis Gabor asked [5], "... what it is that prevents any instrument from analysing the information area with an accuracy of less than a half unit. The ultimate reason for this is evident. We have made of a function of one variable -- time or frequency -- a function of two variables -- time and frequency." He said this is the mathematical identity which is at the root of the fundamental principle of communication. He said, "We see that the r.m.s. duration of a signal, and its r.m.s. frequency-width define a minimum area in the information diagram."

Further along, in section 4 of Dennis Gabor's Theory of Communication, he says, "Moreover, it suggests that it might be possible to give a more concrete interpretation to the information diagram by dividing it up into "cells" of size one half, and associating each cell with an "elementary signal" which transmitted exactly one datum of information."
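
As a numerical sanity check of that half-unit cell (a sketch with an arbitrary Gaussian width and sampling grid), a Gaussian pulse attains the minimum area: with plain r.m.s. widths the product comes out to 1/(4*pi), and, if I have Gabor's normalization right, his sqrt(2*pi) scaling on each width turns the same number into 1/2.

    import numpy as np

    dt = 0.001
    t = np.arange(-5.0, 5.0, dt)
    sigma = 0.5
    g = np.exp(-t**2 / (2 * sigma**2))             # a Gaussian "elementary signal"

    def rms_width(x, weight):
        """r.m.s. spread of coordinate x, weighted by the energy density |signal|^2."""
        w = weight / weight.sum()
        mean = (x * w).sum()
        return np.sqrt(((x - mean)**2 * w).sum())

    sigma_t = rms_width(t, np.abs(g)**2)           # r.m.s. duration

    G = np.fft.fftshift(np.fft.fft(g))
    f = np.fft.fftshift(np.fft.fftfreq(t.size, d=dt))
    sigma_f = rms_width(f, np.abs(G)**2)           # r.m.s. frequency width

    print(sigma_t * sigma_f)                       # ~0.0796, i.e. 1/(4*pi)
    print(2 * np.pi * sigma_t * sigma_f)           # ~0.5, the half-unit cell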

[4] This is also the kind of logic used in programming with Lisp.

[5] Theory of Communication, Dennis Gabor, 1946,
The Journal of the Institution of Electrical Engineers, 93(3):429-457.

[6] I met Richard Feynman at a particle physics conference in 1971, where he spoke about partons, the quarks in nuclear matter. I never understood anything about quarks. I was a student standing there with Richard Feynman, and I felt all the other great physicists were looking at us. I was rather scared of the whole situation. Professor Feynman sensed this, and then the bell rang and he had to go on stage. He looked at me smiling and said graciously, "bells make me nervous". This made me like him immediately.


next:   Waves