107-127, 1995
Pergamon Copyright © 1995 Elsevier Science Ltd
Printed in Great Britain. All rights reserved
0732-118X/95 $9.50 + 0.00
0732-118X(94)00047-6
CHAOS THEORY AND THE EVOLUTION OF CONSCIOUSNESS
AND MIND: A THERMODYNAMIC-HOLOGRAPHIC
RESOLUTION TO THE MIND-BODY PROBLEM
LARRY R. VANDERVERT*
American Nonlinear Systems, 711 W. Waverly Place, Spokane, WA 99205-3271, U.S.A.
*I gratefully acknowledge the many helpful comments and encouragements over the past several
years of Allan Combs, Professor of Psychology, University of North Carolina-Asheville, and of Paul D.
MacLean, Senior Research Scientist, National Institute of Mental Health. This article was presented in
an invited paper at the 1993 National Convention of the American Psychological Association, Division-
24, Theoretical and Philosophical Psychology.
It is upon the more general patterns of the deep underlying dynamic order of
complexity in nature that attention will be focused. Of course, the mathematics of
chaos theory fit hand-in-glove with these patterns, but as it will be seen later within
the framework of Neurological Positivism (NP) the patterns have their origins in
the algorithmic organization of the brain, and therefore precede in time the
mathematical science that describes them. Whether we are studying turbulent flows
in rivers, dynamic energy hierarchies in ecosystems, or the four nested
characteristics of the human "stream of consciousness" delineated introspectively
by William James (1890), for example, the fundamental underlying organizational
patterns are in general the same.
For our purposes, a thumbnail summary of the chaos-theoretical features of
pattern that apply to dynamical systems such as the foregoing would be as follows:
1. The dynamical unfoldment of the patterns immanent in complex systems lies
beyond the grasp of everyday experience. Phenomena such as those mentioned above
that may appear "chaotic" or totally unorganized and unconnected in everyday
experience actually contain indwelling, interconnected deterministic order.
2. The order lying behind the appearance is that of self-similarity (not self-same)
and self-referentiality across scale and aspect of nature. This means that (a) a
general similarity of patterns repeats itself ad infinitum and more detail is revealed
across finer and finer scales of measurement, and (b) the similarity of pattern arises
from its pre-existing pattern (self-referentiality)--form arising dynamically from
pre-existing form creating a seamless web of organization and interconnectedness,
Darwin's central theme, and the mechanism that underlies maximum-power
principle evolution (see Appendix).
3. Self-similar, self-referential "jumps" or rapid bifurcations occur in the system
in a periodic fashion. "Emergent" or nonlinear (non-additive) branchings occur
periodically in unfolding chaotic systems.
4. As the behavior of the system unfolds, small, seemingly inconsequential and
often dismissed initial conditions of the unfoldment may diverge nonlinearly in
rapid fashion toward unpredictable conditions. This feature is referred to as
sensitivity to initial conditions, or, metaphorically, the Butterfly Effect. The
Butterfly Effect metaphor is meant to convey in a simple way that a miscalculation
of conditions in a weather regime as small as that associated with the flapping of
butterfly or bird wings might surprisingly evolve nonlinearly toward unpredictable
major weather conditions. As with all metaphors, the Butterfly metaphor should be
taken with a grain of salt. In this paper, attention will be focused upon the self-
similarity and self-referentiality features of chaos.
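The sensitivity described in feature 4 can be made concrete with a minimal numerical sketch. The logistic map used here is a standard textbook chaotic system, not one drawn from the present article, and the parameter value and perturbation size are illustrative only:

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x).
# r = 3.9 places the map in its chaotic regime (illustrative choice).
r = 3.9

x, y = 0.5, 0.5 + 1e-6   # two trajectories differing by one part in a million
max_gap = 0.0
for step in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    max_gap = max(max_gap, abs(x - y))

# The microscopic initial difference is amplified to macroscopic size.
print(f"largest divergence over 60 steps: {max_gap:.3f}")
```

Shrinking the initial perturbation only delays, and does not prevent, the divergence; this is the Butterfly Effect in miniature.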
*Although one might interpret NP's focus on the algorithmic organization of the brain as a skull-
bound solipsism, NP is not a solipsism. NP represents an "emergent hypothetical realism," that is, a
variant on Campbell's (1959, 1974) "hypothetical realism" emphasizing emergence that comes with the
evolution of symbol systems. NP proposes that the algorithmic organization that becomes encapsulated
in the brain mirrors a real world accurately, but in addition proposes that the real world can be modeled
cognitively in symbolic form in a variety of unpredictable, but deterministic, emergent ways. The cognitive
models of, for example, Einstein's relativity, Gödel's incompleteness/inconsistency, Heisenberg's
uncertainty, are all implicated here. Compare NP's emergent hypothetical realism with, for example,
Lorenz (1941/1962, p. 29; 1973, 1977, chapter 1), MacLean (1990, chapters 1 and 29), and Pribram
(1986, 1991, pp. xxii-xxix).
*In NP an algorithm is any set of rules, including those which govern chaotic dynamical systems, that
perform translational or transformational operations (mathematical, mechanical, linguistic,
neurological, metaphorical, homological) that link problems, input data, and solutions. At the same
time algorithms cannot solve problems (cannot do work) either in theory or practice unless transferred
in some thermodynamic manner. The thermodynamic methods of heat and work constitute the
definitional bases for all algorithms. Algorithms are therefore also orderly methods of
energy/information transfer that will provide solutions to problems (see e.g., Atkins, 1984, chapter 2).
NP's definition of algorithm places the origins of all algorithms in the evolution of problem solving in
the brain as depicted, for example, in Figs 1 and 2. In the thermodynamic scheme of things, since no
change of any kind can occur without an energy/information flow (not even an abstract thought about
algorithms, or an observation), there can be no such thing as "nonalgorithmic" problem solving.
Penrose (1989), for example, discussed the relationship of algorithms to mathematics, and to a small
degree mental processes. Penrose describes algorithms at the level of the mental models
of mathematics and artificial intelligence, as if these mental models somehow represented the
fundamental nature of reality, and then works backward to the problem-solving organization of the
brain. He thus is forced to infer that some processes of consciousness--for example, intuition--must
be "nonalgorithmic." By describing algorithm at the mental model level only, some aspects of
consciousness for Penrose become inexplicable and thus his term nonalgorithmic. Yet, he uses terms
like "see" and "feel" in the brain-process intuitional sense as if these come to consciousness free of any
rules or methods (nonalgorithmic) at all--a rather surprising suggestion for a physicist! When it is
recognized that all algorithmic organization begins in the brain, and these, in turn, give rise to the
mental models of mathematics and artificial intelligence, the term nonalgorithmic could only mean
"nonbrain." In NP what Penrose calls nonalgorithmic are the deterministic, yet unpredictable features
of chaotic nonlinear dynamical systems. These are not ruleless nor methodless. For more complete
discussion of algorithms see Vandervert (1991, 1992).
consistent modeling of consciousness, mind, and the mind-body relationship could
proceed only in a likewise fully interconnected and interdefined fashion.
Accordingly, the present paper has three interconnected purposes: (a) to
describe the evolution of an energy flow guidance framework (algorithmic
organization) for space-time consciousness, (b) to describe the subsequent energetic
exteriorization of the mental models of culture (collectively, mind) from the
preadapted algorithmic organization of the brain described in (a), and (c) to
describe a holographic energy framework relationship between our experiences of
"body" on the one hand, and "mind" on the other, as a generalized differentiation
of (a) and (b) above.
Fig. 1. Countless millennia of iterations of selective processes in this generalized prey-predator scenario
(predator/pursuer and prey/pursued, shown at TIME1 and TIME2) resulted in the encapsulation of space-time
algorithms in the brain. Algorithms are patterns of energy pathways (methods of work) that solve problems.

In sum, through countless iterations of Lotka's scenario (extended to the time
scale across phyla) and its space-time representation that appears in Fig. 1,
isomorphic energy transforms of the space, time, and sense modalities of evolving
creature activity involving the pursuer/pursued relationship become encapsulated
(literally, placed in a protective "container") in a chaotic/fractal algorithmic
organization of the nervous systems and their bodies of each generalized species.
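Lotka's (1932) "contribution to the mathematical theory of capture," cited in the references, concerns precisely such pursuer/pursued dynamics. A minimal Euler-integration sketch of the classical Lotka-Volterra predator-prey equations (the parameter values here are illustrative, not taken from the article) shows the coupled energy relationship between the two roles:

```python
# Lotka-Volterra predator-prey equations, integrated with a simple
# Euler scheme. All parameter values are illustrative.
prey, predator = 10.0, 5.0
a, b = 1.1, 0.4    # prey growth rate; predation rate
c, d = 0.1, 0.4    # conversion of prey into predators; predator death rate
dt = 0.001

prey_history = []
for step in range(20000):              # integrate to t = 20
    dprey = (a - b * predator) * prey
    dpred = (c * prey - d) * predator
    prey += dprey * dt
    predator += dpred * dt
    prey_history.append(prey)

# Both populations oscillate: neither pursuer nor pursued is eliminated.
print(f"prey range: {min(prey_history):.2f} to {max(prey_history):.2f}")
```

The persistent oscillation, rather than extinction of either party, is the dynamical signature of the selective scenario sketched in Fig. 1.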
As Crutchfield, Farmer, Packard, and Shaw (1986) suggested, the chaotic/fractal
nervous system that has been selected has met the demands of the chaotic/fractal
possibilities o f an ever-changing environment:
The form, then, of any portion of matter, whether it be living or dead, and the
changes of form which are apparent in its movements and in its growth, may in
all cases alike be described as due to the action of force. In short, the form of an
object [including its detailed internal structure] is a 'diagram of forces', in this
sense, at least, that from it we can judge of or deduce the forces that are acting
or have acted upon it [italics added]. (p. 11)
Like the fish's fin that reflects the hydrodynamic properties of water (Lorenz, 1977,
p. 6), then, the holonomic organization of the brain represents a "diagram of
forces" that mirrors its own evolutionary history of the complex energetic
necessities of survival.
Below is a description of the features of the diagram of forces (as Thompson
would have put it) of holonomic processing:
1. The projection of experience away from its locus of processing--the world
composed inside the skull appears to consist of objects and events "out there." It is
difficult to imagine any workable evolutionary prey-predator scenario without such
holographic-type exteriorizing projection. In NP this automatic projection of
experience away from its actual locus in the brain is referred to as primordial
perceptual constancy, which is thought to provide the ground and process of
consciousness for all other constancies, for example, the color, size, and shape
constancies. (See Vandervert, 1990, pp. 9-11 for a discussion concerning the
primordial constancy, and how it has influenced the development of science.)
2. The enfoldment and storage of multiple "hidden" nested orders among
networks of holoscapes which may be activated by attentional and intentional
processes. And, the immediate recognition of any minute portion of a figure as the
entire figure even if never before consciously perceived in that particular fraction. The tip
of a predator's ear, a muffled sound displaced in space and time, or an olfactory
nuance is immediately recognizable as the presence of the entire figure. (Many
images may be recorded in a single hologram, and any portion of a hologram
contains the entire hologram.) It is my view that Melzack's hardwired neuromatrix
is holographic.
3. Tremendous storage capacity, rapid retrieval, and practically instantaneous
cross- and auto-correlation in computing power for processing 1 and 2 above.
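The claim in item 2, that any portion of a hologram contains the entire image, has a simple computational analogue: in a Fourier-transform hologram every point of the scene contributes to every region of the hologram plane. The sketch below is a toy NumPy illustration of that property, not anything drawn from Pribram's mathematical model; it reconstructs a whole scene from a small central fragment of its Fourier "hologram":

```python
import numpy as np

# A toy scene: a bright square on a dark background.
scene = np.zeros((64, 64))
scene[24:40, 24:40] = 1.0

# Fourier-transform "hologram": every point of the scene contributes
# to every region of the transform plane.
hologram = np.fft.fftshift(np.fft.fft2(scene))

# Keep only a small central fragment of the hologram; zero the rest.
fragment = np.zeros_like(hologram)
fragment[24:40, 24:40] = hologram[24:40, 24:40]

# Reconstruct from the fragment alone: the whole square reappears,
# blurred but recognizable.
reconstruction = np.abs(np.fft.ifft2(np.fft.ifftshift(fragment)))
corr = np.corrcoef(scene.ravel(), reconstruction.ravel())[0, 1]
print(f"correlation of fragment reconstruction with scene: {corr:.2f}")
```

The fragment yields a degraded but complete reconstruction, mirroring the text's example of a predator's ear-tip evoking the whole figure.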
In summary of this section, the space-time, and spectral, holonomic features of
the algorithmic organization of the brain provide a better picture of the body
neuromatrix or "diagram of forces" that reflects the overall dynamics of the Lotka
scenario. Pribram's holonomic model is absolutely critical, it seems, in that it allows
us to understand the generation of holographic-like spatial and temporal
experience of phantom limbs; it allows us also to extend Melzack's neuromatrix
conception to aspects of the sense modalities to understand holographic-like
phantom seeing and phantom hearing experiences (see Melzack, 1992, p. 123)
among those so impaired. In NP, consciousness is the continuously generated
entirety of the pure holonomic body in the brain--a continuously active body
universe of space-time, and spectra, including the constancy mechanisms.
Consciousness is this holonomic world of awareness composed within the skull. In
the next section the emergence of the exteriorization of symbol systems (including
mathematics) and mental models will be described as a natural evolutionary
extension of these holonomic features--form autocatalytically arising from
pre-existing autocatalytic form. The space-time, spectra algorithmic patterns of
consciousness in the brain will thus be connected to the space-time, spectra of
mental models including those of the mathematics of chaos and holography--
collectively, mind.
differentiates itself from the larger dynamics of the environment. Mind arises
autocatalytically from the feedforward dynamic of the holonomic neuromatrix of
consciousness in the form of culturally sharable mental models. The feedforward
dynamic "mines" the algorithmic organization of the brain in maximum-power
manner for modeling patterns that increase the probability of energy flow available
for use by the brain. The mental models of mind are thereby mappable back onto the
neuromatrix of conscious experience--back onto experienced human reality. Mind and body
are algorithmically isomorphic.* With consciousness and mind dynamically nested as
described above, we can understand the perplexing problem of how it is possible to
simply sit quietly thinking, and create a new idea, system of equations, or technology
that works in the "real world." Consciousness, thinking, and mind are dynamically
interconnected in a seamless web (see Fig. 2).
In our description of nature the purpose is not to disclose the real essence of the
p h e n o m e n a but only to track down, so far as it is possible, relations between the
manifold aspects of our experience. (p. 18)
Every behavior selected for study, every observation, and every interpretation,
requires subjective processing by an introspective observer. Logically, there is no
way of circumventing this or the other inescapable conclusion that the cold,
hard facts of science, like the firm pavement underfoot, are informational
transformations by the viscoelastic brain. No measurements obtained by the
hardware of the exact sciences are available for comprehension without
undergoing subjective transformation by the "software" of the brain. (p. 5)
Again, it is my view that this is, in fact, the only space-time body we can ever
feel, know, or analyze. When we model the mind-body relationship, it is always
this body neuromatrix that we have as referent. It is all we can ever mean by
"body."
Mind
Through the energetic process of cultural evolution, mind (sharable mental
models constructed in the algorithmic organization of the brain) arises
autocatalytically from the body neuromatrix and the rest of the preadapted brain
in accordance with the maximum-power principle (see Fig. 2 and Appendix). Mind
is palpable in the form of the neural circuitry patterns of mental models in the brain
that reside in a nested fashion among those of the body neuromatrix in holonomic
networks of holoscapes. Thus, the body neuromatrix and the rest of the preadapted
brain are projected into the fundamental algorithmic organization of mental models.
Johnson (1987) has proposed a thoroughgoing theory of human meaning,
imagination, and reason that is based upon metaphor and that squares well with
NP's account of the evolution of mental models from the body neuromatrix.
However, Johnson has not provided evolutionary or dynamical mechanisms for his
body-mind relationship theory. The dynamically nested relationship between body
and mind neuromatrices is depicted in Fig. 3.
Fig. 3. Two images photocopied from the same hologram. Holograms, with their various features, represent a
fundamental way in which to demonstrate aspects of Pribram's (1991) holonomic brain theory which models
(both conceptually and mathematically) perceptual and cognitive processing going on in the brain. The many
regimes of information processing that are embeddable in holograms (as portrayed in two regimes in these
images of Arnold Schwarzenegger) illustrate in a simple way how the many experiences of body and mind
(mental models) might occupy the same space-time, spectral dynamical field. Note: From Polaroid Corporation,
Holography Business Group, Cambridge, MA. Computer adapted by permission.
Many other features of experience fall within the theoretical horizon of the
conception of a holonomically nested mind and body. For example, the
interpenetration of conscious and unconscious activity, the organization of dream
mentation, alternation of personality regimes in multiple personality disorders,
and perhaps the patterning of Jungian archetypes. All of these phenomena could
be viewed as nested children of the larger holonomic mind-body
differentiation.
REFERENCES
Abraham, R., Abraham, R., & Shaw, C. (1990). A visual introduction to dynamical systems theory
for psychology. Santa Cruz, CA: Aerial Press.
Ackoff, R., Gupta, S., & Minas, J. (1962). Scientific method: Optimizing applied research and
decisions. New York: John Wiley & Sons.
Atkins, P. W. (1984). The second law. New York: Scientific American Books.
Bell, E. T. (1937). Men of mathematics. New York: Simon and Schuster.
Bertalanffy, L. von. (1964). The mind-body problem: A new view. Psychosomatic Medicine, 26,
29-45.
Bertalanffy, L. von. (1967). Robots, men and minds: Psychology in the modern world. New York:
George Braziller.
Bertalanffy, L. von. (1968). General system theory. New York: George Braziller.
Bohm, D. (1973). Quantum theory as an indication of a new order in physics. Part B.
Implicate and explicate order in physical law. Foundations of Physics, 3, 138-168.
Bohm, D., & Peat, F. D. (1987). Science, order, and creativity. New York: Bantam.
Bohr, N. (1934). Atomic theory and the description of nature. New York: MacMillan.
Boltzmann, L. (1905). The second law of thermodynamics. Dordrecht: Reidel.
Boltzmann, L. (1974). Theoretical physics and philosophical problems. Dordrecht: Reidel.
Boring, E. (1950). A history of experimental psychology (2nd ed.). New York: Appleton-Century-
Crofts.
Campbell, D. T. (1959). Methodological suggestions from a comparative psychology of
knowledge process. Inquiry, 2, 152-182.
Campbell, D. T. (1974). Evolutionary epistemology. In P. A. Schilpp (Ed.), The philosophy of
Karl R. Popper (pp. 413-463). LaSalle, IL: The Open Court Publishing Company.
Crutchfield, J., Farmer, J., Packard, N., & Shaw, R. (1986, December). Chaos. Scientific
American, 12, 46-57.
Fodor, J. A. (1981, January). The mind-body problem. Scientific American, 244, 114-123.
Gevins, A. (1989). Signs of model making by the human brain. In E. Basar & T. H. Bullock
(Eds.), Springer series in brain dynamics 1 (pp. 408-419). Berlin: Springer-Verlag.
Gleick, J. (1987). Chaos: The making of a new science. New York: Viking Press.
Hawking, S. (1988). A brief history of time. New York: Bantam.
Herbert, N. (1985). Quantum reality. Garden City, NY: Anchor Press/Doubleday.
Hofstadter, D. (1979). Gödel, Escher, Bach: The eternal golden braid. New York: Basic Books.
Holton, G. (1979). Constructing a theory: Einstein's model. The American Scholar, 48,
309-339.
James, W. (1890). The principles of psychology. New York: Holt, Rinehart & Winston.
Johnson, M. (1987). The body in the mind: The bodily basis of meaning, imagination, and reason.
Chicago: The University of Chicago Press.
Koestler, A. (1964). The act of creation. New York: The MacMillan Company.
Lorenz, K. (1962). Kant's doctrine of the a priori in light of contemporary biology (C.
Ghurye, Trans.). General systems: The yearbook of the society for general systems research (Volume
VIII, pp. 23-35). Ann Arbor, MI: University of Michigan Press. (Original work published
1941)
Lorenz, K. (1977). Behind the mirror: A search for a natural history of human knowledge (R. Taylor,
Trans.). New York: Harcourt Brace Jovanovich. (Original work published 1973)
Lotka, A. J. (1922). A contribution to the energetics of evolution. Proceedings of the National
Academy of Sciences, 8, 140-155.
Lotka, A. J. (1932). Contribution to the mathematical theory of capture. Proceedings of the
National Academy of Sciences, 18, 172-178.
Lotka, A. J. (1945). The law of evolution as a maximal principle. Human Biology, 17, 167-194.
MacLean, P. (1975). On the evolution of three mentalities. Man-Environment Systems, 5,
213-224.
MacLean, P. (1990). The triune brain in evolution: Role in paleocerebral functions. New York:
Plenum.
MacLean, P. (1991). Neofrontocerebellar evolution in regard to computation and
prediction: Some fractal aspects of microgenesis. In R. Hanlon (Ed.), Cognitive microgenesis
(pp. 3-31). New York: Springer-Verlag.
Melzack, R. (1992, April). Phantom limbs. Scientific American, 266, 120-126.
Moravec, H. (1992). The universal robot. Speculations in Science and Technology, 15, 286-294.
Nagel, E., & Newman, J. R. (1958). Gödel's proof. New York: New York University Press.
Odum, H. T. (1983). Systems ecology: An introduction. New York: John Wiley and Sons.
Odum, H. T. (1988). Self-organization, transformity, and information. Science, 242,
1132-1139.
Odum, H. T., & Odum, E. C. (1981). Energy basis for man and nature. New York: McGraw-Hill.
O'Keefe, J. (1985). Is consciousness the gateway to the hippocampal cognitive map? A
speculative essay on the neural basis of mind. In D. Oakley (Ed.), Brain & mind (pp.
59-98). London: Methuen.
O'Keefe, J., & Nadel, L. (1978). The hippocampus as a cognitive map. Oxford: Oxford University
Press.
Penrose, R. (1989). The emperor's new mind: Concerning computers, minds, and the laws of physics.
New York: Oxford University Press.
Pribram, K. (1986). The cognitive revolution and brain/mind issues. American Psychologist, 41,
507-520.
Pribram, K. (1991). Brain and perception: Holonomy and structure in figural processing. Hillsdale,
NJ: Lawrence Erlbaum.
Stadler, M., & Kruse, P. (1990). Cognitive systems as self-organizing systems. In W. Krohn, G.
Küppers, & H. Nowotny (Eds.), Selforganization: Portrait of a scientific revolution (pp. 181-193).
Dordrecht: Kluwer Academic Publishers.
Steen, L. A. (1988, April 29). The science of patterns. Science, 240, 611-616.
Thompson, D. (1917). On growth and form. London: Cambridge University Press.
Troncale, L., Lee, C., Nelson, K., & Tres, J. (1990). Six case studies of empirically-based
chaos/fractal research: Does chaos/fractal analysis yield isomorphies for general systems
theories. In B. H. Banathy & B. A. Banathy (Eds.), Proceedings of the Thirty-Fourth Annual
Meeting of the International Society for the Systems Sciences (pp. 1157-1164). Portland, OR:
International Society for the Systems Sciences.
Vandervert, L. R. (1988). Systems thinking and a proposal for a neurological positivism.
Systems Research, 5, 313-321.
Vandervert, L. R. (1990). Systems thinking and neurological positivism: Further elucidations
and implications. Systems Research, 7, 1-17.
Vandervert, L. R. (1991). A measurable and testable brain-based emergent interactionism:
An alternative to Sperry's mentalist emergent interactionism. Journal of Mind and Behavior,
12, 201-209.
Vandervert, L. R. (1992). The emergence of brain and mind amid chaos through maximum
power evolution. World Futures: The Journal of General Evolution, 33, 253-273.
Vandervert, L. R. (1993). Neurological positivism's evolution of mathematics. Journal of Mind
and Behavior, 14, 277-288.
APPENDIX
Maximum-Power Principle. The maximum-power principle may be stated as
follows: Those systems that survive in the competition among alternative choices are those
that develop more power inflow and use it to meet the needs of survival. They do this by:
(1) developing storages of high-quality energy; (2) feeding back work from the
storages to increase inflows (self-referentially); (3) recycling materials as
needed; (4) organizing control mechanisms that keep the system adapted and
stable; (5) setting up exchanges with other systems to supply special energy
needs; and (6) contributing useful work to the surrounding environmental
system that helps maintain favorable conditions (Odum & Odum, 1981, pp.
32-33). Lotka (1922, 1945) formulated the maximum-power principle,
suggesting that systems prevail that develop designs that maximize the flow of
useful energy through feedback. These feedback designs are sometimes called
autocatalytic (self-releasing, or feeding upon self) (Odum, 1983, p. 6). The
autocatalytic relationship between brain and mind is one of positive-feedback,
whereby energy dissipation is progressively accelerated. The maximum-power
principle is a macro principle of self-organization (Odum, 1988; Vandervert,
1991) that itself becomes encapsulated in the algorithmic organization of the
human brain, thus giving rise to the possibility of the evolution of the
holonomic neural subcircuitry of the mind. It is manifest, for example, in the
self-organization of the perceptual constancies, cognitive systems (e.g., Stadler
& Kruse, 1990), and in the autocatalytic emergence of mental models
(Vandervert, 1991, 1993).
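The autocatalytic positive-feedback relationship described above can be sketched numerically: stored high-quality energy is fed back to capture more inflow, which in turn builds more storage. The coefficients below are purely illustrative, not values from Odum or Lotka:

```python
# Autocatalytic feedback in the maximum-power manner: storage feeds
# back work to increase inflow; part of each inflow builds storage.
# Coefficients are illustrative only.
stored = 1.0
capture = 0.5     # inflow captured per unit of stored energy fed back
build = 0.3       # fraction of inflow converted to new storage

inflows = []
for step in range(10):
    inflow = capture * stored
    stored += build * inflow
    inflows.append(inflow)

# Inflow grows geometrically (factor 1 + capture*build per step):
# a self-reinforcing, self-referential design.
print(f"inflow grew from {inflows[0]:.2f} to {inflows[-1]:.2f}")
```

The geometric growth of inflow is the "progressively accelerated" energy dissipation the text attributes to the brain-mind feedback loop.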
The autocatalytic feeding back of higher-quality energy into increased energy
inflow into the brain, as described by the maximum-power principle, can be viewed
as an energy appearance of chaotic/fractal organization. Odum (1988) expresses
this relationship in the following manner:
Ecosystems, earth systems, astronomical systems, and possibly all systems are
organized in hierarchies because this design maximizes useful energy
processing. These systems look different until they are drawn with energy
diagrams. . . . The series of energy transformations in the hierarchies formed by
self-organization are cascades of successive energy fractions, which explains why
Mandelbrot's (1977) fractals often describe nature. (pp. 1133-1135)
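Odum's "cascades of successive energy fractions" can be illustrated with a minimal sketch: if each transformation level passes a fixed fraction of its inflow to the next, every level is a scaled copy of the one before, which is self-similarity across scale in the sense of the text. The inflow value and efficiency are illustrative:

```python
# A chain of energy transformations, each passing a fixed fraction
# of its inflow to the next level. Values are illustrative.
inflow = 1000.0    # energy entering the base of the hierarchy
efficiency = 0.1   # fraction passed upward at each transformation

levels = [inflow * efficiency ** k for k in range(5)]
ratios = [levels[k + 1] / levels[k] for k in range(4)]

# Every level relates to the next by the same ratio: the hierarchy
# is self-similar, like successive magnifications of a fractal.
print(levels)
```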
The relationship between the algorithmic organization of the brain and the
algorithmic organizations of its mental model circuitry configurations can be
viewed either in terms of self-referential energy flows (causal pathways), as Odum has
done, or in terms of self-referential chaotic/fractal energy designs (Vandervert,
1991).
Recognition of the equivalence between the fundamental characteristics of
maximum-power-principle hierarchies and those of chaotic-fractal energy designs,
namely self-similarity (not self-sameness), self-referencing, and infinite hierarchy
nesting, leads to the central principle of NP governing the cultural evolution of
mind: in its production of mental-model brain circuitry, the algorithmic
organization of the brain can increase energy inflows to itself most efficiently
(maximum-power principle) by feeding back algorithmic mental model designs
that are similar to itself. This is so because the brain, being at the top of the energy
hierarchy, represents the highest-quality energy configuration and could not feed
back designs other than or superior to its own. Thus the brain is constrained to
create its mental models, for example, science, art, technology, computing systems,
virtual reality, in its own dynamical image, so to speak.