Introduction
The following is largely extracted from the online encyclopedia Wikipedia
(http://en.wikipedia.org) and is intended to give a simple overview. In some cases this is by
analogy, and this should not be taken as any lack of robustness in the concepts.
For non-English speakers, the online Wikipedia has an automatic translation facility.
There is also a listing of quantum physics and quantum biology related books
in Biofeedback Resource Books.pdf, available at http://imune.net/index/downloads.html
Introduction
Thermodynamics
Entropy
Quantum biology
Quantum World: Instant Expert
    The birth of an idea
    Quantum weirdness
    Uncertainty rules
    Secure networks
    Lightning-fast computers
    Quantum gravity
    Economies of scale
Quantum Mechanics
    Background
    Old quantum theory
    Planck's constant
    Bohr atom
    Wave-particle duality
    Development of modern quantum mechanics
    Schrödinger wave equation
    Uncertainty Principle
    Quantum entanglement
    Interpretations: the quantum micro and the Newtonian macro world
    Consciousness causes collapse
Quantum Electrodynamics
    Physical interpretation of QED
    History
Butterfly effect
Fractal
    History
    The fractional dimension of the boundary of the Koch snowflake
    Generating fractals
    Classification of fractals
    Fractals in nature
Thermodynamics
Thermodynamics (from the Greek therme, meaning heat, and dynamis, meaning power)
is a branch of physics that studies the effects of changes in temperature, pressure, and
volume on physical systems at the macroscopic scale by analyzing the collective motion
of their particles using statistics. Roughly, heat means "energy in transit" and dynamics
relates to "movement"; thus, in essence thermodynamics studies the movement of energy
and how energy instills movement. Historically, thermodynamics developed out of the
need to increase the efficiency of early steam engines.
The starting point for most thermodynamic considerations is the laws of
thermodynamics, which postulate that energy can be exchanged between physical
systems as heat or work. They also postulate the existence of a quantity named entropy,
which can be defined for any system. In thermodynamics, interactions between large
ensembles of objects are studied and categorized. Central to this are the concepts of
system and surroundings. A system is composed of particles, whose average motions
define its properties, which in turn are related to one another through equations of state.
Properties can be combined to express internal energy and thermodynamic potentials are
useful for determining conditions for equilibrium and spontaneous processes.
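As an illustration of an equation of state, the simplest case, the ideal gas law, can be sketched in a few lines of Python (the gas constant and the worked numbers are standard textbook values, not taken from the text above):

```python
# A minimal sketch of an equation of state: the ideal gas law P*V = n*R*T.
R = 8.314  # universal gas constant, J/(mol*K)

def ideal_gas_pressure(n_moles, temperature_k, volume_m3):
    """Pressure of an ideal gas from the equation of state P = n*R*T/V."""
    return n_moles * R * temperature_k / volume_m3

# One mole at room temperature (298 K) in 24.8 litres sits near
# atmospheric pressure, about 1e5 Pa.
p = ideal_gas_pressure(1.0, 298.0, 0.0248)
```

Real gases obey more elaborate equations of state, but the pattern is the same: a few macroscopic properties, related by one equation, summarise the average motion of an enormous number of particles.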
With these tools, thermodynamics describes how systems respond to changes in their
surroundings. This can be applied to a wide variety of topics in science and engineering,
such as engines, phase transitions, chemical reactions, transport phenomena, and even
black holes. The results of thermodynamics are essential for other fields of physics and
for chemistry, chemical engineering, cell biology, biomedical engineering, and materials
science to name a few.
Entropy
An important law of physics, the second law of thermodynamics, states that the total
entropy of any isolated thermodynamic system tends to increase over time, approaching a
maximum value. Unlike almost all other laws of physics, this associates thermodynamics
with a definite arrow of time. However, for a universe of infinite size, which
cannot be regarded as an isolated system, the second law does not apply.
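The second law can be illustrated with a small numeric sketch (a hypothetical example, not from the source): when heat flows from a hot reservoir to a cold one, the total entropy change comes out positive.

```python
def entropy_change(q_joules, t_hot_k, t_cold_k):
    """Total entropy change when heat Q flows from a hot reservoir to a
    cold one: the hot side loses Q/T_hot, the cold side gains Q/T_cold."""
    return -q_joules / t_hot_k + q_joules / t_cold_k

# 1000 J flowing from 400 K to 300 K raises total entropy (about 0.83 J/K),
# so the flow is allowed; the reverse flow would lower it and never occurs
# spontaneously in an isolated system.
ds = entropy_change(1000.0, 400.0, 300.0)
```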
Quantum biology
A number of biological phenomena have been studied in terms of quantum processes;
the following sections give the physical background.
Quantum World: Instant Expert
Quantum theory is among the most successful theories in physics. It explains
phenomena such as radioactivity and antimatter, and no other theory can match its
description of how light and particles behave on small scales.
But it can also be mind-bending. Quantum objects can exist in multiple states and places
at the same time, requiring a mastery of statistics to describe them. Rife with uncertainty
and riddled with paradoxes, the theory has been criticised for casting doubt on the notion
of an objective reality - a concept many physicists, including Albert Einstein, have found
hard to swallow.
Today, scientists are grappling with these philosophical conundrums, trying to harness
quantum's bizarre properties to advance technology, and struggling to weave quantum
physics and general relativity into a seamless theory of quantum gravity.
The birth of an idea
Quantum theory began to take shape in the early 20th century, when classical ideas failed
to explain some observations. Previous theories allowed atoms to vibrate at any
frequency, leading to incorrect predictions that they could radiate infinite amounts of
energy - a problem known as the ultraviolet catastrophe.
In 1900, Max Planck solved this problem by assuming atoms can vibrate only at specific,
or quantised, frequencies. Then, in 1905, Einstein cracked the mystery of the
photoelectric effect by proposing that light itself is delivered in discrete packets of
energy, later called photons.
Quantum weirdness
Other interpretations of quantum theory - of which there are at least half a dozen - deal
with the measurement problem by suggesting even more far-fetched concepts than a
universe dependent on measurement. The popular many worlds interpretation suggests
quantum objects display several behaviours because they inhabit an infinite number of
parallel universes.
Uncertainty rules
For about 70 years, this wave-particle duality was explained by another unsettling tenet
of quantum theory - the Heisenberg uncertainty principle. Formulated by Werner
Heisenberg in 1927 and recently made more precise, the theory puts an upper limit on
knowledge. It says one can never know both the position and momentum of a quantum
object - measuring one invariably changes the other.
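The modern quantitative form of this limit is the relation Δx·Δp ≥ ħ/2. A minimal numeric sketch (standard constants; the atomic-scale Δx is an illustrative choice):

```python
hbar = 1.054571817e-34  # reduced Planck constant, J*s

def min_momentum_uncertainty(delta_x):
    """Lower bound on the momentum spread from dx * dp >= hbar / 2."""
    return hbar / (2.0 * delta_x)

# Confine an electron to roughly atomic size (1e-10 m):
dp = min_momentum_uncertainty(1e-10)
m_e = 9.109e-31   # electron mass, kg
dv = dp / m_e     # implied velocity spread, of order 1e5 to 1e6 m/s
```

The tighter the confinement in position, the larger the unavoidable spread in momentum, which is exactly the trade-off described above.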
Bohr defeated Einstein in a series of thought experiments in the 1920s and 1930s using
this principle, but more recent work suggests the underlying cause of the duality seen in
experiments is a phenomenon called entanglement.
Entanglement is the idea that in the quantum world, objects are not independent if they
have interacted with each other or come into being through the same process. They
become linked, or entangled, such that changing one invariably affects the other, no
matter how far apart they are - something Einstein called "spooky action at a distance".
This may be involved in superconductivity and may even explain why objects have mass.
It also holds promise for "teleporting" particles across vast distances - assuming everyone
agrees on a reference frame. The first teleportation of a quantum state occurred in 1998,
and scientists have been gradually entangling more and more particles, different kinds of
particles, and large particles.
Secure networks
In April 2004, Austrian financial institutions performed the first money transfer
encrypted by quantum keys, and in June, the first encrypted computer network with more
than two nodes was set up across 10 kilometres in Cambridge, Massachusetts, US.
But keeping quantum particles entangled is a tricky business. Researchers are working on
how to maximise the particles' signal and distance travelled. Using a sensitive photon
detector, researchers in the UK recently sent encrypted photons down the length of a 100-
kilometre fibre optic cable. Researchers in the US devised a scheme to entangle
successive clouds of atoms in the hopes of one day making a quantum link between the
US cities of Washington, DC, and New York.
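Quantum-key encryption of the kind described above is typically based on protocols such as BB84. The toy sketch below is an idealised simulation with no eavesdropper and a noiseless channel, not a description of the Austrian or Cambridge systems: the sender transmits random bits in random bases, the receiver measures in random bases, and the two keep only the positions where their bases happened to match.

```python
import random

def bb84_key(n_bits, seed=0):
    """Toy BB84 sketch: returns (sender's sifted key, receiver's sifted key)."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_bits)]
    alice_bases = [rng.choice("+x") for _ in range(n_bits)]
    bob_bases   = [rng.choice("+x") for _ in range(n_bits)]
    # Same basis: Bob reads Alice's bit; different basis: a random result.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    # The bases (not the bits) are compared publicly; matching positions
    # become the shared secret key.
    key_a = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    key_b = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    return key_a, key_b

k_a, k_b = bb84_key(64)
```

In a real implementation an eavesdropper who measures the photons in flight disturbs them, which shows up as errors when a sample of the sifted key is compared; that disturbance is what makes the key distribution secure.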
Lightning-fast computers
Quantum computers are another long-term goal. Because quantum particles can exist in
multiple states at the same time, they could be used to carry out many calculations at
once, factoring a 300-digit number in just seconds compared to the years required by
conventional computers.
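The multi-state parallelism described above can be made concrete with a small amplitude-bookkeeping sketch in Python (a toy simulation on a classical machine, not a real quantum computer):

```python
import math, itertools

def hadamard(state):
    """Apply a Hadamard gate to a single-qubit state (amp0, amp1)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

# Start in |0>, apply H: an equal superposition of |0> and |1>,
# each measured with probability 1/2.
q = hadamard((1.0, 0.0))
probs = [abs(a) ** 2 for a in q]

# n such qubits span 2**n basis states simultaneously, which is the
# source of the quantum parallelism described above.
n = 3
amplitudes = {bits: (1 / math.sqrt(2)) ** n
              for bits in itertools.product("01", repeat=n)}
```

The catch, as the next paragraphs explain, is that this superposition survives only while the qubits stay isolated from their surroundings.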
But to maintain their multi-state nature, particles must remain isolated long enough to
carry out the calculations - a very challenging condition. Nonetheless, some progress has
been made in this area. A trio of electrons, the building blocks of classical computing,
was entangled in a semiconductor in 2003, and the first quantum calculation was made
with a single calcium ion in 2002. In October 2004, the first quantum memory component
was built from a string of caesium atoms.
But particles of matter interact so easily with others that their quantum states are
preserved for very short times - just billionths of a second. Photons, on the other hand,
maintain their states about a million times longer because they are less prone to interact
with each other. But they are also hard to store, as they travel, literally, at the speed of
light.
In 2001, scientists managed to stop light in its tracks, overcoming one practical hurdle.
And the first quantum logic gate - the brains behind quantum computers - was created
with light in 2003.
Quantum gravity
While three of the four fundamental forces of nature - those operating on very small
scales - are well accounted for by quantum theory, gravity is its Achilles heel. This force
works on a much larger scale and quantum theory has been powerless so far to explain it.
A number of bizarre theories have been proposed to bridge this gap, many
of which suggest that the very fabric of space-time bubbles up with random quantum
fluctuations - a foam of wormholes and infinitesimal black holes.
Such a foam is thought to have filled the universe during the big bang, dimpling space-
time so that structures such as stars and galaxies could later take shape.
The most popular quantum gravity theory says that particles and forces arise from the
vibrations of tiny loops - or strings - just 10⁻³⁵ metres long. Another says that space and
time are discrete at the smallest scales, emerging from abstractions called "spin
networks".
One recent theory, called "doubly special relativity", tweaks Einstein's idea of one cosmic
invariant - the speed of light - and adds another at a very small scale. The controversial
theory accounts for gravity, inflation, and dark energy. Physicists are now devising
observations and experiments that could test the competing theories.
Economies of scale
Quantum physics is usually thought to act on light and particles smaller than molecules.
Some researchers believe there must be some cut-off point where classical physics takes
over, such as the point where the weak pull of gravity overwhelms other forces (in fact,
gravity's effect on neutrons was recently measured). But macroscopic objects can obey
quantum rules if they don't get entangled.
Certainly, harnessing troops of atoms or photons that follow quantum laws holds great
technological promise. Recent work cooling atoms to near absolute zero has produced
new forms of matter called Bose-Einstein and fermionic condensates. These have been
used to create laser beams made of atoms that etch precise patterns on surfaces, and
might one day lead to superconductors that work at room temperature.
All of these hopes suggest that, as queasy as quantum can be, it remains likely to be the
most powerful scientific cure-all for years to come.
Quantum Mechanics
Quantum mechanics is a physical science dealing with the behaviour of matter and
waves on the scale of atoms and subatomic particles. It also forms the basis for the
contemporary understanding of how large objects such as stars and galaxies, and
cosmological events such as the Big Bang, can be analyzed and explained. Its acceptance
by the general physics community is due to its accurate prediction of the physical
behaviour of systems, including systems where Newtonian mechanics fails. This
difference between the success of classical and quantum mechanics is most often
observed in systems at the atomic scale or smaller, or at very low or very high energies,
or at extremely low temperatures. Quantum mechanics is the basis of much of modern
physics and chemistry.
Background
Despite the success of quantum mechanics, it does have some controversial elements. For
example, the behaviour of microscopic objects described in quantum mechanics is very
different from our everyday experience, which may provoke an incredulous reaction.
Moreover, some of the consequences of quantum mechanics appear to be inconsistent
with the consequences of other successful theories, such as Einstein's Theory of
Relativity, especially general relativity.
Some of the background of quantum mechanics dates back to the early 1800s, but the
real beginnings of quantum mechanics date from the work of Max Planck in 1900[1].
Albert Einstein[2], Niels Bohr[3], and Louis de Broglie[4] soon made important
contributions. However, it was not until the mid-1920s that a more complete picture
emerged, and the true importance of quantum mechanics became clear. Some of the most
prominent scientists to contribute were Max Born[5], Paul Dirac[6], Werner Heisenberg[7],
Wolfgang Pauli[8], and Erwin Schrödinger[9].
Later, the field was further expanded with the work of Julian Schwinger, Sin-Itiro
Tomonaga, and Richard Feynman, in particular through the development of quantum
electrodynamics in the late 1940s.
Planck's constant
Classical physics predicted that a black-body radiator would produce infinite energy, but
that result was not observed in the laboratory. If black-body radiation was dispersed into
a spectrum, then the amount of energy radiated at various frequencies rose from zero at
one end, peaked at a frequency related to the temperature of the radiating object, and then
fell back to zero. In 1900, Max Planck developed an empirical equation that could
account for the observed energy curves, but he could not harmonize it with classical
theory. He concluded that the classical laws of physics do not apply on the atomic scale
as had been assumed.
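The shape of those energy curves can be checked numerically. The sketch below compares the classical Rayleigh-Jeans prediction with Planck's formula (standard constants; the temperature and frequencies are chosen only for illustration):

```python
import math

h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann's constant, J/K

def rayleigh_jeans(nu, temp):
    """Classical prediction: grows without bound as frequency rises."""
    return 2 * nu**2 * k * temp / c**2

def planck(nu, temp):
    """Planck's law: quantised oscillators suppress high frequencies."""
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * temp))

T = 5000.0
low, high = 1e12, 1e16
# At low frequency the two formulas agree; at high frequency the classical
# one keeps growing (the ultraviolet catastrophe) while Planck's falls
# back toward zero, matching what is seen in the laboratory.
```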
Bohr atom
In 1913, Niels Bohr removed this substantial problem by applying the idea of discrete
(non-continuous) quanta to the orbits of electrons. This account became known as the
Bohr model of the atom. Bohr basically theorized that electrons can only inhabit certain
orbits around the atom. These orbits could be derived by looking at the spectral lines
produced by atoms.
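Those discrete orbits carry the standard textbook energies E_n = -13.6 eV / n², and the spectral lines come from jumps between them; a minimal sketch:

```python
def bohr_energy(n):
    """Energy of the n-th Bohr orbit of hydrogen, in electronvolts."""
    return -13.6 / n**2

def photon_energy(n_from, n_to):
    """Energy of the photon emitted when the electron drops between orbits."""
    return bohr_energy(n_from) - bohr_energy(n_to)

# The n=3 -> n=2 jump gives hydrogen's red Balmer line:
# about 1.89 eV, i.e. roughly 656 nm.
e = photon_energy(3, 2)
```

Because only certain orbits exist, only certain photon energies can appear, which is why atomic spectra consist of sharp lines rather than a continuum.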
Wave-particle duality
The Bohr atom model was enlarged upon with the discovery by de Broglie that the
electron has wave-like properties. In accord with de Broglie's conclusions, electrons can
only appear under conditions that permit a standing wave. A standing wave can be made
if a string is fixed on both ends and made to vibrate (as it would in a stringed instrument).
That illustration shows that the only standing waves that can occur are those
with zero amplitude at the two fixed ends. The waves created by a stringed instrument
appear to oscillate in place, simply changing crest for trough in an up-and-down motion.
A standing wave can only be formed when the wave's length fits the available vibrating
entity. In other words, no partial fragments of wave crests or troughs are allowed. In a
round vibrating medium, the wave must be a continuous formation of crests and troughs
all around the circle. Each electron must be its own standing wave in its own discrete
orbital.
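The standing-wave condition can be checked numerically: in the Bohr picture, a whole number of de Broglie wavelengths must fit around the orbit, n·λ = 2πr. A sketch for the ground state, using standard constants:

```python
import math

h = 6.626e-34     # Planck's constant, J*s
a0 = 5.292e-11    # Bohr radius, m

def de_broglie_wavelength(momentum):
    """de Broglie's relation: wavelength = h / p."""
    return h / momentum

# In the Bohr model the ground-state momentum follows from
# angular-momentum quantisation m*v*r = n*h/(2*pi) with n = 1,
# so p = h / (2*pi*a0).
p = h / (2 * math.pi * a0)
lam = de_broglie_wavelength(p)
circumference = 2 * math.pi * a0
# Exactly one wavelength fits around the first orbit.
```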
Development of modern quantum mechanics
Werner Heisenberg developed the full quantum mechanical theory in 1925 at the young
age of 23. Following his mentor, Niels Bohr, Werner Heisenberg began to work out a
theory for the quantum behavior of electron orbitals. Because electrons could not be
observed in their orbits, Heisenberg went about creating a mathematical description of
quantum mechanics built on what could be observed, that is, the light emitted from atoms
in their characteristic atomic spectra. Heisenberg studied the electron orbital on the model
of a charged ball on a spring, an oscillator, whose motion is anharmonic (not quite
regular). Heisenberg first explained this kind of observed motion in terms of the laws of
classical mechanics known to apply in the macro world, and then applied quantum
restrictions, discrete (non-continuous) properties, to the picture. Doing so causes gaps to
appear between the orbitals so that the mathematical description he formulated would
then represent only the electron orbitals predicted on the basis of the atomic spectra.
In 1925 Heisenberg published a paper (in Z. Phys. vol. 33, p. 879-893) entitled
"Quantum-mechanical re-interpretation of kinematic and mechanical relations." So ended
the old quantum theory and began the age of quantum mechanics. Heisenberg's paper
gave few details that might aid readers in determining how he actually contrived to get
his results for the one-dimensional models he used to form the hypothesis that proved so
useful. In his paper, Heisenberg proposed to "discard all hope of observing hitherto
unobservable quantities, such as the position and period of the electron," and restrict
himself strictly to actually observable quantities. He needed mathematical rules for
predicting the relations actually observed in nature, and the rules he produced worked
differently depending on the sequence in which they were applied. "It quickly became
clear that the non-commutativity (in general) of kinematical quantities in quantum theory
was the really essential new technical idea in the paper." (Aitchison, p. 5) But it was
unclear why this non-commutativity was essential. Could it have a physical
interpretation? At least the matter was made more palatable when Max Born discovered
that the Heisenberg computational scheme could be put in a more familiar form present in
elementary mathematics.
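What Born recognised was matrix algebra, in which the order of multiplication matters. A minimal sketch (the two matrices are generic examples chosen only to show non-commutativity, not Heisenberg's actual arrays):

```python
def matmul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Two simple 2x2 matrices standing in for quantum-mechanical quantities:
A = [[0, 1],
     [0, 0]]
B = [[0, 0],
     [1, 0]]

AB = matmul(A, B)
BA = matmul(B, A)
# AB != BA: the result depends on the order of the factors, just as
# Heisenberg's computational rules did.
```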
Heisenberg approached quantum mechanics from the historical perspective that treated an
electron as an oscillating charged particle. Bohr's use of this analogy had already allowed
him to explain why the radii of the orbits of electrons could only take on certain values.
Schrödinger wave equation
Because particles could be described as waves, later in 1925 Erwin Schrödinger analyzed
what an electron would look like as a wave around the nucleus of the atom. Using this
model, he formulated his equation for particle waves. Rather than explaining the atom by
analogy to satellites in planetary orbits, he treated everything as waves whereby each
electron has its own unique wavefunction. A wavefunction is described in Schrödinger's
equation by three properties (later Paul Dirac added a fourth). The three properties were
(1) an "orbital" designation, indicating whether the particle wave is one that is closer to
the nucleus with less energy or one that is further from the nucleus with more energy, (2)
the shape of the orbital, i.e. an indication that orbitals were not just spherical but other
shapes, and (3) the magnetic moment of the orbital, which is a manifestation of force
exerted by the charge of the electron as it rotates around the nucleus.
These three properties were called collectively the wavefunction of the electron and are
said to describe the quantum state of the electron. "Quantum state" means the collective
properties of the electron describing what we can say about its condition at a given time.
For the electron, the quantum state is described by its wavefunction which is designated
in physics by the Greek letter ψ (psi, pronounced "sigh"). The three properties of
Schrödinger's equation that describe the wavefunction of the electron, and therefore also
the quantum state of the electron as described in the previous paragraph, are each
called quantum numbers.
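The counting rules for these quantum numbers can be sketched directly: for a given orbital number n, the shape number l runs from 0 to n-1, and the magnetic number m from -l to +l (a standard textbook enumeration, not taken from the source text):

```python
def allowed_quantum_numbers(n_max):
    """Enumerate the allowed (n, l, m) combinations for an electron:
    l runs from 0 to n-1, and m runs from -l to +l."""
    return [(n, l, m)
            for n in range(1, n_max + 1)
            for l in range(n)
            for m in range(-l, l + 1)]

# n=1 gives one orbital (1s); n=2 adds one 2s and three 2p orbitals,
# so there are n**2 orbitals for each value of n.
states = allowed_quantum_numbers(2)
```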
Uncertainty Principle
In 1927, Heisenberg made a new discovery on the basis of his quantum theory, one with
further practical consequences for this new way of looking at matter and energy on the
atomic scale. In Heisenberg's matrix mechanics formula, Heisenberg had encountered an
error or difference of h/2π between position and momentum. This represented a deviation
of one radian of a cycle when the particle-like aspects of the wave were examined.
Heisenberg analyzed this difference of one radian of a cycle and divided the difference or
deviation of one radian equally between the measurement of position and momentum.
This had the consequence of being able to describe the electron as a point particle in the
center of one cycle of a wave so that its position would have a standard deviation of plus
or minus one-half of one radian of the cycle (1/2 of h-bar). A standard deviation can be
either plus or minus the measurement i.e. it can add to the measurement or
subtract from it. In three-dimensions a standard deviation is a displacement in any
direction. What this means is that when a moving particle is viewed as a wave it is less
certain where the particle is. In fact, the more certain the position of a particle is known,
the less certain the momentum is known. This conclusion came to be called "Heisenberg's
Indeterminacy Principle," or Heisenberg's Uncertainty Principle. To understand the real
idea behind the uncertainty principle imagine a wave with its undulations, its crests and
troughs, moving along. A wave is also a moving stream of particles, so you have to
superimpose a stream of particles moving in a straight line along the middle of the wave.
An oscillating ball of charge creates a wave larger than its size depending upon the length
of its oscillation. Therefore, the energy of a moving particle is as large as the cycle of the
wave, but the particle itself has a location. Because the particle and the wave are the same
thing, then the particle is really located somewhere in the width of the wave. Its position
could be anywhere from the crest to the trough. The math for the uncertainty principle
says that the measurement of uncertainty as to the position of a moving particle is one-
half the width from the crest to the trough or one-half of one radian of a cycle in a wave.
For moving particles in quantum mechanics, there is simply a certain degree of exactness
and precision that is missing. You can be precise when you take a measurement of
position and you can be precise when you take a measurement of momentum, but there is
an inverse imprecision when you try to measure both at the same time as in the case of a
moving particle like the electron. In the most extreme case, absolute precision of one
variable would entail absolute imprecision regarding the other.
The consequences of the uncertainty principle were that the electron could no longer be
considered as in an exact location in its orbital. Rather the electron had to be described by
every point where the electron could possibly inhabit. By creating points of probable
location for the electron in its known orbital, this created a cloud of points in a spherical
shape for the orbital of a hydrogen atom which points gradually faded out nearer to the
nucleus and farther from the nucleus. This is called a probability distribution. Therefore,
the Bohr atom number n for each orbital became known as an n-sphere in the three
dimensional atom and was pictured as a probability cloud where the electron surrounded
the atom all at once.
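For the hydrogen ground state this probability cloud can be computed explicitly: the radial probability r²|ψ|² rises from zero, peaks at the Bohr radius, and fades out again, just as described above. A sketch, using the standard textbook form of the ground-state wavefunction:

```python
import math

a0 = 5.292e-11  # Bohr radius, m

def radial_probability(r):
    """Unnormalised radial probability r**2 * |psi_1s(r)|**2 for hydrogen's
    ground state, with psi_1s proportional to exp(-r / a0)."""
    return r**2 * math.exp(-2 * r / a0)

# Scan radii for the most probable distance from the nucleus:
radii = [i * a0 / 1000 for i in range(1, 4000)]
best = max(radii, key=radial_probability)
# The peak lands at the Bohr radius itself, the orbit Bohr had postulated,
# now reinterpreted as the most likely radius in a probability cloud.
```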
This led to the further description by Heisenberg that if you were not making
measurements of the electron that it could not be described in one particular location but
was everywhere in the electron cloud at once. In other words, quantum mechanics cannot
give exact results, but only the probabilities for the occurrence of a variety of possible
results. Heisenberg went further and said that the path of a moving particle only comes
into existence once we observe it. However strange and counter-intuitive this assertion
may seem, quantum mechanics does still tell us the location of the electron's
orbital, its probability cloud. Heisenberg was speaking of the particle itself, not its orbital
which is in a known probability distribution.
It is important to note that although Heisenberg used infinite sets of positions for the
electron in his matrices, this does not mean that the electron could be anywhere in the
universe. Rather there are several laws that show the electron must be in one localized
probability distribution. An electron is described by its energy in Bohr's atom which was
carried over to matrix mechanics. Therefore, an electron in a certain n-sphere had to be
within a certain range from the nucleus depending upon its energy. This restricts its
location. The number of places an electron can be is also called "the number of cells
in its phase space". The uncertainty principle sets a lower limit to how finely one can
chop up classical phase space, so the number of places that an electron can be in its
orbital becomes finite. An electron's location in an atom is therefore confined to its
orbital, and its orbital, although a probability distribution, does not extend out into the
entire universe: it stops at the nucleus and before the next n-sphere orbital begins, and
the points of the distribution are finite because the uncertainty principle creates a
lower limit.
Classical physics had shown since Newton that if you know the position of stars and
planets and details about their motions, you can predict where they will be in the
future. For subatomic particles, Heisenberg denied this notion, showing that due to the
uncertainty principle one cannot know the precise position and momentum of a particle at
a given instant, so its future motion cannot be determined; only a range of
possibilities for the future motion of the particle can be described.
These notions arising from the uncertainty principle only arise at the subatomic level and
were a consequence of wave-particle duality. As counter-intuitive as they may seem,
quantum mechanical theory with its uncertainty principle has been responsible for major
improvements in the world's technology, from computer components to fluorescent lights
to brain scanning techniques.
Quantum entanglement
In 1935, Einstein, together with Boris Podolsky and Nathan Rosen, published a paper
setting out the argument that became known as the EPR paradox (Einstein-Podolsky-
Rosen, 1935). Einstein argued that the Copenhagen Interpretation predicted quantum
entanglement, which he was trying to prove incorrect on the grounds that it would defy
the law of physics that nothing can travel faster than the
speed of light. Quantum entanglement means that when there is a change in one particle
at a distance from another particle then the other particle automatically changes to
counter-balance the system. In quantum entanglement, the act of measuring one
entangled particle defines its properties and seems to influence the properties of its
partner or partners instantaneously, no matter how far apart they are. Because the two
particles are in an entangled state, changes to the one cause instantaneous effects on the
other. Einstein had calculated that quantum theory would predict this; he saw it as a flaw
and therefore challenged it. However, instead of showing a weakness in quantum
mechanics, this forced quantum mechanics to acknowledge that quantum entanglement
did in fact exist and it became another foundation theory of quantum mechanics. The
1935 paper is currently Einstein's most cited publication in physics journals.
Bohr's original response to Einstein was that the particles were in a system. However,
Einstein's challenge led to decades of substantial research into this quantum mechanical
phenomenon of quantum entanglement. This research, as clarified by Yanhua Shih, points
out that the two entangled particles can be viewed as somehow not separate, which
removes the locality objection[25]. This means that no matter the distance between the entangled
particles, they remain in the same quantum state so that one particle is not sending
information to another particle faster than the speed of light, but rather a change to one
particle is a change to the entire system or quantum state of the entangled particles and
therefore changes the state of the system without information transference.
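The point that entangled outcomes are correlated without any signal being sent can be illustrated with a toy simulation of a spin singlet. The joint probabilities used below are the standard quantum predictions (outcomes agree with probability sin²((a-b)/2) for analyser angles a and b); the sampling scheme itself is only an illustration:

```python
import math, random

def singlet_pair(angle_a, angle_b, rng):
    """Sample one pair of outcomes (+1/-1) for a spin singlet measured
    with analysers at angle_a and angle_b."""
    p_same = math.sin((angle_a - angle_b) / 2) ** 2
    first = rng.choice((+1, -1))          # each side alone looks random
    second = first if rng.random() < p_same else -first
    return first, second

rng = random.Random(42)
trials = 20000
angle = math.pi / 3
corr = sum(a * b for a, b in (singlet_pair(0.0, angle, rng)
                              for _ in range(trials))) / trials
# Quantum prediction for the correlation: -cos(angle) = -0.5.
```

Either experimenter alone sees a fair coin; the correlation only appears when the two lists of results are brought together and compared, which is why no information travels faster than light.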
Interpretations: the quantum micro and the Newtonian macro world
As a system becomes larger or more massive (action >> h ) the classical picture tends to
emerge, with some exceptions, such as superfluidity. The emergence of behaviour as we
scale up that matches our classical intuition is called the correspondence principle and is
based on Ehrenfest's theorem. This is why we can usually ignore quantum mechanics when
dealing with everyday objects. Even so, trying to make sense of quantum theory is an
ongoing process which has spawned a number of interpretations of quantum theory,
ranging from the conventional Copenhagen Interpretation to hidden variables and many
worlds. There seems to be no end in sight to the philosophical musings on the subject;
however the empirical or technical success of the theory is unrivalled; all modern
fundamental physical theories are quantum theories.
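The condition action >> h can be made concrete with rough numbers (the magnitudes below are illustrative estimates, not values from the source):

```python
hbar = 1.054571817e-34  # reduced Planck constant, J*s

def typical_action(mass_kg, speed_m_s, length_m):
    """Rough action scale m*v*L of a system, for comparison with hbar."""
    return mass_kg * speed_m_s * length_m

ball = typical_action(0.15, 10.0, 1.0)               # a thrown ball
electron = typical_action(9.1e-31, 2.2e6, 5.3e-11)   # electron in hydrogen

# ball / hbar is around 1e34: utterly classical.
# electron / hbar is of order 1: fully quantum.
```

When the action dwarfs ħ the quantum corrections are unmeasurably small, which is the correspondence principle in numbers.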
Quantum Electrodynamics
In classical physics, due to interference, light is observed to take the stationary path
between two points; but how does light "know where it's going"? That is, if the start and
end points are known, the path that will take the shortest time can be calculated.
However, when light is first emitted, the end point is not known, so how is it that light
where it is going, and it doesn't always take the quickest path. According to QED light
does not have to — it simply goes over every possible path, and the observer (at a
particular location) simply detects the mathematical result of all wave functions added up
(as a sum of all line integrals). In fact, according to QED, light can go slower or faster
than the speed of light to get there[1].
Physically, QED describes charged particles (and their antiparticles) interacting with each
other by the exchange of photons. The magnitude of these interactions can be computed
using perturbation theory; these rather complex formulas have a remarkable pictorial
representation as Feynman diagrams [1]. QED was the theory to which Feynman
diagrams were first applied. These diagrams were invented on the basis of Lagrangian
mechanics. Using a Feynman diagram, one considers every possible path between the start
and end points. Each path is assigned a complex-valued probability amplitude, and the
actual amplitude we observe is the sum of all amplitudes over all possible paths.
Obviously, among all possible paths the ones with stationary phase contribute most (due
to lack of destructive interference with some neighboring counter-phase paths) — this
results in the stationary classical path between the two points.
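The "sum over every possible path" can be made concrete in a toy calculation. In the sketch below, the endpoints, the one-parameter family of kinked paths, and the wavenumber k are arbitrary modelling choices, not anything specified in the text. Each path contributes a unit phasor exp(ikL); paths near the straight, stationary-phase route add coherently, while an equally large bundle of distant paths largely cancels:

```python
import cmath
import math

# Toy "sum over paths": light goes from A = (0, 0) to B = (2, 0) via an
# intermediate point (1, y), so each path is labelled by the kink height y.
# Each path gets amplitude exp(i * k * length); the observed amplitude is
# the sum over all paths.  (k = 50 is an arbitrary illustrative choice.)
k = 50.0

def amplitude(y):
    # Path length A -> (1, y) -> B
    length = math.hypot(1.0, y) + math.hypot(1.0, y)
    return cmath.exp(1j * k * length)

dy = 0.01
# 101 paths near the straight (stationary-phase) line, y in [-0.5, 0.5]
near = sum(amplitude(n * dy) for n in range(-50, 51))
# 101 equally spaced distant paths, y in [1.5, 2.5]
far = sum(amplitude(2.0 + n * dy) for n in range(-50, 51))

print(abs(near), abs(far))
```

The near-stationary bundle dominates the total: its phasors point in almost the same direction, while the distant phasors rotate rapidly and cancel, which is why the classical stationary path emerges.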
QED doesn't predict what will happen in an experiment, but it can predict the probability
of what will happen in an experiment, which is how it is experimentally verified.
Predictions of QED agree with experiments to an extremely high degree of accuracy: currently about one part in 10^12 (limited by experimental errors). This makes QED the most accurate physical theory constructed thus far.
Near the end of his life, Richard Feynman gave a series of lectures on QED
intended for the lay public. These lectures were transcribed and published as Feynman
(1985), QED: The strange theory of light and matter, a classic nonmathematical
exposition of QED from the point of view articulated above. See also QED (book).
Much of Feynman's discussion springs from an everyday phenomenon: the way any sheet of glass partly reflects any light shining on it. (The book's cover featured a beautiful photograph of an iridescent soap bubble, another striking example of an interference-based phenomenon, illustrating the addition of wave functions, a central principle of QED.) Feynman also pays homage to Isaac Newton's struggles to come to terms with the nature of light.
History
Quantum theory began in 1900, when Max Planck assumed that energy is quantized in
order to derive a formula predicting the observed frequency dependence of the energy
emitted by a black body. This dependence is completely at variance with classical
physics. In 1905, Einstein explained the photoelectric effect by postulating that light
energy comes in quanta called photons. In 1913, Bohr invoked quantization in his
proposed explanation of the spectral lines of the hydrogen atom. In 1924, Louis de
Broglie proposed a quantum theory of the wave-like nature of subatomic particles. The
phrase "quantum physics" was first employed in Johnston's Planck's Universe in Light of
Modern Physics. These theories, while they fit the experimental facts to some extent,
were strictly phenomenological: they provided no rigorous justification for the
quantization they employed. They are collectively known as the "old quantum theory."
Modern quantum mechanics was born in 1925 with Werner Heisenberg's matrix
mechanics and Erwin Schrödinger's wave mechanics and the Schrödinger equation.
Schrödinger subsequently showed that these two approaches were equivalent. In 1927,
Heisenberg formulated his uncertainty principle, and the Copenhagen interpretation of
quantum mechanics began to take shape. Around this time, Paul Dirac, in work
culminating in his 1930 monograph, joined quantum mechanics and special relativity,
pioneered the use of operator theory, and devised the bra-ket notation widely used since.
In 1932, John von Neumann formulated the rigorous mathematical basis for quantum
mechanics as the theory of linear operators on Hilbert spaces. This and other work from
the founding period remains valid and widely used.
Quantum chemistry began with Walter Heitler and Fritz London's 1927 quantum account
of the covalent bond of the hydrogen molecule. Linus Pauling and others contributed to
the subsequent development of quantum chemistry.
The application of quantum mechanics to fields rather than single particles, resulting in
what are known as quantum field theories, began in 1927. Early contributors included
Dirac, Wolfgang Pauli, Weisskopf, and Jordan. This line of research culminated in the
1940s in the quantum electrodynamics (QED) of Richard Feynman, Freeman Dyson,
Julian Schwinger, and Sin-Itiro Tomonaga, for which Feynman, Schwinger and Tomonaga shared the 1965 Nobel Prize in Physics.
QED involves a covariant and gauge invariant prescription for the calculation of
observable quantities. Feynman's mathematical technique, based on his diagrams,
initially seemed very different from the field-theoretic, operator-based approach of
Schwinger and Tomonaga, but Freeman Dyson later showed that the two approaches
were equivalent. The renormalization procedure for eliminating the awkward infinite
predictions of quantum field theory was first implemented in QED. Even though
renormalization works very well in practice, Feynman was never entirely comfortable
with its mathematical validity, even referring to renormalization as a "shell game" and
"hocus pocus". (Feynman, 1985: 128)
QED has served as a role model and template for all subsequent quantum field theories.
One such subsequent theory is quantum chromodynamics, which began in the early
1960s and attained its present form in the 1975 work by H. David Politzer, David Gross
and Frank Wilczek. Building on the pioneering work of Schwinger, Peter Higgs,
Goldstone, and others, Sheldon Glashow, Steven Weinberg and Abdus Salam
independently showed how the weak nuclear force and quantum electrodynamics could
be merged into a single electroweak force.
Butterfly effect
The phrase refers to the idea that a butterfly's wings might create tiny changes in the
atmosphere that ultimately cause a tornado to appear (or, for that matter, prevent a
tornado from appearing). The flapping wing represents a small change in the initial
condition of the system, which causes a chain of events leading to large-scale
phenomena. Had the butterfly not flapped its wings, the trajectory of the
system might have been vastly different.
Recurrence, the approximate return of a system towards its initial conditions, together
with the sensitive dependence on initial conditions, are the two main ingredients for
chaotic motion. They have the practical consequence of making complex systems, such
as the weather, difficult to predict past a certain time range—approximately a week, in
the case of weather.
The concept of the Butterfly effect is sometimes used in popular media dealing with the
idea of time travel, usually inaccurately. In the 1952 short story by Ray Bradbury, "A
Sound of Thunder", the killing of a butterfly during the time of dinosaurs causes the
future to change in subtle but meaningful ways: e.g., the spelling of English, and the
outcome of a political election. According to the actual theory, however, the mere
presence of the time travelers in the past would be enough to change short-term events
(such as the weather), and would also have an unpredictable impact on the distant future.
In a Simpsons episode about Homer going back to the time of dinosaurs with a time
machine (a la Bradbury's story), Homer commits intentional and unintentional violence in
the past, violence which drastically changes the future (i.e., Homer's present).
In many cases, minor and seemingly inconsequential actions in the past are extrapolated
over time and can have radical effects on the present time of the main characters. In the
movie The Butterfly Effect, Evan Treborn (Ashton Kutcher), when reading from his
adolescent journals, is able to essentially "redo" parts of his past. As he continues to do
this, he realizes that even though his intentions are good, the actions he takes always have
unintended consequences. However, this movie does not seriously explore the
implications of the butterfly effect; only the lives of the principal characters seem to
change from one scenario to another. The greater world around them is mostly
unaffected.
Another movie which explores the butterfly effect (though not advertised as such) is
Sliding Doors. The movie observes two parallel life paths of a woman named Helen,
played by Gwyneth Paltrow. These two paths diverge when Helen attempts to catch a
commuter train. In one life path she catches the train, and in another she is delayed for
just a few seconds and barely misses the train. This results in two
dramatically different sets of events.
The Butterfly effect was also invoked by fictional mathematician Ian Malcolm in both the
novel and film versions of Jurassic Park. He used it to explain the inherent instability of
(among other things) an amusement park with dinosaurs as the attraction - although this
interpretation can also be taken to mean that zoo animals will always escape and kill their
captors.
Fractal
The word "fractal" has two related meanings. In colloquial usage, it denotes a shape that
is recursively constructed or self-similar, that is, a shape that appears similar at all scales
of magnification and is therefore often referred to as "infinitely complex." In mathematics
a fractal is a geometric object that satisfies a specific technical condition, namely having
a Hausdorff dimension greater than its topological dimension. The term fractal was
coined in 1975 by Benoît Mandelbrot, from the Latin fractus, meaning "broken" or
"fractured."
History
Objects that are now called fractals were discovered and explored long before the word
was coined. Work in ethnomathematics, such as Ron Eglash's African Fractals (ISBN 0-8135-2613-2), documents pervasive fractal geometry in indigenous African craft. In 1525, the German artist Albrecht Dürer published The Painter's Manual, in which one section is on "Tile Patterns formed by Pentagons." Dürer's pentagon tiling largely resembled the Sierpinski carpet, but based on pentagons instead of squares.
The idea of "recursive self-similarity" was originally developed by the philosopher Leibniz, who worked out many of the details. In 1872, Karl Weierstrass found an
example of a function with the non-intuitive property that it is everywhere continuous but
nowhere differentiable — the graph of this function would now be called a fractal. In
1904, Helge von Koch, dissatisfied with Weierstrass's very abstract and analytic
definition, gave a more geometric definition of a similar function, which is now called the
Koch snowflake. The idea of self-similar curves was taken further by Paul Pierre Lévy
who, in his 1938 paper Plane or Space Curves and Surfaces Consisting of
Parts Similar to the Whole, described a new fractal curve, the Lévy C curve.
Georg Cantor gave examples of subsets of the real line with unusual properties — these
Cantor sets are also now recognised as fractals. Iterated functions in the complex plane
had been investigated in the late 19th and early 20th centuries by Henri Poincaré, Felix
Klein, Pierre Fatou, and Gaston Julia. However, without the aid of modern computer
graphics, they lacked the means to visualize the beauty of many of the objects that they
had discovered.
Examples
A relatively simple class of examples is the Cantor sets, in which short and then shorter
(open) intervals are struck out of the unit interval [0, 1], leaving a set that might (or might
not) actually be self-similar under enlargement, and might (or might not) have dimension
d that has 0 < d < 1. A simple recipe, such as excluding the digit 7 from decimal
representations, is self-similar under 10-fold enlargement, and also has dimension log
9/log 10 (this value is the same, no matter what logarithmic base is chosen), showing the
connection of the two concepts.
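Both dimensions quoted above follow from the similarity-dimension formula d = log N / log s for a set built from N copies of itself, each scaled down by a factor s. A minimal sketch:

```python
import math

# Similarity dimension d = log(N) / log(s) for a self-similar set made
# of N copies of itself, each scaled down by a factor s.
def similarity_dimension(copies, scale):
    return math.log(copies) / math.log(scale)

# Middle-thirds Cantor set: 2 copies at 1/3 scale.
cantor = similarity_dimension(2, 3)

# "Exclude the digit 7" set: 9 digit choices per decimal place, 1/10 scale.
no_seven = similarity_dimension(9, 10)

print(round(cantor, 4), round(no_seven, 4))
```

As the text notes, the ratio of logarithms is independent of the logarithmic base, since changing base multiplies numerator and denominator by the same constant.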
Additional examples of fractals include the Lyapunov fractal, Sierpinski triangle and
carpet, Menger sponge, dragon curve, space-filling curve, limit sets of Kleinian groups,
and the Koch curve. Fractals can be deterministic or stochastic (i.e. non-deterministic).
Chaotic dynamical systems are sometimes associated with fractals. Objects in the phase
space of a dynamical system can be fractals (see Attractor). Objects in the parameter
space for a family of systems may be fractal as well. An interesting example is the
Mandelbrot set. This set contains whole discs, so it has a dimension of 2 and
is not technically fractal—but what is truly surprising is that the boundary of the
Mandelbrot set also has a Hausdorff dimension of 2. (M. Shishikura proved that in 1991.)
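Membership in the Mandelbrot set can be sketched with the usual escape-time iteration z → z² + c; a point escapes for certain once |z| exceeds 2. (The iteration cap of 100 below is an arbitrary cutoff: points that have not escaped by then are only presumed to be in the set.)

```python
# Escape-time test for the Mandelbrot set: c is in the set if the
# orbit of z -> z*z + c, starting from z = 0, stays bounded.
def in_mandelbrot(c, max_iter=100):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:       # provably escapes to infinity
            return False
    return True              # presumed bounded (up to max_iter)

print(in_mandelbrot(0j), in_mandelbrot(-1 + 0j), in_mandelbrot(1 + 0j))
```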
The following analysis of the Koch Snowflake suggests how self-similarity can be used
to analyze fractal properties.
The total length of a number, N, of small steps, L, is the product NL. Applied to the boundary of the Koch snowflake this gives a boundless length as L approaches zero. But this distinction is not satisfactory, as different Koch snowflakes do have different sizes. A solution is to measure not in meters, m, nor in square meters, m^2, but in some other power of a meter, m^x. Now 4N(L/3)^x = NL^x, because a step length three times shorter requires four times as many steps, as is seen from the figure. Solving that equation gives x = (log 4)/(log 3) ≈ 1.26186. So the unit of measurement of the boundary of the Koch snowflake is approximately m^1.26186.
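The scaling equation can be checked numerically (the values of N and L below are arbitrary test choices):

```python
import math

# Solving 4*N*(L/3)**x == N*L**x: dividing both sides by N*L**x gives
# 4 * 3**(-x) == 1, so x = log(4) / log(3).
x = math.log(4) / math.log(3)
print(round(x, 5))

# Numerical check of the scaling relation for arbitrary N and L.
N, L = 7, 0.5
assert abs(4 * N * (L / 3) ** x - N * L ** x) < 1e-12
```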
Generating fractals
Even 2000 times magnification of the Mandelbrot set uncovers fine detail resembling the full set.
Iterated function systems — These have a fixed geometric replacement rule. Cantor set,
Sierpinski carpet, Sierpinski gasket, Peano curve, Koch snowflake, Harter-Heighway
dragon curve, T-Square, Menger sponge, are some examples of such fractals.
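One such attractor can be sampled with the "chaos game", a randomized alternative to the deterministic replacement rule that converges to the same Sierpinski gasket. (The triangle vertices, seed point, and iteration count below are arbitrary choices.)

```python
import random

# Chaos-game rendering of the Sierpinski gasket: from the current point,
# repeatedly jump halfway toward a randomly chosen vertex of a triangle.
# The visited points fill out the gasket, the attractor of this IFS.
vertices = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
x, y = 0.25, 0.25          # arbitrary seed inside the triangle
points = []
random.seed(1)
for _ in range(10000):
    vx, vy = random.choice(vertices)
    x, y = (x + vx) / 2, (y + vy) / 2
    points.append((x, y))
```

Plotting `points` (with any plotting library) reveals the familiar triangular holes at every scale; every iterate stays inside the bounding triangle.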
Classification of fractals
Fractals can also be classified according to their self-similarity. There are three types of
self-similarity found in fractals:
Exact self-similarity — This is the strongest type of self-similarity; the fractal appears
identical at different scales. Fractals defined by iterated function systems often display
exact self-similarity.
Quasi-self-similarity — This is a loose form of self-similarity; the fractal appears
approximately (but not exactly) identical at different scales. Quasi-self-similar fractals
contain small copies of the entire fractal in distorted and degenerate forms. Fractals
defined by recurrence relations are usually quasi-self-similar but not exactly self-similar.
Statistical self-similarity — This is the weakest type of self-similarity; the fractal has
numerical or statistical measures which are preserved across scales. Most reasonable
definitions of “fractal” trivially imply some form of statistical self-similarity. (Fractal
dimension itself is a numerical measure which is preserved across scales.) Random
fractals are examples of fractals which are statistically self-similar, but neither exactly
nor quasi-self-similar.
It should be noted that not all self-similar objects are fractals; e.g., the real line (a straight Euclidean line) is exactly self-similar, but since its Hausdorff dimension and topological dimension are both equal to one, it is not a fractal.
Fractals in nature
Chaos theory
In mathematics and physics, chaos theory describes the behavior of certain nonlinear
dynamical systems that under certain conditions exhibit a phenomenon known as chaos.
Among the characteristics of chaotic systems, described below, is sensitivity to initial
conditions (popularly referred to as the butterfly effect). As a result of this sensitivity, the
behavior of systems that exhibit chaos appears to be random, even though the system is
deterministic in the sense that it is well defined and contains no random parameters.
Examples of such systems include the atmosphere, the solar system, plate tectonics,
turbulent fluids, economics, and population growth.
Systems that exhibit mathematical chaos are deterministic and thus orderly in some
sense; this technical use of the word chaos is at odds with common parlance, which
suggests complete disorder. (See the article on mythological chaos for a discussion of the
origin of the word in mythology, and other uses.) A related field of physics called
quantum chaos theory studies non-deterministic systems that follow the laws of quantum
mechanics.
For a dynamical system to be classified as chaotic, most scientists will agree that it must be sensitive to initial conditions, be topologically mixing, and have dense periodic orbits.
Sensitivity to initial conditions means that each point in such a system is arbitrarily
closely approximated by other points with significantly different future trajectories. Thus,
an arbitrarily small perturbation of the current trajectory may lead to significantly
different future behavior.
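This sensitivity is easy to demonstrate with the logistic map x → rx(1 − x) at r = 4, a standard chaotic example (the starting value and the size of the perturbation below are arbitrary choices):

```python
# Two trajectories of the chaotic logistic map x -> r*x*(1 - x),
# started a distance of 1e-10 apart, separate to order one within
# a few dozen iterations.
r = 4.0
a, b = 0.2, 0.2 + 1e-10
max_gap = 0.0
for _ in range(60):
    a = r * a * (1 - a)
    b = r * b * (1 - b)
    max_gap = max(max_gap, abs(a - b))
print(max_gap)
```

The gap grows roughly exponentially (doubling per step on average for r = 4) until it saturates at the size of the attractor itself.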
History
The first discoverer of chaos can plausibly be argued to be Jacques Hadamard, who in
1898 published an influential study of the chaotic motion of a free particle gliding
frictionlessly on a surface of constant negative curvature. In the system studied,
Hadamard's billiards, Hadamard was able to show that all trajectories are unstable, in that
all particle trajectories diverge exponentially from one-another, with positive Lyapunov
exponent. In the early 1900s, Henri Poincaré while studying the three-body problem,
found that there can be orbits which are nonperiodic, and yet not forever increasing nor
approaching a fixed point. Much of the early theory was developed almost entirely by
mathematicians, under the name of ergodic theory. Later studies, also on the topic of
nonlinear differential equations, were carried out by G.D. Birkhoff, A.N. Kolmogorov,
M.L. Cartwright, J.E. Littlewood, and Stephen Smale. Except for Smale, these studies
were all directly inspired by physics: the three-body problem in the case of Birkhoff,
turbulence and astronomical problems in the case of Kolmogorov, and radio engineering
in the case of Cartwright and Littlewood. Although chaotic planetary motion had not
been observed, experimentalists had encountered turbulence in fluid motion and
nonperiodic oscillation in radio circuits without the benefit of a theory to explain what
they were seeing.
Chaos theory progressed more rapidly after mid-century, when it first became evident for
some scientists that linear theory, the prevailing system theory at that time, simply could
not explain the observed behavior of certain experiments like that of the logistic map.
The main catalyst for the development of chaos theory was the electronic computer.
Much of the mathematics of chaos theory involves the repeated iteration of simple
mathematical formulas, which would be impractical to do by hand. Electronic computers
made these repeated calculations practical. One of the earliest electronic digital
computers, ENIAC, was used to run simple weather forecasting models.
An early pioneer of the theory was Edward Lorenz whose interest in chaos came about
accidentally through his work on weather prediction in 1961. Lorenz was using a basic
computer, a Royal McBee LGP-30, to run his weather simulation. He wanted to see a
sequence of data again, and to save time he started the simulation in the middle of its run by entering numbers from an earlier printout. To his surprise, the weather that the machine began to predict was completely different from the weather calculated before. Lorenz tracked this down to the computer printout.
The printout rounded variables off to a 3-digit number, but the computer worked with 6-
digit numbers. This difference is tiny and the consensus at the time would have been that
it should have had practically no effect. However Lorenz had discovered that small
changes in initial conditions produced large changes in the long-term outcome.
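Lorenz's accident can be reproduced in miniature. The sketch below substitutes the logistic map for his weather model (the seed value and run lengths are arbitrary choices): a run restarted from a mid-run value rounded to three digits, as his printout did, soon diverges from the original run.

```python
# A miniature version of Lorenz's accident: iterate a chaotic map,
# take a mid-run "printout" value rounded to 3 digits, restart from
# it, and watch the restarted run diverge from the original.
r = 4.0
x = 0.123456
orig = [x]
for _ in range(100):
    x = r * x * (1 - x)
    orig.append(x)

y = round(orig[50], 3)      # restart from the rounded "printout" value
gap = 0.0
for i in range(50, 100):
    y = r * y * (1 - y)
    gap = max(gap, abs(y - orig[i + 1]))
print(gap)
```

An initial discrepancy of at most 0.0005 is amplified to order one well within the fifty remaining steps, just as Lorenz's six-digit machine state diverged from his three-digit printout.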
The term chaos as used in mathematics was coined by the applied mathematician James
A. Yorke.
The availability of cheaper, more powerful computers has broadened the applicability of chaos theory. Chaos theory remains a very active area of research.
Bifurcation theory
Extended Consciousness
Bell's theorem
Bell's theorem is the most famous legacy of the late John Bell. It is notable for showing
that the predictions of quantum mechanics (QM) differ from those of intuition. It is
simple and elegant, and touches upon fundamental philosophical issues that relate to
modern physics. In its simplest form, Bell's theorem states:
No physical theory of local hidden variables can ever reproduce all of the
predictions of quantum mechanics.
This theorem has even been called "the most profound in science" (Stapp, 1975). Bell's
seminal 1964 paper was entitled "On the Einstein Podolsky Rosen paradox". The Einstein
Podolsky Rosen paradox (EPR paradox) assumes local realism, the intuitive notion that
particle attributes have definite values independent of the act of observation and that
physical effects have a finite propagation speed. Bell showed that local realism leads to a
requirement for certain types of phenomena that are not present in quantum mechanics.
This requirement is called Bell's inequality.
The inequalities concern measurements made by observers (often called Alice and Bob)
on entangled pairs of particles that have interacted and then separated. Hidden variable
assumptions limit the correlation of subsequent measurements of the particles. Bell
discovered that under quantum mechanics this correlation limit may be violated.
Quantum mechanics lacks local hidden variables associated with individual particles, and
so the inequalities do not apply to it. Instead, it predicts correlation due to quantum
entanglement of the particles, allowing their state to be well defined only after a
measurement is made on either particle. That restriction agrees with the Heisenberg
uncertainty principle, one of the most fundamental concepts in quantum mechanics.
Per Bell's theorem, either quantum mechanics or local realism is wrong. Experiments
were needed to determine which is correct, but it took many years and many
improvements in technology to perform them.
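The quantitative content of the violation can be sketched with the CHSH form of Bell's inequality. Under local realism the combination S below is bounded by 2; the quantum singlet-state prediction E(a, b) = −cos(a − b), evaluated at the conventional optimal angle settings (a choice of illustration, not anything specified in the text), gives S = 2√2 ≈ 2.83.

```python
import math

# Singlet-state correlation predicted by quantum mechanics for
# detector settings a and b (angles in radians).
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH settings: Alice chooses a or a2, Bob chooses b or b2.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

# Local hidden-variable theories require S <= 2.
S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)
```

Any measured S above 2 therefore rules out the whole class of local hidden-variable models, which is exactly what the Bell test experiments report.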
Bell test experiments to date overwhelmingly show that the inequalities of Bell's theorem
are violated. This provides empirical evidence against local realism and demonstrates that
some of the "spooky action at a distance" suggested by the famous Einstein Podolsky
Rosen (EPR) thought experiment does in fact occur. The violations are also taken as positive
evidence in favor of QM. The principle of special relativity is saved by the no-
communication theorem, which proves that the observers cannot use the
inequality violations to communicate information to each other faster than the speed of
light.
John Bell's papers examined both John von Neumann's 1932 proof of the incompatibility
of hidden variables with QM and Albert Einstein and his colleagues' seminal 1935 paper
on the subject.
After EPR, quantum mechanics was left in the unsatisfactory position that it was either
incomplete in the sense that it failed to account for some elements of physical reality, or
it violated the principle of finite propagation speed of physical effects. In the EPR
thought experiment, two observers, now commonly referred to as Alice and Bob, perform
independent measurements of spin on a pair of electrons, prepared at a source in a special
state called a spin singlet state. It was a conclusion of EPR that once Alice measured spin
in one direction (e.g. on the x axis), Bob's measurement in that direction was determined
with certainty, whereas immediately before Alice's measurement, Bob's outcome was
only statistically determined. Thus, either the spin in each direction is not an element of
physical reality or the effects travel from Alice to Bob instantly.
The desire for a local realist theory was based on two ideas: first, that objects have a
definite state that determines the values of all other measurable properties such as
position and momentum and second, that (as a result of special relativity) effects of local
actions such as measurements cannot travel faster than the speed of light. In the
formalization of local realism used by Bell, the predictions of a theory result from the
application of classical probability theory to an underlying parameter space. By a simple
(but clever) argument based on classical probability he then showed that correlations
between measurements are bounded in a way that is violated by QM.
Bell's theorem seemed to seal the fate of those who had local realist hopes for QM.
Bell considered a setup in which two observers, Alice and Bob, perform independent
measurements on a system S prepared in some fixed state. Each observer has a detector
with which to make measurements. On each trial, Alice and Bob can
independently choose between various detector settings. Alice can choose a detector
setting a to obtain a measurement A(a) and Bob can choose a detector setting b to
measure B(b). After repeated trials Alice and Bob collect statistics on their measurements
and correlate the results.
There are two key assumptions in Bell's analysis: (1) each measurement reveals an objective physical property of the system; and (2) a measurement taken by one observer has no effect on the measurement taken by the other.
Notable quotes
"Some recent popularizers of Bell's work when confronted with [Bell's inequality]
have gone on to claim that telepathy is verified or the mystical notion that all parts
of the universe are instantaneously interconnected is vindicated. Others assert that
this implies communication faster than the speed of light. That is rubbish; the
quantum theory and Bell's inequality imply nothing of this kind. Individuals who
make such claims have substituted a wish-fulfilling fantasy for understanding. If
we closely examine Bell's experiment we will see a bit of sleight of hand by the
God that plays dice which rules out actual nonlocal influences. Just as we think
we have captured a really weird beast--like acausal influences--it slips out of our
grasp. The slippery property of quantum reality is again manifested."
Bell's inequalities are tested by "coincidence counts" from a Bell test experiment such as
the optical one shown in the diagram. Pairs of particles are emitted as a result of a
quantum process, analysed with respect to some key property such as polarisation
direction, then detected. The settings (orientations) of the analysers are selected by the experimenter.
Bell test experiments to date overwhelmingly suggest that Bell's inequality is violated.
Indeed, a table of Bell test experiments performed prior to 1986 is given in section 4.5 of (Redhead, 1987). Of the thirteen experiments listed, only two reached results contradictory to quantum mechanics.
Nevertheless, the issue is not conclusively settled. According to Shimony's 2004 Stanford
Encyclopedia overview article
Some advocates of the hidden variables idea prefer to accept the opinion that experiments
have ruled out local hidden variables. They are ready to give up locality (and probably
also causality), explaining the violation of Bell's inequality by means of a "non-local"
hidden variable theory, in which the particles exchange information about their states.
This is the basis of the Bohm interpretation of quantum mechanics.
Finally, one subtle assumption of the Bell inequalities is counterfactual definiteness. The
derivation refers to several objective properties that cannot all be measured for any given
particle, since the act of taking the measurement changes the state. Under local realism
the difficulty is readily overcome, so long as we can assume that the source is stable,
producing the same statistical distribution of states for all the subexperiments. If this
assumption is felt to be unjustifiable, though, one can argue that Bell's inequality is
unproven. In the Everett many-worlds interpretation, the assumption of counterfactual
definiteness is abandoned, this interpretation assuming that the universe branches into
many different observers, each of whom measures a different observation. Hence many
worlds can adhere to both the properties of philosophical realism and the principle of
locality and not violate Bell's conditions -- the only interpretation that can do this.
In physics, the principle of locality is that distant objects cannot have direct influence on
one another: an object is influenced directly only by its immediate surroundings. This
was stated as follows by Albert Einstein in his article "Quantum Mechanics and Reality"
("Quanten-Mechanik und Wirklichkeit", Dialectica 2:320-324, 1948):
“The following idea characterises the relative independence of objects far apart in space
(A and B): external influence on A has no direct influence on B; this is known as the
Principle of Local Action, which is used consistently only in field theory. If this axiom
were to be completely abolished, the idea of the existence of quasienclosed systems, and
thereby the postulation of laws which can be checked empirically in the accepted sense,
would become impossible.”
Local realism is the combination of the principle of locality with the "realistic"
assumption that all objects must objectively have their properties already before these
properties are observed. Einstein liked to say that the Moon is "out there" even when no
one is observing it.
Nonlocality
A physical theory is said to exhibit strict nonlocality if, in that theory, it is not possible to treat widely separated systems as independent. The term is most often reserved, however, for interactions supposed to occur outside the past light cone. Nonlocality does not
necessarily imply a lack of causality. For instance, Newtonian gravitation is nonlocal
because it involves instantaneous action-at-a-distance but Newtonian mechanics is
certainly causal. Effects that appear nonlocal in quantum mechanics, some physicists say, actually obey locality; in these cases, the nonlocal interaction affects correlations that are considered, within the Copenhagen interpretation of quantum mechanics, to pertain to states of matter that result from wave-function collapse upon measurement of unreal states composed of a sum of mutually exclusive possibilities, e.g., the singlet state. Einstein
criticised this interpretation of quantum mechanics on the grounds that these effects
employed "spooky instantaneous action at a distance". This issue is very closely related
to Bell's theorem and the EPR paradox. Quantum field theory, on the other hand, which is
the relativistic generalization of quantum mechanics, contains mathematical features that
assure locality, so that nonrelativistic quantum mechanics should be local as well; hence the puzzle posed by the EPR paradox.
The Bohm interpretation preserves realism, and it must violate the principle of locality to achieve the required correlations.
Because the differences between the interpretations are mostly philosophical (except for the Bohm and many-worlds interpretations), physicists usually use language in which the important statements are independent of the chosen interpretation. In this framework, only measurable action at a distance - a superluminal propagation of real, physical information - would usually be considered a violation of locality by physicists. Such phenomena have never been seen, and they are not predicted by current theories (with the possible exception of the Bohm theory).
Locality is one of the axioms of relativistic quantum field theory, as required for
causality. The formalization of locality in this case is as follows: if we have two
observables, each localized within one of two distinct spacetime regions that are at
spacelike separation from each other, the observables must commute. This interpretation
of the word "locality" is closely related to the relativistic notion of locality in
physics: a solution is local if the underlying equations are either Lorentz invariant or,
more generally, generally covariant or locally Lorentz invariant.
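The commutation requirement above (often called microcausality) can be written compactly. The notation below - observables O1, O2 and a mostly-minus metric signature - is an illustrative convention assumed here, not taken from the source:

```latex
% Microcausality: observables localized in spacelike-separated
% regions must commute. With metric signature (+,-,-,-),
% spacelike separation of x and y means (x - y)^2 < 0.
\[
  \left[\, \mathcal{O}_1(x),\; \mathcal{O}_2(y) \,\right] = 0
  \qquad \text{whenever } (x - y)^2 < 0 .
\]
```

In words: no measurement made in one region can influence the statistics of a measurement made in a spacelike-separated region, which is exactly the causality requirement stated above.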
Tao
Tao or Dao refers to a Chinese character that was of pivotal meaning in ancient Chinese
philosophy and religion. In its most generic meaning, it refers to the "head path," and is
generally translated into English as "The Way".
The semantics of Tao vary widely depending on the context, and the word may variously
refer to a concept of religion, morality, duty, knowledge, rationality, ultimate truth,
path, or taste. The CEDICT allows several different definition words for Tao, as it varies
in translation: direction, way, method, road, path, principle, truth, reason, skill, Tao
(of Taoism), a measure word, to say, to speak, and to talk.
Tao is central to Taoism, but Confucianism also uses it to refer to "The Way," or the
"noble way" of personal conduct in life. The philosophic and religious use of the
character can be analyzed in two main segments: one meaning is "doctrine" or
"discourse"; every school owns and defends a specific Tao or discourse about doctrine. In
the other meaning, there is the 'Great Tao', that is the source of and guiding principle
behind all the processes of the universe. Beyond being and non-being, prior to space and
time, Tao is the intelligent ordering principle behind the unceasing flow of change in the
natural world. In this sense Tao gains great cosmological and metaphysical significance
comparable to the theistic concept of God; the Greek concept of the logos; or the Dharma
in Indian religions.
The nature and meaning of the Tao received its first full exposition in the Tao Te Ching
of Laozi, a work which along with those of Confucius and Mencius would have a far-
reaching effect on the intellectual, moral and religious life of the Chinese
people. Although a book of practical wisdom in many ways, its profoundly metaphysical
character was unique among the prevailing forms of thought in China at that time. The
religion and philosophy based on the teaching of Laozi and his successor Zhuangzi is
known in English as "Taoism." Even though the Tao is often said to be undefinable and
unexplainable with words (even Chinese ones), the present article focuses on the Tao of
Taoism.
There is a flow in the universe, and it is called dao. Dao flows slowly; it is never
stagnant, however, and is incredibly powerful, keeping things in the universe balanced
and in order. It manifests itself through the change of seasons, the cycle of life, shifts
of power, time, and so forth. Dao has a strong and deep connection with cosmology and the
natural world, as the most well-known Daoist philosophers, Laozi and Zhuangzi, agreed.
Dao is the law of Nature. When you follow dao, you become one with it. It is best also to
understand chi, because chi and dao go hand in hand. Chi is a Chinese term that is
translated as breath, vapour, and energy. Because chi is the energy that circulates
through the universe, it can be said that dao is ultimately a flow of chi. Being one with
dao brings the best outcomes, because that way things fall into the places they are meant
to be.
The concept of Tao is based upon the understanding that the only constant in the universe
is change (cf. the I Ching, the "Book of Changes") and that we must understand and be in
harmony with this change. The change is a constant flow from non-being into being,
potential into actual, yin into yang, female into male. The symbol of the Tao, called the
Taijitu, is the yin yang confluently flowing into itself in a circle.
The Tao is the main theme discussed in the Tao Te Ching, an ancient Chinese scripture
attributed to Lao Tsu. This book does not specifically define what the Tao is; it affirms
as much in its first sentence: "The Tao that can be told of is not an Unvarying Tao" (tr.
Waley, modified). Instead, it points to some characteristics of what could be understood
as being the Tao. Below are some excerpts from the book.
• Tao as the origin of things: “Tao begets one; One begets two; Two begets three;
Three begets the myriad creatures.” (TTC 42, tr. Lau, modified)
• Tao as an inexhaustible nothingness: “The Way is like an empty vessel / That yet
may be drawn from / Without ever needing to be filled.” (TTC 4, tr. Waley)
• Tao is omnipotent and infallible: “What Tao plants cannot be plucked, what Tao
clasps, cannot slip.” (TTC 54, tr. Waley)
In the Yi Jing, a sentence closely relates Tao to Yin-Yang or Taiji, asserting that "one
(phase of) Yin, one (phase of) Yang, is what is called the Tao". Being thus placed at the
conjunction of the Yin and Yang alternation, Tao can be understood as the continuity
principle that underlies the constant evolution of the world.
Hyperspace
In science fiction, hyperspace is any region of space co-existing with our own universe
(in some cases displaced in an extra spatial dimension) which may be entered using some
sort of energy field or space-altering device. While hyperspace is in some way anchored
to the normal universe, its properties are not the same as those of normal space, so
traveling in hyperspace is largely not equivalent to traveling in normal space. This
makes for a handy
explanation of faster than light (FTL) travel: while the shortest distance between two
points in normal space is a straight line, hyperspace allows those points to be closer
together, or a curved line in normal space to be straight, etc. Hyperspace is the most
common device used for explaining FTL in a science fiction story where FTL is
necessary for interstellar travel or intergalactic travel. Spacecraft able to use hyperspace
for FTL travel are said to have hyperdrive.
Subspace
Subspace is a term used in many different science fiction media to explain many different
concepts. Most often, subspace is used as a means to justify faster-than-light transit, in
the form of interstellar travel or the transmission of information. Subspace is loosely
associated at times with certain ideas expressed in string theory, which state that the
Universe is not limited to four dimensions; there may be upwards of ten, which we do not
readily perceive but which affect us nonetheless. By exploiting these higher dimensions,
thus circumventing the limitations of the four we are most accustomed to, FTL speeds are
imitated (or potentially achieved). Subspace is also comparable to hyperspace in science
fiction; the two ideas are often interchangeable and applied in similar fashion.
In most Star Trek series, subspace communications are a means to (usually) establish
instantaneous contact with people and places that are light-years away. The physics of
Star Trek describe infinite speed (expressed as Warp 10) as an impossibility; as such,
even subspace communications, which putatively travel at speeds over Warp 9.9, may take
hours or weeks to reach certain destinations. Once the connection is made, however,
communication between the two points often becomes instantaneous.