Quantum Physics Basics

Introduction
The following is extracted mainly from the online encyclopedia Wikipedia
(http://en.wikipedia.org) and is intended to give a simple overview. In some cases this is
by analogy, and this should not be taken as any lack of robustness in the underlying concepts.

For non-English speakers, the online Wikipedia has an automatic translation facility.

There is also a listing of Quantum Physics and Quantum Biology related books in
Biofeedback Resource Books.pdf, available at http://imune.net/index/downloads.html

Contents

Introduction
Thermodynamics
Entropy
Quantum biology
Quantum World: Instant Expert
   The birth of an idea
   Quantum weirdness
   Uncertainty rules
   Secure networks
   Quantum gravity
   Economies of scale
Quantum Mechanics
   Background
   Old quantum theory
   Planck's constant
   Bohr atom
   Wave-particle duality
   Development of modern quantum mechanics
   Schrödinger wave equation
   Uncertainty Principle
   Quantum entanglement
   Interpretations: the quantum micro and the Newtonian macro world
   Consciousness causes collapse
Quantum Electrodynamics
   Physical interpretation of QED
   History
Butterfly effect
Fractal
   History
   The fractional dimension of the boundary of the Koch snowflake
   Generating fractals
   Classification of fractals
   Fractals in nature


Chaos theory
   History
Bifurcation theory
Extended Consciousness
Bell's theorem
   Importance of the theorem
   Bell's thought experiment
   Notable quotes
   Bell test experiments
   Implications of violation of Bell's inequality
Locality (Local Universe)
Nonlocality
Tao
   Some characteristics of Tao
Subspace - Hyperspace (science fiction)
   Subspace (Star Trek)

Thermodynamics

Thermodynamics (from the Greek therme, meaning heat, and dynamis, meaning power)
is a branch of physics that studies the effects of changes in temperature, pressure, and
volume on physical systems at the macroscopic scale by analyzing the collective motion
of their particles using statistics. Roughly, heat means "energy in transit" and dynamics
relates to "movement"; thus, in essence thermodynamics studies the movement of energy
and how energy instills movement. Historically, thermodynamics developed out of the
need to increase the efficiency of early steam engines.

The starting point for most thermodynamic considerations is the set of laws of
thermodynamics, which postulate that energy can be exchanged between physical
systems as heat or work. They also postulate the existence of a quantity named entropy,
which can be defined for any system. In thermodynamics, interactions between large
ensembles of objects are studied and categorized. Central to this are the concepts of
system and surroundings. A system is composed of particles, whose average motions
define its properties, which in turn are related to one another through equations of state.
Properties can be combined to express internal energy and thermodynamic potentials, which are
useful for determining conditions for equilibrium and spontaneous processes.

With these tools, thermodynamics describes how systems respond to changes in their
surroundings. This can be applied to a wide variety of topics in science and engineering,
such as engines, phase transitions, chemical reactions, transport phenomena, and even
black holes. The results of thermodynamics are essential for other fields of physics and
for chemistry, chemical engineering, cell biology, biomedical engineering, and materials
science to name a few.

Entropy

In a system made up of quantities of matter, pressure differences, density differences, and
temperature differences all tend to equalize over time. The system's entropy, which
increases during this process, is a measure of how far the equalization has progressed. For
example, take a system consisting of a cup of hot water in a cool room. Over time the
water will tend to cool and evaporate, and the room will warm up slightly. The system's
heat has become more evenly distributed, and thus the entropy of the system (the cup of
water plus the room) has increased.

Entropy is often described as "a measure of the disorder of a thermodynamic system" or
"how mixed up the system is". Such statements should be approached with care, as the
terms "disorder" and "mixed up" are not well defined. The "disorder" of the system
as a whole can be formally defined (as discussed below) in a way that is consistent with
the realities of entropy, but loose use of the word will almost always lead to
confusion. It is only when the word is used in this special sense that a system that is more
"disordered" or more "mixed up" on a molecular scale is necessarily also "a system
with a lower amount of energy available to do work" or "a system in a macroscopically
more probable state".

The entropy of a thermodynamic system can be interpreted in two distinct, but
compatible, ways:

From a macroscopic perspective, in classical thermodynamics the entropy is interpreted
simply as a state function of a thermodynamic system: that is, a property depending only
on the current state of the system, independent of how that state came to be achieved. The
state function has the important property that, when multiplied by a reference
temperature, it can be understood as a measure of the amount of energy in a physical
system that cannot be used to do thermodynamic work, i.e., work mediated by thermal
energy. More precisely, in any process where the system gives up energy ΔE and its
entropy falls by ΔS, a quantity of at least T_R·ΔS of that energy must be given up to the
system's surroundings as unusable heat, where T_R is the temperature of the system's
external surroundings. Otherwise the process will not go forward.
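
As a rough worked example of this relation (all numbers below are invented purely for
illustration and are not taken from the source):

    # Minimal sketch: energy that must be discarded as unusable heat when a system
    # gives up energy delta_E while its entropy falls by delta_S, with surroundings
    # at temperature T_R. Values are illustrative assumptions only.
    T_R = 300.0      # K, temperature of the surroundings
    delta_E = 100.0  # J, energy given up by the system
    delta_S = 0.2    # J/K, entropy decrease of the system

    unusable_heat = T_R * delta_S        # at least this much must leave as heat: 60 J
    max_work = delta_E - unusable_heat   # at most this much is available as work: 40 J
    print(unusable_heat, max_work)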

From a microscopic perspective, in statistical thermodynamics the entropy is envisioned
as a measure of the number of microscopic configurations that are capable of yielding the
observed macroscopic description of the thermodynamic system. A more “disordered” or
“mixed up” system can thus be formally defined as one which has more microscopic
states compatible with the macroscopic description; however, this definition is not
standard and is thus prone to cause confusion. It can be shown that this definition of
entropy, sometimes referred to as Boltzmann’s postulate, reproduces all of the properties
of the entropy of classical thermodynamics.
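
Boltzmann's postulate can be written S = k_B ln W, where W is the number of microscopic
configurations compatible with the macroscopic description. A minimal sketch in Python (the
100-particle two-state system is an assumed toy example):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(W):
        """S = k_B * ln(W): entropy from the number of compatible microstates."""
        return k_B * math.log(W)

    # Toy macrostate: 100 two-state particles (e.g. spins), 50 of them "up".
    N, n = 100, 50
    W = math.comb(N, n)          # number of microstates with that macroscopic description
    print(W)                     # ~1.01e29 configurations
    print(boltzmann_entropy(W))  # ~9.2e-22 J/K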

An important law of physics, the second law of thermodynamics, states that the total
entropy of any isolated thermodynamic system tends to increase over time, approaching a
maximum value. Unlike almost all other laws of physics, this associates thermodynamics
with a definite arrow of time. However, for a universe of infinite size, which
cannot be regarded as an isolated system, the second law does not apply.

Quantum biology

Quantum biology is the science of studying biological processes in terms of quantum
mechanics. In exploring quantum mechanical explanations for biological phenomena, the
nascent science of quantum biology represents one of the first efforts to apply quantum
theory to systems much more macroscopic than the atomic or subatomic realms generally
described by quantum theory.

The following biological phenomena have been studied in terms of quantum processes:

• the absorbance of frequency-specific radiation (e.g., in photosynthesis and vision);
• the conversion of chemical energy into motion;
• magnetoreception in animals.

Quantum biological research is extremely limited by computer processing power; the
analytical power required to model quantum effects increases exponentially with the
number of particles involved.
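
The exponential cost can be made concrete: a general quantum state of n two-level systems
requires 2^n complex amplitudes, so even storing the state quickly outgrows any computer. A
minimal sketch (16 bytes per complex amplitude is an assumption):

    # Memory needed to store a full state vector of n two-level systems (qubits),
    # assuming 16 bytes per complex amplitude.
    def state_vector_gigabytes(n):
        return (2 ** n) * 16 / 1e9

    for n in (10, 20, 30, 40, 50):
        print(n, state_vector_gigabytes(n), "GB")
    # 30 particles already need ~17 GB; 50 particles need ~18 million GB.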

Quantum World: Instant Expert


If successful scientific theories can be thought of as cures for stubborn problems,
quantum physics was the wonder drug of the 20th century. It successfully explained
phenomena such as radioactivity and antimatter, and no other theory can match its
description of how light and particles behave on small scales.

But it can also be mind-bending. Quantum objects can exist in multiple states and places
at the same time, requiring a mastery of statistics to describe them. Rife with uncertainty
and riddled with paradoxes, the theory has been criticised for casting doubt on the notion
of an objective reality - a concept many physicists, including Albert Einstein, have found
hard to swallow.

Today, scientists are grappling with these philosophical conundrums, trying to harness
quantum's bizarre properties to advance technology, and struggling to weave quantum
physics and general relativity into a seamless theory of quantum gravity.

The birth of an idea

Quantum theory began to take shape in the early 20th century, when classical ideas failed
to explain some observations. Previous theories allowed atoms to vibrate at any
frequency, leading to incorrect predictions that they could radiate infinite amounts of
energy - a problem known as the ultraviolet catastrophe.

In 1900, Max Planck solved this problem by assuming atoms can vibrate only at specific,
or quantised, frequencies. Then, in 1905, Einstein cracked the mystery of the
photoelectric effect, whereby light falling on metal releases electrons of specific energies.
The existing theory of light as waves failed to explain the effect, but Einstein provided a
neat solution by suggesting light came in discrete packages of energy called photons - a
brain wave that won him the Nobel Prize for Physics in 1921.

Quantum weirdness

In fact, light's chameleon-like ability to behave as either a particle or a wave, depending
on the experimental setup, has long stymied scientists. Danish physicist Niels Bohr
explained this wave-particle duality by doing away with the concept of a reality separate
from one's observations. In his "Copenhagen interpretation", Bohr argued that the very
act of measurement affects what we observe.

One controversial experiment recently challenged this either/or scenario of light by
apparently detecting evidence of both wave- and particle-like behaviour simultaneously.
The work suggests there may be no such thing as photons - light appears quantised only
because of the way it interacts with matter.

Other interpretations of quantum theory - of which there are at least half a dozen - deal
with the measurement problem by suggesting even more far-fetched concepts than a
universe dependent on measurement. The popular many worlds interpretation suggests
quantum objects display several behaviours because they inhabit an infinite number of
parallel universes.

Uncertainty rules

For about 70 years, this wave-particle duality was explained by another unsettling tenet
of quantum theory - the Heisenberg uncertainty principle. Formulated by Werner
Heisenberg in 1927 and recently made more precise, the principle puts a fundamental limit on
knowledge. It says one can never know both the position and momentum of a quantum
object exactly - measuring one invariably changes the other.

Bohr defeated Einstein in a series of thought experiments in the 1920s and 1930s using
this principle, but more recent work suggests the underlying cause of the duality seen in
experiments is a phenomenon called entanglement.

Entanglement is the idea that in the quantum world, objects are not independent if they
have interacted with each other or come into being through the same process. They
become linked, or entangled, such that changing one invariably affects the other, no
matter how far apart they are - something Einstein called "spooky action at a distance".

This may be involved in superconductivity and may even explain why objects have mass.
It also holds promise for "teleporting" particles across vast distances - assuming everyone
agrees on a reference frame. The first teleportation of a quantum state occurred in 1998,
and scientists have been gradually entangling more and more particles, different kinds of
particles, and large particles.

Secure networks

Entanglement may also provide a nearly uncrackable method of communication.
Quantum cryptographers can send "keys" to decode encrypted information using
quantum particles. Any attempt to intercept the particles will disturb their quantum state -
an interference that could then be detected.
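
The source does not name a particular protocol, but the intercept-detection idea can be
illustrated with a toy, BB84-style simulation (an assumption, and greatly simplified): bits are
encoded in randomly chosen bases, an eavesdropper who guesses the wrong basis randomizes the
result, and comparing a sample of the sifted key then reveals an elevated error rate.

    import random

    def sifted_key_error_rate(n_rounds=20000, eavesdrop=True):
        """Toy BB84-style sketch: error rate on the key after basis sifting."""
        errors = kept = 0
        for _ in range(n_rounds):
            bit = random.randint(0, 1)
            basis_a = random.randint(0, 1)   # sender's basis
            basis_e = random.randint(0, 1)   # eavesdropper's basis
            basis_b = random.randint(0, 1)   # receiver's basis

            value, basis = bit, basis_a
            if eavesdrop:
                # A wrong-basis measurement gives a random result and re-prepares the photon.
                if basis_e != basis:
                    value = random.randint(0, 1)
                basis = basis_e

            received = value if basis_b == basis else random.randint(0, 1)

            if basis_b == basis_a:           # sifting: keep only matching-basis rounds
                kept += 1
                errors += received != bit
        return errors / kept

    print(sifted_key_error_rate(eavesdrop=False))  # ~0.0  : undisturbed key agrees
    print(sifted_key_error_rate(eavesdrop=True))   # ~0.25 : interception shows up as errors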

In April 2004, Austrian financial institutions performed the first money transfer
encrypted by quantum keys, and in June, the first encrypted computer network with more
than two nodes was set up across 10 kilometres in Cambridge, Massachusetts, US.

But keeping quantum particles entangled is a tricky business. Researchers are working on
how to maximise the particles' signal and distance travelled. Using a sensitive photon
detector, researchers in the UK recently sent encrypted photons down the length of a 100-
kilometre fibre optic cable. Researchers in the US devised a scheme to entangle
successive clouds of atoms in the hopes of one day making a quantum link between the
US cities of Washington, DC, and New York.

Lightning-fast computers

Quantum computers are another long-term goal. Because quantum particles can exist in
multiple states at the same time, they could be used to carry out many calculations at
once, factoring a 300-digit number in just seconds compared to the years required by
conventional computers.

But to maintain their multi-state nature, particles must remain isolated long enough to
carry out the calculations - a very challenging condition. Nonetheless, some progress has
been made in this area. A trio of electrons, the building blocks of classical computers,
were entangled in a semiconductor in 2003, and the first quantum calculation was made
with a single calcium ion in 2002. In October 2004, the first quantum memory component
was built from a string of caesium atoms.

But particles of matter interact so easily with others that their quantum states are
preserved for very short times - just billionths of a second. Photons, on the other hand,
maintain their states about a million times longer because they are less prone to interact
with each other. But they are also hard to store, as they travel, literally, at the speed of
light.

In 2001, scientists managed to stop light in its tracks, overcoming one practical hurdle.
And the first quantum logic gate - the brains behind quantum computers - was created
with light in 2003.

Quantum gravity

While three of the four fundamental forces of nature - those operating on very small
scales - are well accounted for by quantum theory, gravity is its Achilles heel. This force
works on a much larger scale and quantum theory has been powerless so far to explain it.

A number of bizarre theories have been proposed to bridge this gap, many
of which suggest that the very fabric of space-time bubbles up with random quantum
fluctuations - a foam of wormholes and infinitesimal black holes.

Such a foam is thought to have filled the universe during the big bang, dimpling space-
time so that structures such as stars and galaxies could later take shape.

The most popular quantum gravity theory says that particles and forces arise from the
vibrations of tiny loops - or strings - just 10^-35 metres long. Another says that space and
time are discrete at the smallest scales, emerging from abstractions called "spin
networks".

One recent theory, called "doubly special relativity", tweaks Einstein's idea of one cosmic
invariant - the speed of light - and adds another at a very small scale. The controversial
theory accounts for gravity, inflation, and dark energy. Physicists are now devising
observations and experiments that could test the competing theories.

Economies of scale

Quantum physics is usually thought to act on light and particles smaller than molecules.
Some researchers believe there must be some cut-off point where classical physics takes
over, such as the point where the weak pull of gravity overwhelms other forces (in fact,
gravity's effect on neutrons was recently measured). But macroscopic objects can obey
quantum rules if they don't become entangled with their environment.

Certainly, harnessing troops of atoms or photons that follow quantum laws holds great
technological promise. Recent work cooling atoms to near absolute zero has produced
new forms of matter called Bose-Einstein and fermionic condensates. These have been
used to create laser beams made of atoms that etch precise patterns on surfaces, and
might one day lead to superconductors that work at room temperature.

All of these hopes suggest that, as queasy as quantum can be, it remains likely to be the
most powerful scientific cure-all for years to come.

Quantum Mechanics

Quantum mechanics is a physical science dealing with the behaviour of matter and
waves on the scale of atoms and subatomic particles. It also forms the basis for the
contemporary understanding of how large objects such as stars and galaxies, and
cosmological events such as the Big Bang, can be analyzed and explained. Its acceptance
by the general physics community is due to its accurate prediction of the physical
behaviour of systems, including systems where Newtonian mechanics fails. This
difference between the success of classical and quantum mechanics is most often
observed in systems at the atomic scale or smaller, or at very low or very high energies,
or at extremely low temperatures. Quantum mechanics is the basis of modern
developments in chemistry, molecular biology, and electronics, and the foundation for the
technology that has transformed the world in the last fifty years.

Background

Through a century of experimentation and applied science, quantum mechanical theory
has proven to be very successful and practical. The term "quantum mechanics" was first
coined by Max Born in 1924. Quantum mechanics is the foundation for other sciences
including condensed matter physics, quantum chemistry, and particle physics.

Despite the success of quantum mechanics, it does have some controversial elements. For
example, the behaviour of microscopic objects described in quantum mechanics is very
different from our everyday experience, which may provoke an incredulous reaction.
Moreover, some of the consequences of quantum mechanics appear to be inconsistent
with the consequences of other successful theories, such as Einstein's Theory of
Relativity, especially general relativity.

Some of the background of quantum mechanics dates back to the early 1800s, but the
real beginnings of quantum mechanics date from the work of Max Planck in 1900[1].
Albert Einstein[2], Niels Bohr[3], and Louis de Broglie[4] soon made important
contributions. However, it was not until the mid-1920s that a more complete picture
emerged, and the true importance of quantum mechanics became clear. Some of the most
prominent scientists to contribute were Max Born[5], Paul Dirac[6], Werner Heisenberg[7],
Wolfgang Pauli[8], and Erwin Schrödinger[9].

Later, the field was further expanded through work by Julian Schwinger, Murray Gell-Mann,
and Richard Feynman, in particular with the development of Quantum Electrodynamics
from 1947 onwards.

[Figure: The interference that produces colored bands on bubbles cannot be explained by a
model that depicts light as a particle. It can be explained by a model that depicts it as a
wave. The drawing shows sine waves, resembling waves on the surface of water, being
reflected from two surfaces of a film of varying width; that depiction of the wave nature of
light is only a crude analogy.]

Early researchers differed in their explanations of the fundamental nature of what we now
call electromagnetic radiation.
In 1690, Christiaan Huygens explained the laws of reflection and refraction on the basis of
a wave theory.[10] Sir Isaac Newton believed that light consisted of particles, which he
designated corpuscles. In the early nineteenth century, Thomas Young and Augustin Fresnel
performed experiments on interference that showed that a corpuscular theory of light was
inadequate. Then in 1873 James Clerk Maxwell showed that by making an electrical
circuit oscillate it should be possible to produce electromagnetic waves. His theory made
it possible to compute the speed of electromagnetic radiation purely on the basis of
electrical and magnetic measurements, and the computed value corresponded very
closely to the empirically measured speed of light.[11] In 1888, Heinrich Hertz made an
electrical device that actually produced what we would now call microwaves —
essentially radiation at a lower frequency than visible light.[12] Everything up to that point
suggested that Newton had been entirely wrong to regard light as corpuscular. Then it
was discovered that when light strikes an electrical conductor it causes electrons to move
away from their original positions, and, furthermore, the phenomenon observed could
only be explained if the light delivered energy in definite packets. In a photoelectric
device such as the light meter in a camera, when light hits the metallic detector electrons
are caused to move. Greater intensities of light at one frequency can cause more electrons
to move, but they will not move any faster. In contrast, higher frequencies of light can
cause electrons to move faster. So intensity of light controls the amperes of current
produced, but frequency of light controls the voltage produced. This appeared to raise a
contradiction when compared to sound waves and ocean waves, where only intensity was
needed to predict the energy of the wave. In the case of light, frequency appeared to
predict energy. Something was needed to explain this phenomenon and also to reconcile
experiments that had shown light to have a particle nature with experiments that had
shown it to have a wave nature.
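
In modern terms the relationship described above is E = hf: the energy delivered by each
packet of light is Planck's constant times the frequency, and an electron escapes with at most
hf minus the metal's work function. A minimal sketch (the work function used is an assumed,
sodium-like value, not taken from the source):

    # Photoelectric effect: E_max = h*f - phi; electrons are ejected only if E_max > 0.
    h = 6.626e-34   # Planck's constant, J*s
    eV = 1.602e-19  # joules per electron-volt
    phi = 2.3 * eV  # assumed work function of the metal, J

    def max_kinetic_energy(frequency_hz):
        return h * frequency_hz - phi

    for f in (4.0e14, 6.0e14, 8.0e14):   # red light -> violet light
        E = max_kinetic_energy(f)
        print(f, "Hz:", "no electrons" if E <= 0 else f"{E / eV:.2f} eV")
    # Brighter light at 4e14 Hz still ejects nothing; higher frequencies eject faster electrons.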

Old quantum theory

Quantum mechanics developed from the study of electromagnetic waves through
spectroscopy, which includes visible light seen in the colors of the rainbow, but also other
waves including the more energetic waves like ultraviolet light, x-rays, and gamma rays
plus the waves with longer wavelengths including infrared waves, microwaves and radio
waves. We are not, however, speaking of sound waves, but only of those waves that
travel at the speed of light. Also, when the word "particle" is used below, it always refers
to elementary or subatomic particles.

Planck's constant

Classical physics predicted that a black-body radiator would produce infinite energy, but
that result was not observed in the laboratory. If black-body radiation was dispersed into
a spectrum, then the amount of energy radiated at various frequencies rose from zero at
one end, peaked at a frequency related to the temperature of the radiating object, and then
fell back to zero. In 1900, Max Planck developed an empirical equation that could
account for the observed energy curves, but he could not harmonize it with classical
theory. He concluded that the classical laws of physics do not apply on the atomic scale
as had been assumed.
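
Planck's result can be stated as B(f, T) = (2hf^3/c^2) / (exp(hf/kT) - 1), which rises, peaks,
and falls back to zero, whereas the classical Rayleigh-Jeans expression 2f^2·kT/c^2 grows
without bound at high frequency. A minimal sketch comparing the two (the temperature is an
assumed illustrative value):

    import math

    h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # SI constants

    def planck(f, T):
        return (2 * h * f**3 / c**2) / math.expm1(h * f / (k * T))

    def rayleigh_jeans(f, T):
        return 2 * f**2 * k * T / c**2

    T = 5000.0  # K, assumed temperature of the radiating body
    for f in (1e13, 1e14, 1e15, 1e16):
        print(f"{f:.0e} Hz: Planck {planck(f, T):.3e}  Rayleigh-Jeans {rayleigh_jeans(f, T):.3e}")
    # Planck's curve peaks and falls back toward zero; the classical curve keeps growing.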

Bohr atom

[Figure: The Bohr model of the atom, showing an electron quantum jumping to the ground
state n = 1.]

In 1897 the particle called the electron was discovered. By means of the gold foil
experiment physicists discovered that matter is, volume for volume, largely empty space.
Once that was clear, it was hypothesized that negatively charged entities called
electrons surround positively charged nuclei. So at
first all scientists believed that the atom must be like
a miniature solar system. But that simple analogy
predicted that electrons would, within about one hundredth of a microsecond,[19] crash
into the nucleus of the atom. The great question of the early 20th century was, "Why do
electrons normally maintain a stable orbit around the nucleus?"

In 1913, Niels Bohr removed this substantial problem by applying the idea of discrete
(non-continuous) quanta to the orbits of electrons. This account became known as the
Bohr model of the atom. Bohr basically theorized that electrons can only inhabit certain
orbits around the atom. These orbits could be derived by looking at the spectral lines
produced by atoms.
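
In the Bohr model the allowed orbits have energies E_n = -13.6 eV / n^2, and a jump from a
higher orbit to a lower one emits a photon whose energy is the difference; dividing hc by that
energy gives the wavelength of the spectral line. A minimal sketch reproducing the visible
(Balmer) lines of hydrogen:

    # Bohr model of hydrogen: E_n = -13.6 eV / n^2; a jump n -> m emits E_n - E_m.
    h = 6.626e-34   # J*s
    c = 2.998e8     # m/s
    eV = 1.602e-19  # J

    def bohr_energy(n):
        return -13.6 * eV / n**2

    def line_wavelength_nm(n_upper, n_lower):
        photon_energy = bohr_energy(n_upper) - bohr_energy(n_lower)
        return h * c / photon_energy * 1e9

    for n in (3, 4, 5, 6):   # Balmer series: jumps down to the n = 2 orbit
        print(f"{n} -> 2: {line_wavelength_nm(n, 2):.0f} nm")
    # ~656, 486, 434, 410 nm: the observed visible lines of hydrogen.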

Wave-particle duality

[Figure: Probability distribution of the Bohr atom.]

Niels Bohr determined that it is impossible to describe light
adequately by the sole use of either the wave analogy or of the
particle analogy. Therefore he enunciated the principle of
complementarity, which is a theory of pairs, such as the pairing
of wave and particle or the pairing of position and momentum.
Louis de Broglie worked out the mathematical consequences of
these findings. In quantum mechanics, it was found that
electromagnetic waves could react in certain experiments as
though they were particles and in other experiments as though they were waves. It was
also discovered that subatomic particles could sometimes be described as particles and
sometimes as waves. This discovery led to the theory of wave-particle duality by Louis-
Victor de Broglie in 1924, which states that subatomic entities have properties of both
waves and particles at the same time.

The Bohr atom model was enlarged upon with the discovery by de Broglie that the
electron has wave-like properties. In accord with de Broglie's conclusions, electrons can
only appear under conditions that permit a standing wave. A standing wave can be made
if a string is fixed on both ends and made to vibrate (as it would in a stringed instrument).

That illustration shows that the only standing waves that can occur are those
with zero amplitude at the two fixed ends. The waves created by a stringed instrument
appear to oscillate in place, simply changing crest for trough in an up-and-down motion.
A standing wave can only be formed when the wave's length fits the available vibrating
entity. In other words, no partial fragments of wave crests or troughs are allowed. In a
round vibrating medium, the wave must be a continuous formation of crests and troughs
all around the circle. Each electron must be its own standing wave in its own discrete
orbital.
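
De Broglie's relation gives each particle a wavelength of h divided by its momentum, and the
standing-wave picture requires a whole number of wavelengths to fit around the orbit's
circumference. A minimal sketch checking this for the first Bohr orbit (the orbital speed and
radius are standard textbook values, assumed here for illustration):

    import math

    h = 6.626e-34    # Planck's constant, J*s
    m_e = 9.109e-31  # electron mass, kg
    v = 2.19e6       # m/s, electron speed in the first Bohr orbit (assumed)
    r = 5.29e-11     # m, Bohr radius (assumed)

    wavelength = h / (m_e * v)          # de Broglie wavelength of the electron
    circumference = 2 * math.pi * r
    print(wavelength)                   # ~3.3e-10 m
    print(circumference / wavelength)   # ~1.0: exactly one wavelength fits the n = 1 orbit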

Development of modern quantum mechanics

Werner Heisenberg developed the full quantum mechanical theory in 1925 at the young
age of 23. Following his mentor, Niels Bohr, Werner Heisenberg began to work out a
theory for the quantum behavior of electron orbitals. Because electrons could not be
observed in their orbits, Heisenberg went about creating a mathematical description of
quantum mechanics built on what could be observed, that is, the light emitted from atoms
in their characteristic atomic spectra. Heisenberg studied the electron orbital on the model
of a charged ball on a spring, an oscillator, whose motion is anharmonic (not quite
regular). Heisenberg first explained this kind of observed motion in terms of the laws of
classical mechanics known to apply in the macro world, and then applied quantum
restrictions, discrete (non-continuous) properties, to the picture. Doing so causes gaps to
appear between the orbitals so that the mathematical description he formulated would
then represent only the electron orbitals predicted on the basis of the atomic spectra.

In 1925 Heisenberg published a paper (in Z. Phys. vol. 33, p. 879-893) entitled
"Quantum-mechanical re-interpretation of kinematic and mechanical relations." So ended
the old quantum theory and began the age of quantum mechanics. Heisenberg's paper
gave few details that might aid readers in determining how he actually contrived to get
his results for the one-dimensional models he used to form the hypothesis that proved so
useful. In his paper, Heisenberg proposed to "discard all hope of observing hitherto
unobservable quantities, such as the position and period of the electron," and restrict
himself strictly to actually observable quantities. He needed mathematical rules for
predicting the relations actually observed in nature, and the rules he produced worked
differently depending on the sequence in which they were applied. "It quickly became
clear that the non-commutativity (in general) of kinematical quantities in quantum theory
was the really essential new technical idea in the paper." (Aitchison, p. 5) But it was
unclear why this non-commutativity was essential. Could it have a physical
interpretation? At least the matter was made more palatable when Max Born discovered
that the Heisenberg computational scheme could be put in a more familiar form present in
elementary mathematics.
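
The non-commutativity mentioned above can be seen directly with small matrices. A minimal
sketch (numpy and natural units are assumptions for illustration): build truncated position and
momentum matrices for an oscillator from the standard ladder operators and check that xp - px
comes out as i·ħ times the identity, apart from the last diagonal entry, which is an artifact
of cutting off the infinite matrices.

    import numpy as np

    hbar, m, omega = 1.0, 1.0, 1.0   # natural units, assumed for illustration
    N = 6                            # truncate the infinite matrices at N levels

    a = np.diag(np.sqrt(np.arange(1, N)), k=1)   # lowering operator
    adag = a.T                                   # raising operator

    x = np.sqrt(hbar / (2 * m * omega)) * (a + adag)
    p = 1j * np.sqrt(m * hbar * omega / 2) * (adag - a)

    commutator = x @ p - p @ x
    print(np.round(commutator, 10))
    # i*hbar down the diagonal: the order of multiplication matters.
    # (The final entry differs only because the matrices were truncated.)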

Heisenberg approached quantum mechanics from the historical perspective that treated an
electron as an oscillating charged particle. Bohr's use of this analogy had already allowed
him to explain why the radii of the orbits of electrons could only take on certain values. It
followed from this interpretation of the experimental results available and the quantum
theory that Heisenberg subsequently created that an electron could not be at
any intermediate position between two "permitted" orbits. Therefore electrons were
described as "jumping" from orbit to orbit. The idea that an electron might now be in one
place and an instant later be in some other place without having traveled between the two
points was one of the earliest indications of the "spookiness" of quantum phenomena.
Although the scale is smaller, the "jump" from orbit to orbit is as strange and unexpected
as would be a case in which someone stepped out of a doorway in London onto the
streets of Los Angeles.

Schrödinger wave equation

Because particles could be described as waves, later in 1925 Erwin Schrödinger analyzed
what an electron would look like as a wave around the nucleus of the atom. Using this
model, he formulated his equation for particle waves. Rather than explaining the atom by
analogy to satellites in planetary orbits, he treated everything as waves whereby each
electron has its own unique wavefunction. A wavefunction is described in Schrödinger's
equation by three properties (later Paul Dirac added a fourth). The three properties were
(1) an "orbital" designation, indicating whether the particle wave is one that is closer to
the nucleus with less energy or one that is further from the nucleus with more energy, (2)
the shape of the orbital, i.e. an indication that orbitals were not just spherical but other
shapes, and (3) the magnetic moment of the orbital, which is a manifestation of force
exerted by the charge of the electron as it rotates around the nucleus.

These three properties were called collectively the wavefunction of the electron and are
said to describe the quantum state of the electron. "Quantum state" means the collective
properties of the electron describing what we can say about its condition at a given time.
For the electron, the quantum state is described by its wavefunction which is designated
in physics by the Greek letter ψ (psi, pronounced "sigh"). The three properties of
Schrödinger's equation that describe the wavefunction of the electron and therefore also
describe the quantum state of the electron as described in the previous paragraph are each
called quantum numbers.
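
The three properties correspond to the quantum numbers usually written n (energy level), l
(orbital shape) and m (magnetic orientation), with l running from 0 to n - 1 and m from -l to
+l; Dirac's fourth property is the spin. A minimal sketch enumerating the allowed combinations:

    # Allowed (n, l, m) combinations: l = 0 .. n-1, m = -l .. +l.
    def allowed_states(n):
        return [(n, l, m) for l in range(n) for m in range(-l, l + 1)]

    for n in (1, 2, 3):
        states = allowed_states(n)
        print(f"n={n}: {len(states)} orbitals", states)
    # n=1: 1 orbital, n=2: 4 orbitals, n=3: 9 orbitals (n**2 in general).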

Uncertainty Principle

In 1927, Heisenberg made a new discovery on the basis of his quantum theory, one that had
further practical consequences for this new way of looking at matter and energy on the
atomic scale. In his matrix mechanics formulation, Heisenberg had encountered an
error or difference of h/2π between position and momentum. This represented a deviation
of one radian of a cycle when the particle-like aspects of the wave were examined.
Heisenberg analyzed this difference of one radian of a cycle and divided the difference or
deviation of one radian equally between the measurement of position and momentum.
This had the consequence of being able to describe the electron as a point particle in the
center of one cycle of a wave so that its position would have a standard deviation of plus
or minus one-half of one radian of the cycle (1/2 of h-bar). A standard deviation can be
either plus or minus the measurement, i.e., it can add to or subtract from the measurement.
In three dimensions a standard deviation is a displacement in any
direction. What this means is that when a moving particle is viewed as a wave it is less
certain where the particle is. In fact, the more certain the position of a particle is known,
the less certain the momentum is known. This conclusion came to be called "Heisenberg's
Indeterminacy Principle," or Heisenberg's Uncertainty Principle. To understand the real
idea behind the uncertainty principle imagine a wave with its undulations, its crests and
troughs, moving along. A wave is also a moving stream of particles, so you have to
superimpose a stream of particles moving in a straight line along the middle of the wave.
An oscillating ball of charge creates a wave larger than its size depending upon the length
of its oscillation. Therefore, the energy of a moving particle is as large as the cycle of the
wave, but the particle itself has a location. Because the particle and the wave are the same
thing, then the particle is really located somewhere in the width of the wave. Its position
could be anywhere from the crest to the trough. The math for the uncertainty principle
says that the measurement of uncertainty as to the position of a moving particle is one-
half the width from the crest to the trough or one-half of one radian of a cycle in a wave.

For moving particles in quantum mechanics, there is simply a certain degree of exactness
and precision that is missing. You can be precise when you take a measurement of
position and you can be precise when you take a measurement of momentum, but there is
an inverse imprecision when you try to measure both at the same time as in the case of a
moving particle like the electron. In the most extreme case, absolute precision of one
variable would entail absolute imprecision regarding the other.
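
In modern notation the trade-off is usually written Δx·Δp ≥ ħ/2. A minimal numerical sketch
(the confinement length is an assumed, atom-sized value): pinning an electron down to about
the size of an atom forces a minimum spread in its momentum, and hence in its velocity.

    # Heisenberg uncertainty: delta_x * delta_p >= hbar / 2.
    hbar = 1.055e-34  # J*s
    m_e = 9.109e-31   # electron mass, kg

    delta_x = 1e-10   # m, roughly the size of an atom (assumed)
    delta_p = hbar / (2 * delta_x)   # minimum possible momentum uncertainty
    delta_v = delta_p / m_e          # corresponding velocity uncertainty

    print(delta_p)  # ~5.3e-25 kg*m/s
    print(delta_v)  # ~5.8e5 m/s: the velocity cannot be pinned down better than this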

In a voice recording from an early lecture on the uncertainty principle, pointing to a
Bohr model of the atom, Heisenberg said: "You can say, well, this orbit is really not a
complete orbit. Actually at every moment the electron has only an inactual position and an
inactual velocity and between these two inaccuracies there is an inverse correlation."

The consequences of the uncertainty principle were that the electron could no longer be
considered as occupying an exact location in its orbital. Rather, the electron had to be
described by every point it could possibly inhabit. Creating points of probable location for
the electron in its known orbital produced a cloud of points in a spherical shape for the
orbital of a hydrogen atom, with the points gradually fading out both nearer to the nucleus
and farther from it. This is called a probability distribution. The Bohr atom number n for
each orbital therefore became known as an n-sphere in the three-dimensional atom and was
pictured as a probability cloud in which the electron surrounds the atom all at once.

This led to Heisenberg's further observation that if you are not making measurements of
the electron, it cannot be described as being in one particular location but is everywhere
in the electron cloud at once. In other words, quantum mechanics cannot
give exact results, but only the probabilities for the occurrence of a variety of possible
results. Heisenberg went further and said that the path of a moving particle only comes
into existence once we observe it. However strange and counter-intuitive this assertion
may seem, quantum mechanics does still tell us the location of the electron's
orbital, its probability cloud. Heisenberg was speaking of the particle itself, not its orbital
which is in a known probability distribution.

It is important to note that although Heisenberg used infinite sets of positions for the
electron in his matrices, this does not mean that the electron could be anywhere in the
universe. Rather there are several laws that show the electron must be in one localized
probability distribution. An electron is described by its energy in Bohr's atom which was
carried over to matrix mechanics. Therefore, an electron in a certain n-sphere had to be
within a certain range from the nucleus depending upon its energy. This restricts its
location. The number of places an electron can be is also called "the number of cells
in its phase space". The uncertainty principle sets a lower limit on how finely one can chop
up classical phase space, so the number of places an electron can occupy within its orbital
becomes finite. An electron's location in an atom is therefore confined to its orbital: the
orbital, although a probability distribution, does not extend out into the entire universe,
but stops at the nucleus and before the next n-sphere orbital begins, and the points of the
distribution are finite because the Uncertainty Principle creates a lower limit.

Classical physics had shown since Newton that if you know the position of stars and
planets and details about their motions, you can predict where they will be in the
future. For subatomic particles, Heisenberg denied this notion, showing that due to the
uncertainty principle one cannot know the precise position and momentum of a particle at
a given instant, so its future motion cannot be determined, but only a range of
possibilities for the future motion of the particle can be described.

These consequences of the uncertainty principle arise only at the subatomic level and
follow from wave-particle duality. As counter-intuitive as they may seem,
quantum mechanical theory with its uncertainty principle has been responsible for major
improvements in the world's technology, from computer components to fluorescent lights
to brain-scanning techniques.

Quantum entanglement

Albert Einstein rejected Heisenberg's Uncertainty Principle insofar as it seemed to imply
more than a necessary limitation on human ability to actually know what occurs in the
quantum realm. In a letter to Max Born in 1926, Einstein claimed that God "does not play
dice."[23] Heisenberg's quantum mechanics, based on Bohr's initial explanation, became
known as the Copenhagen Interpretation of quantum mechanics. Both Bohr and Einstein
spent many years defending and attacking this interpretation. After the 1930 Solvay
conference, Einstein never again challenged the Copenhagen interpretation on technical
points, but did not cease a philosophical attack on the interpretation, on the grounds of
realism and locality. Einstein, in trying to show that it was not a complete theory,
recognized that the theory predicted that two or more particles which have interacted in
the past exhibit surprisingly strong correlations when various measurements are made on
them.[24] Einstein called this "spooky action at a distance". In 1935, Schrödinger
published a paper explaining the argument which had been denoted the EPR
paradox (Einstein-Podolsky-Rosen, 1935). Einstein argued that the Copenhagen
Interpretation predicted quantum entanglement, which he held to be incorrect because it
appeared to defy the law of physics that nothing can travel faster than the speed of light.
Quantum entanglement means that when there is a change in one particle
at a distance from another particle then the other particle automatically changes to
counter-balance the system. In quantum entanglement, the act of measuring one
entangled particle defines its properties and seems to influence the properties of its
partner or partners instantaneously, no matter how far apart they are. Because the two
particles are in an entangled state, changes to the one cause instantaneous effects on the
other. Einstein had calculated that quantum theory would predict this; he saw it as a flaw
and therefore challenged it. However, instead of exposing a weakness in quantum
mechanics, the challenge forced quantum mechanics to acknowledge that quantum entanglement
did in fact exist, and it became another foundational part of quantum mechanics. The
1935 paper is currently Einstein's most cited publication in physics journals.

Bohr's original response to Einstein was that the particles were in a system. However,
Einstein's challenge led to decades of substantial research into this quantum mechanical
phenomenon of quantum entanglement. This research, clarified by Yanhua Shih, points out
that the two entangled particles can be viewed as somehow not separate, which removes the
locality objection[25]. This means that no matter the distance between the entangled
particles, they remain in the same quantum state, so one particle is not sending
information to another particle faster than the speed of light; rather, a change to one
particle is a change to the entire system, or quantum state, of the entangled particles, and
therefore changes the state of the system without any transfer of information.
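
The perfect correlations described above can be illustrated with the simplest entangled state
of two particles, (|00> + |11>)/sqrt(2): each particle on its own gives 0 or 1 completely at
random, yet the two results always agree, because the measurement statistics belong to the
system as a whole. A minimal sketch (numpy assumed):

    import numpy as np

    rng = np.random.default_rng(0)

    # Bell state (|00> + |11>) / sqrt(2) over the basis |00>, |01>, |10>, |11>.
    state = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
    probabilities = np.abs(state) ** 2   # Born rule: 1/2 for "00", 1/2 for "11"

    outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probabilities)
    print(outcomes)
    # Each particle alone looks like a fair coin flip, but the two digits always match.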

Interpretations: the quantum micro and the Newtonian macro world

As a system becomes larger or more massive (action >> h ) the classical picture tends to
emerge, with some exceptions, such as superfluidity. The emergence of behaviour as we
scale up that matches our classical intuition is called the correspondence principle and is
based on Ehrenfest's theorem. This is why we can usually ignore quantum mechanics when
dealing with everyday objects. Even so, trying to make sense of quantum theory is an
ongoing process which has spawned a number of interpretations of quantum theory,
ranging from the conventional Copenhagen Interpretation to hidden variables and many
worlds. There seems to be no end in sight to the philosophical musings on the subject;
however the empirical or technical success of the theory is unrivalled; all modern
fundamental physical theories are quantum theories.

Consciousness causes collapse

Consciousness causes collapse is the speculative theory that observation by a conscious
observer is responsible for the wavefunction collapse. It is an attempt to solve the
Wigner's friend paradox by simply stating that collapse occurs at the first "conscious"
observer. Supporters claim this is not a revival of substance dualism, since (in a
ramification of this view) consciousness and objects are entangled and cannot be
considered as distinct. The consciousness-causes-collapse theory can be regarded as a
speculative appendage to almost any interpretation of quantum mechanics, and most
physicists reject it as unverifiable and as introducing unnecessary elements into physics.

Quantum Electrodynamics

Quantum electrodynamics (QED) is a relativistic quantum field theory of
electromagnetism. QED mathematically describes all phenomena involving electrically
charged particles interacting by means of the exchange of photons, whether the interaction is
between light and matter or between two charged particles. It has been called "the jewel
of physics" for its extremely accurate predictions of quantities like the anomalous
magnetic moment of the electron, and the Lamb shift of the energy levels of hydrogen.

Physical interpretation of QED

In classical physics, due to interference, light is observed to take the stationary path
between two points; but how does light "know" where it's going? That is, if the start and
end points are known, the path that will take the shortest time can be calculated.
However, when light is first emitted, the end point is not known, so how is it that light
always takes the quickest path? The answer is provided by QED. Light doesn't know
where it is going, and it doesn't always take the quickest path. According to QED light
does not have to — it simply goes over every possible path, and the observer (at a
particular location) simply detects the mathematical result of all wave functions added up
(as a sum of all line integrals). In fact, according to QED, light can go slower or faster
than the speed of light to get there[1].

Physically, QED describes charged particles (and their antiparticles) interacting with each
other by the exchange of photons. The magnitude of these interactions can be computed
using perturbation theory; these rather complex formulas have a remarkable pictorial
representation as Feynman diagrams [1]. QED was the theory to which Feynman
diagrams were first applied. These diagrams were invented on the basis of Lagrangian
mechanics. Using a Feynman diagram, one considers every possible path between the start
and end points. Each path is assigned a complex-valued probability amplitude, and the
actual amplitude we observe is the sum of all amplitudes over all possible paths.
Obviously, among all possible paths the ones with stationary phase contribute most (due
to lack of destructive interference with some neighboring counter-phase paths) — this
results in the stationary classical path between the two points.
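
The "sum over every possible path" can be sketched numerically: give every path a unit arrow
exp(i·k·length), add the arrows, and only the paths near the stationary (classical) one
reinforce each other. The geometry and numbers below are assumptions purely for illustration
(numpy assumed):

    import numpy as np

    # Toy sum over paths: light from A to B via a point x on a mirror (the line y = 0);
    # each path contributes a unit arrow exp(i * k * path_length).
    wavelength = 0.5                     # arbitrary units, assumed
    k = 2 * np.pi / wavelength
    A = np.array([-5.0, 3.0])            # assumed source position
    B = np.array([5.0, 3.0])             # assumed detector position

    x = np.linspace(-10, 10, 4001)       # candidate reflection points on the mirror
    lengths = np.hypot(x - A[0], A[1]) + np.hypot(B[0] - x, B[1])
    amplitudes = np.exp(1j * k * lengths)

    near = amplitudes[np.abs(x) < 1.0].sum()   # ~400 paths around the stationary point x = 0
    far = amplitudes[x > 8.0].sum()            # ~400 paths far from it
    print(abs(near), abs(far))
    # The far paths nearly cancel one another; the near-stationary paths add up, which is
    # why the detected light looks as if it took the classical shortest-time path.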

QED doesn't predict what will happen in an experiment, but it can predict the probability
of what will happen in an experiment, which is how it is experimentally verified.
Predictions of QED agree with experiments to an extremely high degree of accuracy:
currently about 10^-12 (and limited by experimental errors). This makes QED the most
accurate physical theory constructed thus far.

Near the end of his life, Richard Feynman gave a series of lectures on QED
intended for the lay public. These lectures were transcribed and published as Feynman
(1985), QED: The strange theory of light and matter, a classic nonmathematical
exposition of QED from the point of view articulated above. See also QED (book).

Much of Feynman's discussion springs from an everyday phenomenon: the way any sheet
of glass partly reflects any light shining on it. (The book's cover featured a beautiful
photograph of an iridescent soap bubble, another striking example of an interference-based
phenomenon, illustrating the addition of wave functions, a central principle of QED.)
Feynman also pays homage to Isaac Newton's struggles to come to terms with the nature
of light.

History

Quantum theory began in 1900, when Max Planck assumed that energy is quantized in
order to derive a formula predicting the observed frequency dependence of the energy
emitted by a black body. This dependence is completely at variance with classical
physics. In 1905, Einstein explained the photoelectric effect by postulating that light
energy comes in quanta called photons. In 1913, Bohr invoked quantization in his
proposed explanation of the spectral lines of the hydrogen atom. In 1924, Louis de
Broglie proposed a quantum theory of the wave-like nature of subatomic particles. The
phrase "quantum physics" was first employed in Johnston's Planck's Universe in Light of
Modern Physics. These theories, while they fit the experimental facts to some extent,
were strictly phenomenological: they provided no rigorous justification for the
quantization they employed. They are collectively known as the "old quantum theory."

Modern quantum mechanics was born in 1925 with Werner Heisenberg's matrix
mechanics and Erwin Schrödinger's wave mechanics and the Schrödinger equation.
Schrödinger subsequently showed that these two approaches were equivalent. In 1927,
Heisenberg formulated his uncertainty principle, and the Copenhagen interpretation of
quantum mechanics began to take shape. Around this time, Paul Dirac, in work
culminating in his 1930 monograph, joined quantum mechanics and special relativity,
pioneered the use of operator theory, and devised the bra-ket notation widely used since.
In 1932, John von Neumann formulated the rigorous mathematical basis for quantum
mechanics as the theory of linear operators on Hilbert spaces. This and other work from
the founding period remains valid and widely used.

Quantum chemistry began with Walter Heitler and Fritz London's 1927 quantum account
of the covalent bond of the hydrogen molecule. Linus Pauling and others contributed to
the subsequent development of quantum chemistry.

The application of quantum mechanics to fields rather than single particles, resulting in
what are known as quantum field theories, began in 1927. Early contributors included
Dirac, Wolfgang Pauli, Weisskopf, and Jordan. This line of research culminated in the
1940s in the quantum electrodynamics (QED) of Richard Feynman, Freeman Dyson,
Julian Schwinger, and Sin-Itiro Tomonaga, for which Feynman, Schwinger and
Tomonaga received the 1965 Nobel Prize in Physics. QED, a quantum
theory of electrons, positrons, and the electromagnetic field, was the first satisfactory
quantum description of a physical field and of the creation and annihilation of quantum
particles.

QED involves a covariant and gauge invariant prescription for the calculation of
observable quantities. Feynman's mathematical technique, based on his diagrams,
initially seemed very different from the field-theoretic, operator-based approach of
Schwinger and Tomonaga, but Freeman Dyson later showed that the two approaches
were equivalent. The renormalization procedure for eliminating the awkward infinite
predictions of quantum field theory was first implemented in QED. Even though
renormalization works very well in practice, Feynman was never entirely comfortable
with its mathematical validity, even referring to renormalization as a "shell game" and
"hocus pocus". (Feynman, 1985: 128)

QED has served as a role model and template for all subsequent quantum field theories.
One such subsequent theory is quantum chromodynamics, which began in the early
1960s and attained its present form in the 1975 work by H. David Politzer, David Gross
and Frank Wilczek. Building on the pioneering work of Schwinger, Peter Higgs,
Goldstone, and others, Sheldon Glashow, Steven Weinberg and Abdus Salam
independently showed how the weak nuclear force and quantum electrodynamics could
be merged into a single electroweak force.

Butterfly effect

The butterfly effect is a phrase that encapsulates the more technical notion
of sensitive dependence on initial
conditions in chaos theory. Small
variations of the initial condition of a
dynamical system may produce large
variations in the long term behavior of
the system. This is sometimes presented
as esoteric behavior, but can be
exhibited by very simple systems: for
example, a ball placed at the crest of a
hill might roll into any of several valleys
depending on slight differences in initial
position.

The phrase refers to the idea that a butterfly's wings might create tiny changes in the
atmosphere that ultimately cause a tornado to appear (or, for that matter, prevent a
tornado from appearing). The flapping wing represents a small change in the initial
condition of the system, which causes a chain of events leading to large-scale
phenomena. Had the butterfly not flapped its wings, the trajectory of the
system might have been vastly different.

Recurrence, the approximate return of a system towards its initial conditions, together
with the sensitive dependence on initial conditions, are the two main ingredients for
chaotic motion. They have the practical consequence of making complex systems, such
as the weather, difficult to predict past a certain time range—approximately a week, in
the case of weather.

Sensitive dependence on initial conditions was first described in the literature by
Hadamard and popularized by Duhem's 1906 book. The term butterfly effect is related to
the work of Edward Lorenz, who in a 1963 paper for the New York Academy of Sciences
noted that "One meteorologist remarked that if the theory were correct, one flap of a
seagull's wings could change the course of weather forever." Later speeches and papers
by Lorenz used the more poetic butterfly. According to Lorenz, upon failing to provide a
title for a talk he was to present at the 139th meeting of the AAAS in 1972, Philip
Merilees concocted Does the flap of a butterfly’s wings in Brazil set off a tornado in
Texas? as a title.

The concept of the butterfly effect is sometimes used in popular media dealing with the
idea of time travel, usually inaccurately. In the 1952 short story by Ray Bradbury, "A
Sound of Thunder", the killing of a butterfly during the time of dinosaurs causes the
future to change in subtle but meaningful ways: e.g., the spelling of English, and the
outcome of a political election. According to the actual theory, however, the mere
presence of the time travelers in the past would be enough to change short-term events
(such as the weather), and would also have an unpredictable impact on the distant future.

In an episode of The Simpsons in which Homer goes back to the time of dinosaurs with a
time machine (à la Bradbury's story), he commits intentional and unintentional violence in
the past, which drastically changes the future (i.e., Homer's present).

In many cases, minor and seemingly inconsequential actions in the past are extrapolated
over time and can have radical effects on the present time of the main characters. In the
movie The Butterfly Effect, Evan Treborn (Ashton Kutcher), when reading from his
adolescent journals, is able to essentially "redo" parts of his past. As he continues to do
this, he realizes that even though his intentions are good, the actions he takes always have
unintended consequences. However, this movie does not seriously explore the
implications of the butterfly effect; only the lives of the principal characters seem to
change from one scenario to another. The greater world around them is mostly
unaffected.

Another movie which explores the butterfly effect (though not advertised as such) is
Sliding Doors. The movie observes two parallel life paths of a woman named Helen,
played by Gwyneth Paltrow. These two paths diverge when Helen attempts to catch a
commuter train. In one life path she catches the train, and in another she is delayed for
just a few seconds and barely misses the train. This results in two
dramatically different sets of events.

The butterfly effect was also invoked by the fictional mathematician Ian Malcolm in both
the novel and film versions of Jurassic Park. He used it to explain the inherent instability of
(among other things) an amusement park with dinosaurs as the attraction, although this
interpretation can also be taken to mean that zoo animals will always escape and kill their
captors.

Fractal

The word "fractal" has two related meanings. In colloquial usage, it denotes a shape that
is recursively constructed or self-similar, that is, a shape that appears similar at all scales
of magnification and is therefore often referred to as "infinitely complex." In mathematics
a fractal is a geometric object that satisfies a specific technical condition, namely having
a Hausdorff dimension greater than its topological dimension. The term fractal was
coined in 1975 by Benoît Mandelbrot, from the Latin fractus, meaning "broken" or
"fractured."

History

A Koch snowflake is the limit of an infinite construction
that starts with a triangle and recursively replaces each line
segment with a series of four line segments that form a
triangular "bump". Each time new triangles are added (an
iteration), the perimeter of this shape grows by a factor of
4/3 and thus diverges to infinity with the number of
iterations. The length of the Koch snowflake's boundary is
therefore infinite, while its area remains finite. For this
reason, the Koch snowflake and similar constructions were
sometimes called "monster curves."
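By way of illustration, the short Python sketch below (our own, not part of the source text; the function name koch_perimeter is invented for this example) carries out the bookkeeping just described: each iteration multiplies the number of segments by 4 and divides the segment length by 3, so the perimeter grows by a factor of 4/3 per step while the enclosed area stays bounded.

# Sketch: perimeter growth of the Koch snowflake, assuming a starting
# triangle with unit side length. Illustrative only.

def koch_perimeter(iterations: int, side: float = 1.0) -> float:
    """Perimeter after a given number of 'bump' iterations."""
    segments = 3          # the initial triangle has 3 sides
    length = side         # each side starts with the full side length
    for _ in range(iterations):
        segments *= 4     # every segment is replaced by 4 shorter ones
        length /= 3       # each new segment is one third as long
    return segments * length

for n in range(6):
    print(n, koch_perimeter(n))   # grows by a factor of 4/3 per iteration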

Objects that are now called fractals were discovered and explored long before the word
was coined. Ethnomathematical work such as Ron Eglash's African Fractals (ISBN 0-8135-2613-2)
documents pervasive fractal geometry in indigenous African craft. In 1525, the German
artist Albrecht Dürer published The Painter's Manual, in which one section is on "Tile
Patterns formed by Pentagons." Dürer's pentagon construction largely resembled the Sierpinski
carpet, but was based on pentagons instead of squares.

The idea of "recursive self-similarity" was originally developed by the philosopher
Leibniz, who even worked out many of the details. In 1872, Karl Weierstrass found an
example of a function with the non-intuitive property that it is everywhere continuous but
nowhere differentiable — the graph of this function would now be called a fractal. In
1904, Helge von Koch, dissatisfied with Weierstrass's very abstract and analytic
definition, gave a more geometric definition of a similar function, which is now called the
Koch snowflake. The idea of self-similar curves was taken further by Paul Pierre Lévy

E:\IMUNE\Biofeedback Exams\Support materials\Quantum Physics Basics.doc


Page 21 of 35

who, in his 1938 paper Plane or Space Curves and Surfaces Consisting of
Parts Similar to the Whole, described a new fractal curve, the Lévy C curve.

Georg Cantor gave examples of subsets of the real line with unusual properties — these
Cantor sets are also now recognised as fractals. Iterated functions in the complex plane
had been investigated in the late 19th and early 20th centuries by Henri Poincaré, Felix
Klein, Pierre Fatou, and Gaston Julia. However, without the aid of modern computer
graphics, they lacked the means to visualize the beauty of many of the objects that they
had discovered.

In the 1960s, Benoît Mandelbrot started investigating self-similarity in papers such as
How Long Is the Coast of Britain? Statistical Self-Similarity and Fractional Dimension.
This built on earlier work by Lewis Fry Richardson. In 1975, Mandelbrot coined the
word fractal to denote an object whose Hausdorff-Besicovitch dimension is greater than
its topological dimension. (Please refer to the articles on these terms for precise
definitions.) He illustrated this mathematical definition with striking computer-
constructed visualizations. These images captured the popular imagination; many of them
were based on recursion, leading to the popular meaning of the term "fractal".

Examples

A Julia set, a fractal related to the Mandelbrot set.

A relatively simple class of examples is the Cantor sets, in which short and then shorter
(open) intervals are struck out of the unit interval [0, 1], leaving a set that might (or might
not) actually be self-similar under enlargement, and might (or might not) have dimension
d that has 0 < d < 1. A simple recipe, such as excluding the digit 7 from decimal
representations, is self-similar under 10-fold enlargement, and also has dimension log
9/log 10 (this value is the same, no matter what logarithmic base is chosen), showing the
connection of the two concepts.

Additional examples of fractals include the Lyapunov fractal, Sierpinski triangle and
carpet, Menger sponge, dragon curve, space-filling curve, limit sets of Kleinian groups,
and the Koch curve. Fractals can be deterministic or stochastic (i.e. non-deterministic).

Chaotic dynamical systems are sometimes associated with fractals. Objects in the phase
space of a dynamical system can be fractals (see Attractor). Objects in the parameter
space for a family of systems may be fractal as well. An interesting example is the
Mandelbrot set. This set contains whole discs, so it has a dimension of 2 and
is not technically fractal—but what is truly surprising is that the boundary of the
Mandelbrot set also has a Hausdorff dimension of 2. (M. Shishikura proved that in 1991.)

The fractional dimension of the boundary of the Koch snowflake

The following analysis of the Koch Snowflake suggests how self-similarity can be used
to analyze fractal properties.

The total length of a number, N, of small steps of length L is the product NL. Applied to the
boundary of the Koch snowflake this gives a boundless length as L approaches zero. But
this distinction is not satisfactory, as different Koch snowflakes do have different sizes. A
solution is to measure, not in meters, m, nor in square meters, m², but in some other power
of a meter, m^x. Now 4N(L/3)^x = NL^x, because a three times shorter step length requires
four times as many steps, as is seen from the construction. Solving that equation gives
x = (log 4)/(log 3) ≈ 1.26186. So the unit of measurement of the boundary of the Koch
snowflake is approximately m^1.26186.
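As a quick numerical cross-check of the result above (an illustrative Python sketch of our own, not taken from the source), the similarity dimension of a shape assembled from N copies of itself, each scaled down by a factor s, is log N / log s:

import math

def similarity_dimension(copies: int, scale_factor: float) -> float:
    """Dimension of a self-similar set made of `copies` pieces,
    each scaled down by `scale_factor`."""
    return math.log(copies) / math.log(scale_factor)

print(similarity_dimension(4, 3))    # Koch boundary: ~1.26186
print(similarity_dimension(9, 10))   # decimals with the digit 7 excluded: ~0.9542
print(similarity_dimension(3, 2))    # Sierpinski triangle: ~1.585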


Generating fractals

Even 2000 times magnification of the Mandelbrot set uncovers fine detail resembling the full set.

Three common techniques for generating fractals are:

Iterated function systems — These have a fixed geometric replacement rule. The Cantor set,
Sierpinski carpet, Sierpinski gasket, Peano curve, Koch snowflake, Harter-Heighway
dragon curve, T-Square, and Menger sponge are some examples of such fractals (a short
illustrative sketch follows this list).


Escape-time fractals — Fractals defined by a recurrence relation at each point in a space
(such as the complex plane). Examples of this type are the Mandelbrot set, the Burning
Ship fractal and the Lyapunov fractal.

Random fractals — Generated by stochastic rather than deterministic processes, for
example, fractal landscapes, Lévy flight and the Brownian tree. The latter yields so-called
mass- or dendritic fractals, for example, diffusion-limited aggregation or reaction-limited
aggregation clusters.
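As a concrete illustration of the first two techniques, here is a minimal Python sketch of our own (the function names chaos_game and mandelbrot_escape are invented for this example): the first plays the "chaos game" form of an iterated function system to approximate the Sierpinski gasket, and the second applies the escape-time recurrence z → z² + c used for the Mandelbrot set.

import random

def chaos_game(n_points: int = 10000):
    """Approximate the Sierpinski gasket with an iterated function system:
    repeatedly jump halfway towards a randomly chosen corner of a triangle."""
    corners = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.866)]
    x, y = 0.25, 0.25                      # arbitrary starting point
    points = []
    for _ in range(n_points):
        cx, cy = random.choice(corners)
        x, y = (x + cx) / 2, (y + cy) / 2  # the fixed geometric replacement rule
        points.append((x, y))
    return points

def mandelbrot_escape(c: complex, max_iter: int = 100) -> int:
    """Escape-time rule: iterate z -> z*z + c and report how many steps
    it takes |z| to exceed 2 (max_iter means 'did not escape')."""
    z = 0j
    for step in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return step
    return max_iter

print(len(chaos_game()))                 # 10000 points clustering on the gasket
print(mandelbrot_escape(-0.75 + 0.1j))   # a point near the Mandelbrot boundary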

Classification of fractals

Fractals can also be classified according to their self-similarity. There are three types of
self-similarity found in fractals:

Exact self-similarity — This is the strongest type of self-similarity; the fractal appears
identical at different scales. Fractals defined by iterated function systems often display
exact self-similarity.
Quasi-self-similarity — This is a loose form of self-similarity; the fractal appears
approximately (but not exactly) identical at different scales. Quasi-self-similar fractals
contain small copies of the entire fractal in distorted and degenerate forms. Fractals
defined by recurrence relations are usually quasi-self-similar but not exactly self-similar.
Statistical self-similarity — This is the weakest type of self-similarity; the fractal has
numerical or statistical measures which are preserved across scales. Most reasonable
definitions of “fractal” trivially imply some form of statistical self-similarity. (Fractal
dimension itself is a numerical measure which is preserved across scales.) Random
fractals are examples of fractals which are statistically self-similar, but neither exactly
nor quasi-self-similar.
It should be noted that not all self-similar objects are fractals — e.g., the real line (a
straight Euclidean line) is exactly self-similar, but since its Hausdorff dimension and
topological dimension are equal, it does not qualify as a fractal.

Fractals in nature

A fractal fern computed using an Iterated function system

Approximate fractals are easily found in nature. These objects
display self-similar structure over an extended, but finite, scale
range. Examples include clouds, snowflakes, mountains, river
networks, and systems of blood vessels.

Trees and ferns are fractal in nature and can be modeled on a
computer using a recursive algorithm. This recursive nature is clear
in these examples — a branch from a tree or a frond from a fern is a
miniature replica of the whole: not identical, but similar in nature.
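A recursive tree of this kind can be sketched in a few lines of Python. The code below is our own illustration (the branching angle and the 0.7 shrink factor are arbitrary choices, not values from the source): each call records one branch and then calls itself twice with a shorter branch, so every sub-branch is a reduced copy of the whole.

import math

def branch(x, y, angle, length, depth, segments):
    """Recursively collect the line segments of a simple fractal tree."""
    if depth == 0 or length < 0.01:
        return
    x2 = x + length * math.cos(angle)
    y2 = y + length * math.sin(angle)
    segments.append(((x, y), (x2, y2)))           # this branch
    # Two smaller branches, rotated left and right; each is a reduced
    # copy of the whole tree, which is what gives the self-similarity.
    branch(x2, y2, angle + 0.45, length * 0.7, depth - 1, segments)
    branch(x2, y2, angle - 0.45, length * 0.7, depth - 1, segments)

segments = []
branch(0.0, 0.0, math.pi / 2, 1.0, 8, segments)   # trunk pointing straight up
print(len(segments))                              # 2**8 - 1 = 255 segments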


The surface of a mountain can be modeled on a computer using a fractal:
Start with a triangle in 3D space and connect the central points of each side by line
segments, resulting in 4 triangles. The central points are then randomly moved up or
down, within a defined range. The procedure is repeated, decreasing at each iteration the
range by half. The recursive nature of the algorithm guarantees that the whole is
statistically similar to each detail.
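The same recipe is easy to sketch in one dimension, producing a jagged ridge line rather than a full surface. The Python below is our own simplified illustration of the algorithm just described, not code from the source: midpoints are displaced by a random amount, and the allowed range is halved at each iteration.

import random

def midpoint_displacement(iterations: int = 8, roughness: float = 1.0):
    """1-D midpoint displacement: heights of a fractal 'ridge line'."""
    heights = [0.0, 0.0]            # two end points of the initial segment
    spread = roughness
    for _ in range(iterations):
        new_heights = []
        for left, right in zip(heights, heights[1:]):
            mid = (left + right) / 2 + random.uniform(-spread, spread)
            new_heights.extend([left, mid])
        new_heights.append(heights[-1])
        heights = new_heights
        spread /= 2                 # halve the random range each iteration
    return heights

profile = midpoint_displacement()
print(len(profile))                 # 2**8 + 1 = 257 sample points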

A fractal is formed when pulling apart two glue-covered acrylic sheets.
High voltage breakdown within a 4″ block of acrylic creates a fractal Lichtenberg figure.
Fractal branching occurs on a microwave-irradiated DVD.
Romanesco broccoli showing very fine natural fractals.

Chaos theory

In mathematics and physics, chaos theory describes the behavior of certain nonlinear
dynamical systems that under certain conditions exhibit a phenomenon known as chaos.
Among the characteristics of chaotic systems, described below, is sensitivity to initial
conditions (popularly referred to as the butterfly effect). As a result of this sensitivity, the
behavior of systems that exhibit chaos appears to be random, even though the system is
deterministic in the sense that it is well defined and contains no random parameters.
Examples of such systems include the atmosphere, the solar system, plate tectonics,
turbulent fluids, economics, and population growth.

Systems that exhibit mathematical chaos are deterministic and thus orderly in some
sense; this technical use of the word chaos is at odds with common parlance, which
suggests complete disorder. (See the article on mythological chaos for a discussion of the
origin of the word in mythology, and other uses.) A related field of physics called
quantum chaos theory studies non-deterministic systems that follow the laws of quantum
mechanics.

For a dynamical system to be classified as chaotic, most scientists will agree that it must
have the following properties:

• it must be sensitive to initial conditions,
• it must be topologically mixing, and
• its periodic orbits must be dense.


Sensitivity to initial conditions means that each point in such a system is arbitrarily
closely approximated by other points with significantly different future trajectories. Thus,
an arbitrarily small perturbation of the current trajectory may lead to significantly
different future behavior.

Sensitivity to initial conditions is popularly known as the "butterfly effect", suggesting
that the flapping of a butterfly's wings might create tiny changes in the atmosphere,
which could over time cause a tornado to occur. The flapping wing represents a small
change in the initial condition of the system, which causes a chain of events leading to
large-scale phenomena. Had the butterfly not flapped its wings, the trajectory of the
system might have been vastly different.

History

The first discoverer of chaos can plausibly be argued to be Jacques Hadamard, who in
1898 published an influential study of the chaotic motion of a free particle gliding
frictionlessly on a surface of constant negative curvature. In the system studied,
Hadamard's billiards, Hadamard was able to show that all trajectories are unstable, in that
all particle trajectories diverge exponentially from one another, with positive Lyapunov
exponent. In the early 1900s, Henri Poincaré, while studying the three-body problem,
found that there can be orbits which are nonperiodic, and yet not forever increasing nor
approaching a fixed point. Much of the early theory was developed almost entirely by
mathematicians, under the name of ergodic theory. Later studies, also on the topic of
nonlinear differential equations, were carried out by G.D. Birkhoff, A.N. Kolmogorov,
M.L. Cartwright, J.E. Littlewood, and Stephen Smale. Except for Smale, these studies
were all directly inspired by physics: the three-body problem in the case of Birkhoff,
turbulence and astronomical problems in the case of Kolmogorov, and radio engineering
in the case of Cartwright and Littlewood. Although chaotic planetary motion had not
been observed, experimentalists had encountered turbulence in fluid motion and
nonperiodic oscillation in radio circuits without the benefit of a theory to explain what
they were seeing.

Chaos theory progressed more rapidly after mid-century, when it first became evident to
some scientists that linear theory, the prevailing system theory at that time, simply could
not explain the observed behavior of certain experiments like that of the logistic map.
The main catalyst for the development of chaos theory was the electronic computer.
Much of the mathematics of chaos theory involves the repeated iteration of simple
mathematical formulas, which would be impractical to do by hand. Electronic computers
made these repeated calculations practical. One of the earliest electronic digital
computers, ENIAC, was used to run simple weather forecasting models.

An early pioneer of the theory was Edward Lorenz whose interest in chaos came about
accidentally through his work on weather prediction in 1961. Lorenz was using a basic
computer, a Royal McBee LGP-30, to run his weather simulation. He wanted to see a
sequence of data again and to save time he started the simulation in the middle of its
course. He was able to do this by entering a printout of the data
corresponding to conditions in the middle of his simulation which he had calculated last
time.

To his surprise the weather that the machine began to predict was completely different
from the weather calculated before. Lorenz tracked this down to the computer printout.
The printout rounded variables off to a 3-digit number, but the computer worked with 6-
digit numbers. This difference is tiny and the consensus at the time would have been that
it should have had practically no effect. However Lorenz had discovered that small
changes in initial conditions produced large changes in the long-term outcome.
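Lorenz's rounding accident is easy to reproduce with any chaotic map. The Python sketch below is our own illustration using the logistic map mentioned above (not Lorenz's actual weather equations, and the starting value is arbitrary): two runs that differ only in the third decimal place of the initial condition soon disagree completely.

def logistic(x: float, r: float = 4.0) -> float:
    """One step of the logistic map, a standard chaotic toy model."""
    return r * x * (1 - x)

exact = 0.506127            # 'full precision' initial condition
rounded = round(exact, 3)   # the 3-digit value a printout might show

a, b = exact, rounded
for step in range(1, 41):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(step, round(a, 4), round(b, 4))  # the trajectories drift apart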

Yoshisuke Ueda independently identified a chaotic phenomenon as such by using an
analog computer on November 27, 1961. The chaos exhibited by an analog computer is
truly a natural phenomenon, in contrast with those discovered by a digital computer.
Ueda's supervising professor, Hayashi, did not believe in chaos throughout his life, and
thus he prohibited Ueda from publishing his findings until 1970.

The term chaos as used in mathematics was coined by the applied mathematician James
A. Yorke.

The availability of cheaper, more powerful computers broadens the applicability of chaos
theory. Currently, chaos theory continues to be a very active area of research.

Chaos theory is applied in many scientific disciplines: mathematics, biology, computer
science, economics, engineering, philosophy, physics, politics, population dynamics,
psychology, robotics, etc.

Bifurcation theory

In mathematics, specifically in the study of dynamical systems, a bifurcation occurs
when a small smooth change made to the parameter values (the bifurcation parameters)
of a system causes a sudden 'qualitative' or topological change in the system's long-term
dynamical behaviour. Bifurcations can occur in continuous systems (described by ODEs,
DDEs or PDEs), and discrete systems (described by maps).
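A standard illustration, sketched below in Python (our own example, not from the source), is the logistic map x → r·x·(1 − x): sweeping the bifurcation parameter r and recording the values the orbit settles onto shows the qualitative change from a single fixed point to a 2-cycle, a 4-cycle, and eventually chaos.

def long_term_values(r: float, x0: float = 0.5,
                     transient: int = 500, keep: int = 8):
    """Iterate the logistic map, discard the transient, and return the few
    values the orbit settles onto for this parameter r."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    values = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        values.add(round(x, 4))
    return sorted(values)

for r in (2.8, 3.2, 3.5, 3.9):
    print(r, long_term_values(r))   # 1, 2, 4, then many distinct values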


Extended Consciousness

Bell's theorem

Bell's theorem is the most famous legacy of the late John Bell. It is notable for showing
that the predictions of quantum mechanics (QM) differ from those of intuition. It is
simple and elegant, and touches upon fundamental philosophical issues that relate to
modern physics. In its simplest form, Bell's theorem states:

No physical theory of local hidden variables can ever reproduce all of the
predictions of quantum mechanics.

This theorem has even been called "the most profound in science" (Stapp, 1975). Bell's
seminal 1964 paper was entitled "On the Einstein Podolsky Rosen paradox". The Einstein
Podolsky Rosen paradox (EPR paradox) assumes local realism, the intuitive notion that
particle attributes have definite values independent of the act of observation and that
physical effects have a finite propagation speed. Bell showed that local realism leads to a
requirement for certain types of phenomena that are not present in quantum mechanics.
This requirement is called Bell's inequality.

Different authors subsequently derived similar inequalities, collectively termed Bell
inequalities, that also assume local realism. That is, they assume that each quantum-level
object has a well defined state that accounts for all its measurable properties and that
distant objects do not exchange information faster than the speed of light. These well
defined properties are often called hidden variables, the properties that Einstein posited
when he stated his famous objection to quantum mechanics: "[God] does not play dice."

The inequalities concern measurements made by observers (often called Alice and Bob)
on entangled pairs of particles that have interacted and then separated. Hidden variable
assumptions limit the correlation of subsequent measurements of the particles. Bell
discovered that under quantum mechanics this correlation limit may be violated.
Quantum mechanics lacks local hidden variables associated with individual particles, and
so the inequalities do not apply to it. Instead, it predicts correlation due to quantum
entanglement of the particles, allowing their state to be well defined only after a
measurement is made on either particle. That restriction agrees with the Heisenberg
uncertainty principle, one of the most fundamental concepts in quantum mechanics.

Per Bell's theorem, either quantum mechanics or local realism is wrong. Experiments
were needed to determine which is correct, but it took many years and many
improvements in technology to perform them.

Bell test experiments to date overwhelmingly show that the inequalities of Bell's theorem
are violated. This provides empirical evidence against local realism and demonstrates that
some of the "spooky action at a distance" suggested by the famous Einstein Podolsky
Rosen (EPR) thought experiment does in fact occur. They are also taken as positive
evidence in favor of QM. The principle of special relativity is saved by the no-
communication theorem, which proves that the observers cannot use the
inequality violations to communicate information to each other faster than the speed of
light.

John Bell's papers examined both John von Neumann's 1932 proof of the incompatibility
of hidden variables with QM and Albert Einstein and his colleagues' seminal 1935 paper
on the subject.

Importance of the theorem

After EPR, quantum mechanics was left in the unsatisfactory position that it was either
incomplete in the sense that it failed to account for some elements of physical reality, or
it violated the principle of finite propagation speed of physical effects. In the EPR
thought experiment, two observers, now commonly referred to as Alice and Bob, perform
independent measurements of spin on a pair of electrons, prepared at a source in a special
state called a spin singlet state. It was a conclusion of EPR that once Alice measured spin
in one direction (e.g. on the x axis), Bob's measurement in that direction was determined
with certainty, whereas immediately before Alice's measurement, Bob's outcome was
only statistically determined. Thus, either the spin in each direction is not an element of
physical reality or the effects travel from Alice to Bob instantly.

In QM, predictions were formulated in terms of probabilities, for example, the probability
that an electron might be detected in a particular region of space, or the probability that it
would have spin up or down. However, there still remained the idea that the electron had
a definite position and spin, and that QM's failing was its inability to predict those values
precisely. The possibility remained that some yet unknown, but more powerful theory,
such as a hidden variable theory, might be able to predict these quantities exactly, while
at the same time also being in complete agreement with the probabilistic answers given
by QM. If a hidden variables theory were correct, the hidden variables were not described
by QM and thus QM would be an incomplete theory.

The desire for a local realist theory was based on two ideas: first, that objects have a
definite state that determines the values of all other measurable properties such as
position and momentum and second, that (as a result of special relativity) effects of local
actions such as measurements cannot travel faster than the speed of light. In the
formalization of local realism used by Bell, the predictions of a theory result from the
application of classical probability theory to an underlying parameter space. By a simple
(but clever) argument based on classical probability he then showed that correlations
between measurements are bounded in a way that is violated by QM.

Bell's theorem seemed to seal the fate of those who had local realist hopes for QM.

Bell's thought experiment

Bell considered a setup in which two observers, Alice and Bob, perform independent
measurements on a system S prepared in some fixed state. Each observer has a detector
with which to make measurements. On each trial, Alice and Bob can
independently choose between various detector settings. Alice can choose a detector
setting a to obtain a measurement A(a) and Bob can choose a detector setting b to
measure B(b). After repeated trials Alice and Bob collect statistics on their measurements
and correlate the results.

There are two key assumptions in Bell's analysis: (1) each measurement reveals an
objective physical property of the system, and (2) a measurement taken by one observer
has no effect on the measurement taken by the other.

In the language of probability theory, repeated measurements of system properties can be
regarded as repeated sampling of random variables. One might expect measurements by
Alice and Bob to be somehow correlated with each other: the random variables are
assumed not to be independent, but linked in some way. Nonetheless, there is a limit to
the amount of correlation one might expect to see. This is what the Bell inequality
expresses.
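One compact way to see this limit and its quantum violation is the CHSH form of the Bell inequality. The Python sketch below is our own illustration, not the formalism of the text above: for singlet-state correlations quantum mechanics predicts E(a, b) = −cos(a − b), and a standard choice of detector angles gives |S| ≈ 2.83, above the local-realist bound of 2.

import math

def E(a: float, b: float) -> float:
    """Quantum-mechanical correlation for one pair of analyser angles
    (singlet-state prediction)."""
    return -math.cos(a - b)

def chsh(a, a2, b, b2) -> float:
    """CHSH combination; local hidden-variable theories bound |S| <= 2."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# A standard set of angles that maximises the quantum violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4
print(abs(chsh(a, a2, b, b2)))   # 2*sqrt(2) ~ 2.828, above the classical bound of 2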

Notable quotes

Heinz Pagels, in The Cosmic Code, writes:

Some recent popularizers of Bell's work when confronted with [Bell's inequality]
have gone on to claim that telepathy is verified or the mystical notion that all parts
of the universe are instantaneously interconnected is vindicated. Others assert that
this implies communication faster than the speed of light. That is rubbish; the
quantum theory and Bell's inequality imply nothing of this kind. Individuals who
make such claims have substituted a wish-fulfilling fantasy for understanding. If
we closely examine Bell's experiment we will see a bit of sleight of hand by the
God that plays dice which rules out actual nonlocal influences. Just as we think
we have captured a really weird beast--like acausal influences--it slips out of our
grasp. The slippery property of quantum reality is again manifested.

Bell test experiments

Main article: Bell test experiments.

Bell's inequalities are tested by "coincidence counts" from a Bell test experiment such as
the optical one shown in the diagram. Pairs of particles are emitted as a result of a
quantum process, analysed with respect to some key property such as polarisation
direction, then detected. The settings (orientations) of the analysers are selected by the
experimenter.

Bell test experiments to date overwhelmingly suggest that Bell's inequality is violated.
Indeed, a table of Bell test experiments performed prior to 1986 is given in section 4.5 of
(Redhead, 1987). Of the thirteen experiments listed, only two reached results
contradictory to quantum mechanics; moreover, according to the same
source, when the experiments were repeated, "the discrepancies with QM could not be
reproduced".

Scheme of a "two-channel" Bell test


The source S produces pairs of "photons", sent in opposite directions. Each photon
encounters a two-channel polariser whose orientation (a or b) can be set by the
experimenter. Emerging signals from each channel are detected and coincidences of four
types (++, --, +- and -+) counted by the coincidence monitor.
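For a given pair of analyser settings, the correlation that enters the Bell inequality is estimated from these four coincidence counts as E = (N++ + N−− − N+− − N−+) / N_total. The Python below is our own minimal sketch of that bookkeeping; the example counts are invented for illustration.

def correlation(n_pp: int, n_mm: int, n_pm: int, n_mp: int) -> float:
    """Estimate E(a, b) from the four coincidence counts of a
    two-channel Bell test (++, --, +-, -+)."""
    total = n_pp + n_mm + n_pm + n_mp
    return (n_pp + n_mm - n_pm - n_mp) / total

# Invented counts for one choice of analyser orientations (a, b):
print(correlation(n_pp=420, n_mm=430, n_pm=75, n_mp=75))   # ~ 0.70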

Nevertheless, the issue is not conclusively settled. According to Shimony's 2004 Stanford
Encyclopedia overview article

"Most of the dozens of experiments performed so far have favored Quantum


Mechanics, but not decisively because of the 'detection loopholes' or the
'communication loophole.' The latter has been nearly decisively blocked by a
recent experiment and there is a good prospect for blocking the former."

Implications of violation of Bell's inequality

The phenomenon of quantum entanglement that is implied by violation of Bell's
inequality is just one element of quantum physics which cannot be represented by any
classical picture of physics; other non-classical elements are complementarity and
wavefunction collapse. The problem of interpretation of quantum mechanics is intended
to provide a satisfactory picture of these non-classical elements of quantum physics.

Some advocates of the hidden variables idea prefer to accept the opinion that experiments
have ruled out local hidden variables. They are ready to give up locality (and probably
also causality), explaining the violation of Bell's inequality by means of a "non-local"
hidden variable theory, in which the particles exchange information about their states.
This is the basis of the Bohm interpretation of quantum mechanics. It is, however,
considered by most to be unconvincing, requiring, for example, that all
particles in the universe be able to instantaneously exchange information with all others.

Finally, one subtle assumption of the Bell inequalities is counterfactual definiteness. The
derivation refers to several objective properties that cannot all be measured for any given
particle, since the act of taking the measurement changes the state. Under local realism
the difficulty is readily overcome, so long as we can assume that the source is stable,
producing the same statistical distribution of states for all the subexperiments. If this
assumption is felt to be unjustifiable, though, one can argue that Bell's inequality is
unproven. In the Everett many-worlds interpretation, the assumption of counterfactual
definiteness is abandoned: this interpretation assumes that the universe branches into
many different observers, each of whom records a different outcome. Hence many
worlds can adhere to both the properties of philosophical realism and the principle of
locality and not violate Bell's conditions; it is the only interpretation that can do this.

Locality (Local Universe)

This article is about the principle of locality in physics.

In physics, the principle of locality is that distant objects cannot have direct influence on
one another: an object is influenced directly only by its immediate surroundings. This
was stated as follows by Albert Einstein in his article "Quantum Mechanics and Reality"
("Quanten-Mechanik und Wirklichkeit", Dialectica 2:320-324, 1948):

“The following idea characterises the relative independence of objects far apart in space
(A and B): external influence on A has no direct influence on B; this is known as the
Principle of Local Action, which is used consistently only in field theory. If this axiom
were to be completely abolished, the idea of the existence of quasienclosed systems, and
thereby the postulation of laws which can be checked empirically in the accepted sense,
would become impossible.”

Local realism is the combination of the principle of locality with the "realistic"
assumption that all objects must objectively have their properties already before these
properties are observed. Einstein liked to say that the Moon is "out there" even when no
one is observing it.

Local realism is a significant feature of classical mechanics, general relativity and
Maxwell's theory, but quantum mechanics largely rejects this principle due to the presence
of distant quantum entanglements, most clearly demonstrated by the EPR paradox and
quantified by Bell's inequalities. Every theory that, like quantum mechanics, is
compatible with violations of Bell's inequalities must abandon either local realism or
counterfactual definiteness. (The vast majority of physicists believe that experiments
have demonstrated Bell's violations, but some local realists dispute the claim, in view of
the recognised loopholes in the tests.) Different interpretations of quantum mechanics
reject different parts of local realism and/or counterfactual definiteness.


Nonlocality

A physical theory is said to exhibit strict nonlocality if in that theory it is not possible to
treat widely separated systems as independent. The term is most often reserved, however,
for interactions supposed to occur outside the past light cone. Nonlocality does not
necessarily imply a lack of causality. For instance, Newtonian gravitation is nonlocal
because it involves instantaneous action-at-a-distance, but Newtonian mechanics is
certainly causal. Effects that appear nonlocal in quantum mechanics, some physicists
say, actually obey locality; in these cases, the nonlocal interaction affects correlations that
are considered within the Copenhagen interpretation of quantum mechanics to pertain to
states of matter that result from wave collapse upon measurement of unreal states
composed of the sum of mutually exclusive possibilities, e.g., the singlet state. Einstein
criticised this interpretation of quantum mechanics on the grounds that these effects
employed "spooky instantaneous action at a distance". This issue is very closely related
to Bell's theorem and the EPR paradox. Quantum field theory, on the other hand, which is
the relativistic generalization of quantum mechanics, contains mathematical features that
assure locality, so that nonrelativistic quantum mechanics should be local as well; hence
the puzzle posed by the EPR paradox.

In most of the conventional interpretations, such as the version of the Copenhagen
interpretation and the interpretation based on Consistent Histories, where the
wavefunction is not assumed to have a direct physical interpretation or reality, it is
realism that is rejected. The actual definite properties of a physical system "do not exist"
prior to the measurement and the wavefunction has a restricted interpretation as nothing
more than a mathematical tool used to calculate the probabilities of experimental
outcomes, in agreement with positivism in philosophy as the only topic that science
should discuss.

In the version of the Copenhagen interpretation where the wavefunction is assumed to
have a physical interpretation or reality (the nature of which is unspecified), the principle
of locality is violated during the measurement process via wavefunction collapse. This is
a non-local process because Born's Rule, when applied to the system's wave function,
yields a probability density for all regions of space and time. Upon measurement of the
physical system, the probability density vanishes everywhere instantaneously, except
where (and when) the measured entity is found to exist. This "vanishing" would be a real
physical process, and clearly non-local (faster-than-lightspeed), if the wave function is
considered physically real and the probability density converged to zero at arbitrarily far
distances during the finite time required for the measurement process.

The Bohm interpretation preserves realism, and it therefore needs to violate the
principle of locality to achieve the required correlations.


In the many-worlds interpretation, realism and locality are retained but
counterfactual definiteness is rejected by the extension of the notion of reality to allow
the existence of parallel universes.

Because the differences between the different interpretations are mostly philosophical
ones (except for the Bohm and many-worlds interpretations), physicists usually use
language in which the important statements are independent of the interpretation we
choose. In this framework, only measurable action at a distance - a superluminal
propagation of real, physical information - would usually be considered a violation
of locality by physicists. Such phenomena have never been seen, and they are not
predicted by the current theories (with the possible exception of the Bohm theory).

Locality is one of the axioms of relativistic quantum field theory, as required for
causality. The formalization of locality in this case is as follows: if we have two
observables, each localized within two distinct spacetime regions which happen to be at a
spacelike separation from each other, the observables must commute. This interpretation
of the word "locality" is closely related to the relativistic version of causality in physics.
In physics a solution is local if the underlying equations are either Lorentz invariant or,
more generally, generally covariant or locally Lorentz invariant.
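Written out as a minimal formula (our own notation, assuming two local observables A(x) and B(y) in the sense just described), this commutation (microcausality) condition reads:

$$[\,\hat{A}(x),\, \hat{B}(y)\,] = 0 \quad \text{whenever the points } x \text{ and } y \text{ are at spacelike separation,}$$

i.e. the order in which the two measurements are applied cannot matter when no signal travelling at or below the speed of light could connect them.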

Tao

Tao or Dao refers to a Chinese character that was of pivotal meaning in ancient Chinese
philosophy and religion. In its most generic sense it refers to the "head path," and is
generally translated into English as "The Way".

The semantics of the character vary widely depending on the context, and may variously refer to a
concept of religion, morality, duty, knowledge, rationality, ultimate truth, path, or taste.
The CEDICT allows several different definition words for it, as it varies in translation:

direction, way, method, road, path, principle, truth, reason, skill, method, Tao (of
Taoism), a measure word, to say, to speak, and to talk.

Tao is central to Taoism, but Confucianism also uses it to refer to "The Way," or the
"noble way" of personal conduct in life. The philosophic and religious use of the
character can be analyzed in two main segments: one meaning is "doctrine" or
"discourse"; every school owns and defends a specific Tao or discourse about doctrine. In
the other meaning, there is the 'Great Tao', that is the source of and guiding principle
behind all the processes of the universe. Beyond being and non-being, prior to space and
time, Tao is the intelligent ordering principle behind the unceasing flow of change in the
natural world. In this sense Tao gains great cosmological and metaphysical significance
comparable to the theistic concept of God; the Greek concept of the logos; or the Dharma
in Indian religions.

The nature and meaning of the Tao received its first full exposition in the Tao Te Ching
of Laozi, a work which along with those of Confucius and Mencius would have a far-
reaching effect on the intellectual, moral and religious life of the Chinese
people. Although a book of practical wisdom in many ways, its profoundly metaphysical
character was unique among the prevailing forms of thought in China at that time. The
religion and philosophy based on the teaching of Laozi and his successor Zhuangzi is
known in English as "Taoism." Even though the Tao is often said to be undefinable and
unexplainable with words (even Chinese ones), the present article focuses on the Tao of
Taoism.

Some characteristics of Tao

There is a flow in the universe, and it is called dao. Dao flows slowly; however, it is
never stagnant, and it is incredibly powerful and keeps things in the universe balanced and
in order. It manifests itself through the change of seasons, the cycle of life, shifts of power,
time, and so forth. Dao has a strong and deep connection with cosmology and the natural
world, as the most well-known Daoist philosophers Laozi and Zhuangzi agreed. Dao is
the law of Nature. When you follow dao, you become one with it. It is best to also
understand chi, because chi and dao go hand in hand. Chi is a Chinese term that is
translated as breath, vapour, or energy. Because chi is the energy that circulates through the
universe, it can be said that dao is ultimately a flow of chi. Being one with dao brings the
best outcomes, because that way things fall into the place they are meant to be.

The concept of Tao is based upon the understanding that the only constant in the universe
is change (cf. the I Ching, the "Book of Changes") and that we must understand and be in
harmony with this change. The change is a constant flow from non-being into being,
potential into actual, yin into yang, female into male. The symbol of the Tao, called the
Taijitu, is the yin and yang confluently flowing into itself in a circle.

The Tao is the main theme discussed in the Tao Te Ching, an ancient Chinese scripture
attributed to Lao Tsu. This book does not specifically define what the Tao is, affirming
in its first sentence that "The Tao that can be told of is not an Unvarying Tao" (tr.
Waley, modified). Instead, it points to some characteristics of what could be understood
as being the Tao. Below are some excerpts from the book.

• Tao as the origin of things: “Tao begets one; One begets two; Two begets three;
Three begets the myriad creatures.” (TTC 42, tr. Lau, modified)
• Tao as an inexhaustible nothingness: “The Way is like an empty vessel / That yet
may be drawn from / Without ever needing to be filled.” (TTC 4, tr. Waley)
• Tao is omnipotent and infallible: “What Tao plants cannot be plucked, what Tao
clasps, cannot slip.” (TTC 54, tr. Waley)

In the Yi Jing, a sentence closely relates Tao to Yin-Yang or Taiji, asserting that "one
(phase of) Yin, one (phase of) Yang, is what is called the Tao". Being thus placed at the
conjunction of the Yin and Yang alternation, Tao can be understood as the continuity
principle that underlies the constant evolution of the world.


Most debates between proponents of one of the Hundred Schools of
Thought could be summarized in the simple question: who is closer to the Tao, or, in
other words, whose "Tao" is the most powerful? As used in modern spoken and written
Chinese, Tao has a wide scope of usage and meaning.

Subspace- Hyperspace (science fiction)

In science fiction, hyperspace is any region of space co-existing with our own universe
(in some cases displaced in an extra spatial dimension) which may be entered using some
sort of energy field or space-altering device. While hyperspace is in some way anchored
to the normal universe, its properties are not the same as normal space, so traveling in
hyperspace is largely inequivalent to traveling in normal space. This makes for a handy
explanation of faster than light (FTL) travel: while the shortest distance between two
points in normal space is a straight line, hyperspace allows those points to be closer
together, or a curved line in normal space to be straight, etc. Hyperspace is the most
common device used for explaining FTL in a science fiction story where FTL is
necessary for interstellar travel or intergalactic travel. Spacecraft able to use hyperspace
for FTL travel are said to have hyperdrive.

Subspace (Star Trek)

Subspace is a term used in many different science fiction media to explain many different
concepts. Most often, subspace is used as a means to justify faster-than-light transit, in
the form of interstellar travel or the transmission of information. Subspace is loosely
associated at times with certain ideas expressed in string theory, which state that the
Universe is not limited to four dimensions; there may be upwards of ten which we do not
readily perceive but affect us summarily. By exploiting these higher dimensions, thus
circumventing the limitations of the four we are most accustomed to, FTL speeds are
imitated (or potentially achieved). Subspace is also comparable to hyperspace (science
fiction); the two ideas are often interchangeable and applied in similar fashion.

In most Star Trek series, subspace communications are a means to (usually) establish
instantaneous contact with people and places that are light-years away. The physics of
Star Trek describe infinite speed (expressed as Warp 10) as an impossibility; as such even
subspace communications which putatively travel at speeds over Warp 9.9 may take
hours or weeks to reach certain destinations. Once the connection is made, however,
communication between the two points often becomes instantaneous.
