Book Review: The Grand Design – by Stephen Hawking and Leonard
Mlodinow (Bantam Books, 2010)
This book was an absolute delight to read and I never thought
I would say that about a physics book. I read Hawking’s A Brief History of Time a few years back and found it rather unsatisfying.
This one, however, was fascinating, fairly easy to follow, well-illustrated,
and was even peppered with good-natured nerdy humor and comics. It also delves
into the philosophical aspects of physics and the nature of reality. The book doesn’t
always seem to make sense but it does offer a good non-technical explanation of
the current state of physics along with the perhaps more important qualitative
aspects of what it all means.
The authors point out that science increasingly informs and
influences philosophy as new discoveries are made. Quantum physics shows that
reality at the subatomic level is fundamentally different from everyday experience and yet quite predictable. Quantum physics and classical physics are based on very different
conceptions of physical reality. Richard Feynman’s idea of multiple histories
is one of a number of quantum physics ideas that seem to defy common sense. The
authors here adopt an approach they call 'model-dependent realism.' It is based on the idea that we humans interpret sensory inputs by
making a model of the world to explain reality. It may be the case that more
than one model could explain reality. The history of science is one in which
better and better theories, or models, have come about to explain reality. The
authors think we now have a model that is a candidate for the “ultimate theory
of everything,” which they call M-theory.
They describe M-theory as a whole family of overlapping theories, much as mapping the entire Earth faithfully requires a collection of overlapping maps rather than a single one.
M-theory predicts that many universes were created out of nothing and in each
there are multiple histories. Our very existence selects only those universe(s)
that are compatible with it.
A short history of science is given from Thales and other
pre-Socratic philosophers through classical Greece. The first quantitative laws and relationships appeared there, such as the string-length harmonics attributed to Pythagoras and Archimedes' principles of buoyancy. Anaximander, circa 610-546
BC, reasoned that humans ‘evolved’ from other animals since the first human
infant would have been helpless and not able to survive. Empedocles discovered properties
of air and fluids. Democritus imagined dividing matter into ever smaller pieces but postulated
a limit to the division. Have we found it? Maybe. Maybe not. His ideas of the fundamental
units, or atoms, crashing into one another are a precursor to the law of
inertia. Aristarchus (circa 310-230 BC) reasoned that the sun was much bigger
than the earth by geometrically measuring the shadow cast by the earth on the
moon during eclipses. To him is also attributed the first heliocentric model
and the suspicion that stars were distant suns. Based on these ideas and
perhaps others he was said to have thought that humans were not so special in
the universe, something modern science often seems to suggest as well. The ideas above were often not even popular in their own time, as there were many rival ideas. Was this idea that we are not special new? Probably. Of course many ancient
explanations about nature were way off the mark and entwined with mythology.
Aristotle was more influenced by direct observation but he still incorporated
much nonsense. The authors go on through Kepler and his scientific (and
religious) explanations of the movements of the heavenly bodies. The Church was
still very powerful in Europe when, in 1277, the pope declared the idea that nature follows laws to be heretical. Mathematics, meanwhile, was being advanced by Indian and Arab scholars.
Kepler and Galileo are credited with the resurgence of the idea of laws of
nature and Descartes believed that all physical phenomena could be explained by
collisions of moving masses. His three laws became the precursor to Newton’s
laws of motion. All of these scientists also sought to reconcile their
observations with God and religious dogma. After all, it was God who made the
laws of nature, right?
“Today most scientists would say that a law of nature is a
rule that is based upon an observed regularity and provides predictions that go
beyond the immediate situations upon which it is based.”
Today, laws of nature are typically phrased in mathematics
and generally hold universally or at least under a stipulated set of
conditions. Newton’s laws hold until the objects in question approach the speed
of light – then they must be modified. An unexplainable exception to a natural
law might be considered a miracle or simply an unknown. Pierre-Simon, marquis
de Laplace (1749-1827) is credited with postulating scientific determinism: “Given
the state of the universe at one time, a complex set of laws fully determines
both the future and the past.” Others believe humans have free will so that determinism
is only a feature of the non-human universe. The authors seem to favor the view
that we are basically biological machines subject wholly to the laws of nature
and that free will is basically an illusion. Even if we are governed by
determinism there is such complexity that outcomes are difficult and often
impossible to predict. Physics therefore relies on 'effective theories': models that successfully predict the outcomes of experiments while describing only a small part of the full system. Since we cannot document and describe all the details
that lead to our behaviors or even predict what those behaviors will be we rely
on the effective theory of free will, say the authors. They also note that most
scientists would say that the laws of nature are a “mathematical reflection of
an external reality that exists independent of the observer that sees it.” That
is, the authors say, if an objective reality even really exists.
Next, reality is explored. Ptolemy’s earth-centered model of
physical reality was the most accepted because it seemed sensible and correct
and could be partially explained scientifically. This model ruled in Western
belief for 1400 years. Copernicus revealed a heliocentric model some 1,800 years
after Aristarchus. The authors here note that in some sense both models could
be considered real although the heliocentric model has much more evidence and
mathematical support. They explore the idea that we are living in some sort of
simulation, like in the Matrix movies. They note that we, like beings in a
simulated world, cannot see our universe from outside of it. Thus, they claim –
all realities are ‘model-dependent.’ “There
is no picture- or theory-independent concept of reality.” Thus, they adopt
the idea of ‘model-dependent realism.’ Realism refers to a situation where all
observers studying a system will measure the same properties of it. Quantum
physics makes any realism a difficult position to defend since on the quantum
level the observer alters the observed. One theory – the holographic principle –
suggests that our 4-dimensional universe is a ‘shadow’ on the boundary of a
larger 5-dimensional space-time. The analogy is that we are like a goldfish in
a curved bowl experiencing a distorted view of the reality outside of it. There
are also ‘anti-realists’ who say that if it can’t be observed it doesn’t exist.
Such was once said of atoms, but the evidence for their existence has since grown so strong that we are compelled to act as if they exist even though we cannot actually see them (an idea from philosopher David Hume). That is the
situation with the subatomic world. In model-dependent realism, arguments over whether a model is 'real' are set aside; all that matters is whether the model agrees with observation. Thus, two or more models could
well be correct in agreeing with observation. All knowledge really has a
subjective component, since we observe with our senses, which are processed by our brains, which in turn build models in the form of mental pictures. For the subatomic world we build models based on mathematics and on what we can observe of larger aggregates of matter. We can't see
individual particles or electrons but we can see the effects they produce. Quarks
are a model to explain the properties of protons and neutrons in the nucleus of
an atom but we will never see one because free quarks theoretically cannot
exist in nature. On this basis, whether quarks actually exist has been debated, since they are in theory forever unobservable. However, the quark
model has led to many correct predictions of experimental results. Thus the
quark model agrees with our observations of how subnuclear particles behave.
There are models that treat the Big Bang as the beginning of the universe and models that extend back to before the Big Bang. The Big Bang model is favored simply because it has observable consequences for the present, while the pre-Big-Bang models do not (as of yet, anyway). The authors note that a
model is a good fit if it:
1. "Is elegant
2. Contains few arbitrary or adjustable elements
3. Agrees with and explains all existing observations
4. Makes detailed predictions about future observations that can disprove or falsify the model if they are not borne out."
The idea of elegance is related to having few adjustable
elements since a theory with many ‘fudge factors’ is not very elegant. According
to many the “standard model” itself is inelegant, having too many adjustable
parameters fixed to match observations, although it still has successfully predicted
results and the existence of new particles. When too many ad hoc factors are needed to rescue a theory, it is deemed inelegant, and sometimes a whole new model is required. An example is the old model of a static universe versus Edwin Hubble's study of the light emitted by galaxies, which suggested that the universe is actually expanding rather than static. Hubble had expected to find roughly as many galaxies moving toward us as away from us, but instead he observed that nearly all galaxies are receding from us, and the farther away a galaxy is, the faster it recedes.
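The relation Hubble observed (recession velocity proportional to distance) is now known as Hubble's law, v ≈ H0 × d. Here is a minimal sketch of what that implies; the value of H0 and the sample distances are illustrative assumptions, not figures from the book:

```python
# Toy illustration of Hubble's law: recession velocity grows in proportion to distance.
# H0 is taken as roughly 70 km/s per megaparsec, a commonly quoted value; the
# sample distances are arbitrary.

H0 = 70.0  # km/s per Mpc (assumed illustrative value)

for distance_mpc in (1, 10, 100, 1000):
    velocity_km_s = H0 * distance_mpc
    print(f"galaxy at {distance_mpc:>4} Mpc recedes at roughly {velocity_km_s:>8.0f} km/s")
```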
It is not just the transformation from Newtonian physics to Einsteinian and quantum physics that represents a paradigm shift; all new theories replace old ones in a similar fashion, with varying degrees of fundamental change. Einstein's explanation of the photoelectric effect showed that light behaves as both a wave and a particle depending on how it is observed.
The authors note that thus far there is no single mathematical
model or theory that can describe every aspect of the universe. The network of
theories known as M-theory can describe phenomena within a certain range. While
this does not satisfy the search for a single unified theory it is acceptable
in terms of model-dependent realism.
Next the “alternative histories” quantum theory is explored
where the idea of quantum superposition suggests that every possible version of
the universe exists simultaneously. Newtonian laws are seen as an approximation
of the way macroscopic objects composed of quantum components behave. The wave/particle
duality of light and the uncertainty principle are two important aspects of
quantum theory. The wave/particle duality was discovered in the famous
double-slit experiments. The uncertainty principle was formulated mathematically by Werner Heisenberg. Basically it says that the more precisely you measure a particle's speed, the less precisely you can know its position, and vice versa. Because the constant that sets the scale of this uncertainty, Planck's constant, is so extremely small, the effect is only noticeable for particles at the subatomic level.
In our big Newtonian world speed and position can be measured just fine with
the uncertainty principle too small for us to notice.
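To get a feel for how small Planck's constant makes the effect, here is a rough back-of-the-envelope sketch. It uses the standard bound Δx·Δp ≥ ħ/2; the particular masses and position uncertainties are illustrative assumptions, not numbers from the book.

```python
# Back-of-the-envelope Heisenberg bound: the minimum velocity uncertainty implied
# by a given position uncertainty, for an electron versus a baseball.

HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def min_velocity_uncertainty(mass_kg, position_uncertainty_m):
    """Smallest possible speed uncertainty from dx * dp >= hbar / 2."""
    dp = HBAR / (2 * position_uncertainty_m)  # momentum uncertainty, kg*m/s
    return dp / mass_kg                       # velocity uncertainty, m/s

# An electron confined to roughly an atom's width (~1e-10 m):
print(min_velocity_uncertainty(9.109e-31, 1e-10))  # ~6e5 m/s -- enormous

# A 0.145 kg baseball whose position is known to within a micrometre:
print(min_velocity_uncertainty(0.145, 1e-6))       # ~4e-28 m/s -- utterly negligible
```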
“Quantum physics might seem to undermine the idea that nature is governed by laws, but that is not the case. Instead it leads us to accept a new form of determinism: Given the state of a system at some time, the laws of nature determine the probabilities of various futures and pasts rather than determining the future and past with certainty.”
Unlike everyday probabilities, quantum probabilities reflect
a basic randomness in nature. Feynman
noted that no one really understands quantum physics. Even so it agrees
with observation and has been well tested. Feynman developed the alternative
histories formulation of quantum physics. His mathematical expression - the
Feynman sum over histories – assumes that a particle traveling from point A to
point B takes all possible paths simultaneously. In classical physics the particle simply follows the shortest path between the two points, a straight line. The sum over histories recovers the same answer: the contributions of paths close to the straight line have nearly the same phase and reinforce one another, while the contributions of widely diverging paths oscillate and cancel out. Thus the reality of classical physics "seems" to be correct, especially for large objects.
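Here is a toy numerical sketch of that cancellation. It sums one complex contribution per path for a family of paths that detour through a raised midpoint; the wavenumber and the cutoff separating "nearly straight" from "strongly bent" paths are arbitrary illustrative choices, and the setup is a cartoon of the idea rather than a real path integral.

```python
# Toy "sum over histories": paths from A=(0,0) to B=(1,0) that detour through a
# midpoint (0.5, h).  Each path contributes exp(i*k*L), where L is its length.
# Near the straight line the lengths (and so the phases) barely change, so the
# contributions reinforce; for large detours the phase spins rapidly and the
# contributions largely cancel.
import numpy as np

k = 200.0                              # assumed wavenumber; sets how fast the phase spins
heights = np.linspace(0.0, 0.5, 5001)  # detour height h of the midpoint, one value per path

lengths = 2.0 * np.sqrt(0.25 + heights**2)  # length of each bent path
amplitudes = np.exp(1j * k * lengths)       # one complex contribution per path

near = heights < 0.05  # "nearly straight" paths
far = ~near            # "strongly bent" paths

print(abs(amplitudes[near].sum()) / near.sum())  # close to 1: contributions add up
print(abs(amplitudes[far].sum()) / far.sum())    # much smaller: contributions cancel
```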
‘Observation alters the observed’ is the gist of the
uncertainty principle. Knowing one thing in the quantum world involves
unknowing others. The concept of the past in quantum theory is that it is
indefinite, based on possibilities, and that it has no single history. Thus,
present observations affect the past. Physicist John Wheeler illustrated this with his "delayed-choice" version of the double-slit experiment.
In a discussion of the laws of nature leading to a theory of
everything the authors go through the development of theories concerning the
four fundamental forces: 1) the gravitational force; 2) the electromagnetic force; 3) the strong nuclear force; and 4) the weak nuclear force. The electromagnetic force gets particular attention, since light itself is part of the electromagnetic spectrum. Einstein's conclusion that the speed of light appears the same to all uniformly moving observers required a new interpretation of space and time: the two are intertwined, time becomes a fourth dimension, and the combined entity 'space-time' is born. That space-time is curved became the basis of Einstein's general theory of relativity, a major upgrade to Newton's theory of gravity. The curvature is non-Euclidean and resembles the way the shortest air route between two points on the Earth follows a 'great circle' rather than a straight line on a flat map. Objects therefore move along 'geodesics' rather than straight lines, because space-time itself is curved.
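As a down-to-earth illustration of the great-circle idea, here is a rough sketch comparing the geodesic (surface) distance between two points on a sphere with the Euclidean straight line through the interior. The Earth radius and city coordinates are assumed round numbers, not data from the book.

```python
# Geodesic (great-circle) distance on a sphere versus the Euclidean straight-line
# chord through the interior, for two illustrative points (roughly London and Auckland).
import math

R = 6371.0  # mean Earth radius in km (assumed round value)

def great_circle_km(lat1, lon1, lat2, lon2):
    """Shortest surface path (geodesic), via the haversine formula."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def chord_km(lat1, lon1, lat2, lon2):
    """Euclidean straight line through the Earth's interior."""
    def xyz(lat, lon):
        la, lo = math.radians(lat), math.radians(lon)
        return (R * math.cos(la) * math.cos(lo),
                R * math.cos(la) * math.sin(lo),
                R * math.sin(la))
    return math.dist(xyz(lat1, lon1), xyz(lat2, lon2))

print(great_circle_km(51.5, -0.1, -36.8, 174.8))  # the route a plane roughly follows
print(chord_km(51.5, -0.1, -36.8, 174.8))         # shorter, but it tunnels through the planet
```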
note that both Maxwell’s theory of electromagnetism and Einstein’s theory of
gravity are still classical theories, or models in which the universe is
assumed to have a single history. These theories work in our everyday world –
Einstein’s general theory of relativity is used to keep GPSs and aircraft on
track to account for space-time based curvature. However, in order to study the
subatomic world we need quantum versions of these theories which are not based
on single histories but alternate ones of varying probability. Feynman and
others developed the first such theory, based on electromagnetism, called quantum
electrodynamics, or QED. This involves interactions of particles called bosons
(particles of light, or photons, are an example of a boson) and fermions (electrons
and quarks are examples of fermions). Feynman developed a graphical way of
representing the sum over histories, now known as Feynman diagrams. However,
there is a problem with this approach as the Feynman diagrams would give an
infinite mass and charge to electrons which is not the case when measured. To
adjust for this there is a process called renormalization where ‘fudge factors’
are used which can be considered mathematically dubious. Renormalization is an
essential ingredient in QED and QED has been successfully used for subatomic prediction.
The authors note that the division into the four fundamental
forces may be artificial and that physicists have long sought to unify the four
classes into a single law, or theory of everything. The electromagnetic and
weak nuclear forces have been unified into the electro-weak force which does
not require renormalization and has successfully predicted particles verified
by CERN. The strong force can be renormalized on its own in a theory called
quantum chromodynamics, or QCD. Here the quarks that make up protons,
neutrons, and other particles have a property called ‘color.’ QCD also makes
use of a property called asymptotic freedom whereby the strong forces between
quarks are small when they are close together and large when they are far apart
– as if they were joined by rubber bands. Grand unified theories (GUTs) were developed to unify the electromagnetic, weak, and strong forces, but they were struck a hard blow in 2009 when experiments indicated that the proton's lifetime must be greater than about 10 to the 34th power years, far longer than the simplest GUTs predict. In the earlier adopted 'standard
model’ the electro-weak forces are unified but the strong force as in QCD
theory is considered separate. The standard model is yet unsatisfactory because
in addition to not unifying the strong and electro-weak forces it also fails to
unify the gravitational force. Integrating the gravitational force is more
difficult due to the uncertainty principle, which implies that space can never be completely empty; at best it is in a state of minimum energy, the vacuum, which is subject to what are called vacuum fluctuations, or 'quantum jitters.' Now I am confused! While renormalization can remove the infinities that arise in the other quantum theories, it cannot remove the infinities that arise when gravity is quantized. The theory of supergravity, based on
supersymmetry could be an alternative explanation that relies on force and
matter being two facets of the same thing. That means each matter particle, such as a quark, should have a corresponding partner that is a force particle, and each force particle, such as a photon, should have a partner that is a matter particle. Although many physicists think that supergravity can unify
gravity with the other forces it is near impossible to verify. The partner
particles have not even been observed.
The idea of supersymmetry developed from early formulations
of string theory. The extra dimensions predicted by string theory are
considered to be so ‘small’ as to be undetectable, analogous to a tiny tiny
straw that is essentially a straight line to observation as the curved surface
is too small to have visual significance. Different string theories which
showed different ways of curling up the extra dimensions as well as
supergravity are suspected to be different approximations of a more fundamental
theory – called M-theory. It is unknown whether M-theory exists as a single fundamental formulation or only as a network of overlapping theories that all agree with observation. M-theory predicts 11
dimensions (10 of space and one of time) and allows for different universes
with different apparent laws (whatever that really means!). The 4 dimensions
are the ones that are most applicable to us with the other 7 curled up so much
as to be rendered insignificant.
Hubble’s evidence for an expanding universe of course
strongly implied that the (current) universe had a beginning, known as the big
bang. The famous analogy is that of the expanding surface of an inflating balloon. Objects bound together by forces such as gravity do not themselves expand but keep their size. Evidence for the big bang includes the cosmic microwave background radiation (CMBR), which points to a hot early universe, and the observed abundance of helium. At the beginning there was a singularity, where the
temperature, density, and curvature of the universe were all infinite. General
relativity theory does not work for the very early period of the universe’s existence.
The early expansion of the universe, called inflation, involved space itself expanding far faster than the speed of light. The idea is that if you go back far enough in time the
universe was small – atomic scale small – so that the relativity theory that
works in the macroscopic world breaks down and the microscopic world is
governed by quantum theory. Thus “creation” or the big bang had to be a quantum
event. In that early quantum-sized universe there were 4 dimensions of space
and none (yet?!) of time. Thus, time behaved as another dimension of space due
to warpage. This somewhat resolves the problem of whether time had a beginning, much as a round earth resolves the question of where the edges of a seemingly flat earth would be. This "no boundary condition" of space-time at the beginning of the
universe sort of resolves the idea of whether the universe had a beginning or
not, they say – but I don’t fully understand it. It also removes the idea that
the universe had to be set in motion by an entity or force, say God. A uniformly expanding universe has the greatest probability in the sum over histories, but other universes in the multiverse would be more non-uniform, irregular, and asymmetrical. Our slightly non-uniform universe (as inferred from slight temperature variations in the CMBR) is lucky for us, they say, since it allows for heterogeneity of matter.
“We are the product of quantum fluctuations in the very early
universe.”
The sum of the alternative histories of the universe (which are themselves governed by probabilities) gives the appearance of a single dominant history, which we tend to take as "the" history.
“We seem to be at a critical point in the history of
science, in which we must alter our conception of goals and of what makes a
physical theory acceptable. It appears that the fundamental numbers, and even
the form, of the apparent laws of nature are not demanded by logic or physical
principle. The parameters are free to take on many values and the laws to take
on any form that leads to a self-consistent mathematical theory, and they do
take on different values and different forms in different universes. That may
not satisfy our human desire to be special or to discover a neat package to
contain all the laws of physics, but it does seem to be the way of nature.”
A universe, among the many universes, in which beings like us exist appears to be extremely unlikely, and this rarity and special fine-tuning suggest that we could be a miracle, or more likely an apparent miracle.
Next the authors examine the so-called Goldilocks Principle –
the notion that our universe is oddly “just right” for life in several ways:
orbital eccentricity, axial tilt, the sun’s mass and our distance from it, our
single sun, etc. These coincidences seem suspicious but we now have discovered
that there are many other similarly favorable locations in the universe where
life could theoretically develop. We are bound to find ourselves in an environment capable of supporting life. That itself leads to a principle:
“Our very existence imposes rules determining from where and
at what time it is possible for us to observe the universe.”
That is known as the ‘weak anthropic principle.’ It gives
ranges where life is likely. It is acceptable to most scientists.
The more controversial ‘strong anthropic principle’ “suggests
that the fact that we exist imposes constraints not just on our environment but also on the possible form and content of the laws of nature
themselves. The idea arose because it is not only the peculiar characteristics
of our solar system that seem oddly conducive to the development of human life
but also the characteristics of our entire universe, and that is much more
difficult to explain.”
In order to make us possible, heavier elements like carbon had to form inside the heat of stars, and some of those stars then had to explode as supernovas, dispersing the heavier elements throughout space. The early universe (the first
200 seconds) was mostly hydrogen with some helium and a smaller amount of
lithium. Heavy elements came much later. Carbon is created by the 'triple alpha process,' the fusion of three helium nuclei, or 'alpha particles.' A special excited quantum state of the carbon nucleus, a resonance, vastly increases the rate of this nuclear reaction. This process requires
that the fundamental forces of nature (the so-called four forces) be nearly
exactly what they are – a case of serendipity? Fundamental constants and masses
of particles also have to be in an extremely narrow range to develop life so
these are further supports to the strong anthropic principle.
“The laws of nature form a system that is extremely
fine-tuned, and very little in physical law can be altered without destroying
the possibility of the development of life as we know it. Were it not for a
series of startling coincidences in the precise details of physical law, it
seems, humans and similar life-forms would never have come into being.”
So it seems perhaps that the universe (or at least our
universe) is fine-tuned so that an observer would eventually discover its laws.
The most convincing evidence for the strong anthropic principle, the authors
note, is Einstein’s cosmological constant. Einstein called including it in his
theory the greatest blunder of his life. However, in 1998 it was resurrected by studies of distant supernovas. Its value appears uncannily fine-tuned: it corresponds to a repulsive effect, and if it were even slightly different the universe would have blown apart before galaxies could form. This strong anthropic
principle has led some back to ideas of God and intelligent design. However,
modern physics renders the remarkable unremarkable through the idea of multiple
universes – the multiverse concept, of different universes with different
physical laws. I think the idea is that we are so bound up with the laws of our
own universe that it gives the mere appearance that there was an intelligent
creator but we might ask “who made God?”
The idea of model-dependent realism suggests that the only reality we can know is made of mental concepts; thus all tests of reality are model-dependent. In 1970 John Conway invented the Game of Life, in which a few simple rules generate complexity, and that complexity in turn exhibits different effective laws at different scales, not unlike the different macroscopic and microscopic (quantum) laws we recognize in physics. The takeaway is that 'reality is dependent on the model employed.'
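Conway's game is simple enough to sketch in a few lines. The rules below are the standard ones (a live cell survives with two or three live neighbours; a dead cell becomes live with exactly three); the glider seed and grid size are just illustrative choices.

```python
# Minimal Conway's Game of Life on a small wrapping grid, seeded with a glider.
# A handful of local rules produce persistent, larger-scale "effective laws."
import itertools

WIDTH, HEIGHT = 10, 10

def step(live):
    """Apply one generation of Conway's rules to the set of live cells."""
    counts = {}
    for (x, y) in live:
        for dx, dy in itertools.product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = ((x + dx) % WIDTH, (y + dy) % HEIGHT)
                counts[cell] = counts.get(cell, 0) + 1
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

def show(live):
    for y in range(HEIGHT):
        print("".join("#" if (x, y) in live else "." for x in range(WIDTH)))
    print()

# A glider: a five-cell pattern that steadily crawls across the grid.
live = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    show(live)
    live = step(live)
```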
The authors note that we say that any complex being has free will "not as a fundamental feature, but as an effective theory, an admission of our inability to do the calculations that would enable us to predict its actions." Classical and quantum physics, and other sciences as well, have indicated that our universe has both fundamental laws and apparent laws. Conservation of energy, the requirement that the total energy of the universe remain constant (and zero, if the universe appeared from nothing), implies that an isolated body of matter, which carries positive energy, cannot simply appear out of nothing.
“Because gravity shapes space and time, it allows space-time
to be locally stable but globally unstable. On the scale of the entire
universe, the positive energy of the matter can
be balanced by the negative gravitational energy, and so there is no
restriction on the creation of whole universes. Because there is a law like
gravity, the universe can and will create itself from nothing…. Spontaneous
creation is the reason there is something rather than nothing, why the universe
exists, why we exist. It is not necessary to invoke God to light the blue touch
paper and set the universe going.”
“ … M-theory is the only
candidate for a complete theory of the universe. If it is finite – and this has
yet to be proved – it will be a model of a universe that creates itself.”
Wow, my brain hurts, but in a good way. Mind blown but yet
still holding together. Because gravity, blah, blah, blah. Great book!