The Grand Design by Stephen Hawking & L. Mlodinow: a critical review
Einstein’s elusive dream, the unified field theory, has become the grail of physics. In the conclusion to “A Brief History of Time”, Stephen Hawking calls the theory to-be-discovered the “ultimate triumph of human reason”, adding a metaphor that has been misconstrued as a statement of belief – “for then we would know the mind of God”. More than two decades later, the long-awaited Grand Design begins with the assertion that “philosophy is dead”, because it has not kept up with the advances of science, whose goals are defined as the how and the why of the universe.
Having introduced the subject matter of the book – the laws of nature, Feynman’s ideas, M-theory and the anthropic principle, which they restate as “we are the lords of creation” – the authors begin this ambitious task with a guided tour of ancient philosophy, seeking the origins of modern science. Thales, Pythagoras, Archimedes, Euclid, Democritus, Anaximander, Aristarchus and others are examined in turn for early insights into the atomic theory, the heliocentric model, evolution and mechanics, and Aristotle is censured for rejecting observation in favour of a priori “laws of nature” – formulated as axioms, chosen for their intellectual appeal rather than for agreement with fact, and often at odds with experiment. Oddly enough, Plato is not mentioned in the introduction – a matter worth bearing in mind when we come to the book’s conclusion. There follows a discussion of the laws of nature as rejected by early Christianity, with reference to Kepler, Descartes and Newton, and of the attempt to reconcile these laws with the concept of God. Laplace is credited with the first statement of scientific determinism. The authors ask where these laws originate, whether there can be exceptions (miracles) and whether other sets of laws are possible. As to free will, they circumvent this contradiction of determinism by stating that the outcome of the laws is determined in so complicated a way that predicting human behaviour is impossible, and they call free will an “effective theory”. Miracles are dismissed as impossible, and the issue of objective reality is raised.
The third chapter of the book bears the title “What is Reality?” – perhaps an allusion to “The Road to Reality”, Roger Penrose’s monumental tour de force of the mathematics used in modern physics, published in 2004. It would appear that philosophy is alive and well, despite earlier pronouncements, as the authors make their basic assertion that “there is no theory-independent concept of reality” and introduce “model-dependent realism”. The concept is not new – compare the equivalence of descriptive systems put forward in the eighties (and, much earlier, in relativity theory) – yet it is claimed that model-dependent realism short-circuits the argument between realism and anti-realism. With reference to Ptolemy, Copernicus, Plato (!), Berkeley, Dr. Johnson and Hume, the authors advance the positivist view that it is pointless to ask whether a model is real, only whether it agrees with observation, and they claim that the issue of existence is solved, or at least avoided, by this approach.
It is worth noting at this stage that the authors have already asserted the impossibility of miracles as exceptions to the laws of nature – laws they must therefore believe to exist – and have also taken issue with the Roman Catholic Church, as in the history of Aristotelian dogma and the imprisonment of Galileo. Insofar as pope-bashing may be à la mode among people like Mr. Fry, a writer of Stephen Hawking’s standing could well be expected to take a more universal approach. There are no references to other world religions, even though Penrose’s recent book, Cycles of Time, can be said to reflect the ideas of Vedic Hinduism. Buddhist philosophy lends itself easily to the understanding and description of quantum phenomena, the Void being described by Buddhists as “full, shining, radiant” – a parallel of the modern concept of the quantum vacuum from which, by way of fluctuations, the big bang is said to arise.
Comparing Genesis to the standard model, the authors admit that neither model is more real than the other, and quote the age of the universe as 13.7 billion years according to the latter. Importantly, they define a good model by the criteria that it be elegant, contain few arbitrary elements, agree with observations and make testable predictions. The constraint of “elegance” is crucial here, as science and mathematics, at their best, are creative activities: thus Wiles’s 200-page proof of Fermat’s Last Theorem (outstanding for three centuries) is correct but not elegant as compared to, say, E=mc^2 or the equations of Dirac. This constraint must be made to apply to the theories presented subsequently by the authors – Feynman’s methods and M-theory.
Stephen Hawking has a talent for presenting difficult scientific ideas to the general reader with clarity, humour and lavish illustrations. Following the philosophical introduction, which spans almost three chapters, we find him at his best, taking the reader on a journey of exploration through quantum theory, wave/particle duality, the uncertainty principle and the Feynman sum-over-histories method, followed by a brilliant exposition of Maxwell’s unification, Einstein’s relativity and the currently established models of quantum electrodynamics (QED) and quantum chromodynamics (QCD). The unification grail of physics is well presented in terms of the attempts made so far to unify the four forces – gravity, electromagnetism, the weak force (transmutation of quarks, radiation, and the creation of hydrogen in the early universe) and the strong force (the nuclear binding of protons, neutrons and their quarks). At the point of creation the four forces are unified; after the big bang they separate through symmetry-breaking. The authors trace the progress of quantum versions of these elementary forces, starting with Feynman’s QED; the strong force gives rise to QCD. They refer to attempts to unify these two, but the results are at variance with experiment; both, however, form part, separately, of the standard model, which is well supported by evidence and widely used in technology. Bringing gravity into the fold is almost impossible, and even a stand-alone quantum gravity is highly problematic.
This problem is similar to the one that led Planck to formulate quantum theory, the so-called “ultraviolet catastrophe”. Classical theory held that the power output of a radiating black body was proportional to the square of the frequency, so that, with frequencies forming a continuum, the total output at ever higher frequencies would tend to infinity. This is in breach of what schoolboys used to call the 11th Commandment, “Thou shalt not divide by zero”: infinities are to be avoided in a statement of physical law. Planck’s quanta were discrete packets of energy at separate frequencies, and produced sensible results.
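The catastrophe, and Planck’s cure, can be seen in a few lines of arithmetic. A minimal sketch (not from the book; the temperature and frequencies are chosen only for illustration) comparing the classical Rayleigh-Jeans radiance, which grows without bound, with Planck’s formula, which falls away at high frequency:

```python
import math

h = 6.626e-34   # Planck's constant (J*s)
k = 1.381e-23   # Boltzmann constant (J/K)
c = 3.0e8       # speed of light (m/s)
T = 5000.0      # black-body temperature (K), chosen for illustration

def rayleigh_jeans(nu):
    # classical spectral radiance: grows without bound as nu**2
    return 2 * nu**2 * k * T / c**2

def planck(nu):
    # Planck's quantised radiance: the exponential tames high frequencies
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

for nu in (1e13, 1e14, 1e15):
    print(f"nu={nu:.0e}  classical={rayleigh_jeans(nu):.3e}  Planck={planck(nu):.3e}")
```

At the lowest frequency the two formulas nearly agree; at the highest, the classical value already exceeds Planck’s by some three orders of magnitude and keeps growing as the square of the frequency – precisely the divergence the quanta remove.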
In what follows, the authors admit that all modern theories are plagued by infinities, even the Feynman sum-over-histories (where they are called “ultraviolet divergences”), and that the method of removing them, called renormalisation (the cancelling out of infinities), is “mathematically dubious”. Although fuzzy, renormalisation works in the standard model but fails in the theory of gravity, because of Heisenberg’s Uncertainty Principle. This principle tells us that there are limits to our ability to simultaneously measure certain data, such as the position and velocity of a particle, or the value of a field and its rate of change, governed by a simple equation which contains Planck’s constant. This incidentally forbids the concept of empty space, because then both the field and its rate of change would be exactly zero (and thus simultaneously known); thus space is never empty, having a state of minimum energy which we call a vacuum, where quantum fluctuations occur. The authors refrain from drawing philosophical conclusions from the Uncertainty Principle, even though it signals a final limit to knowledge, and gloss over the supergravity theory (named after supersymmetry, which demands that mass particles have partners which are force particles, the two being facets of the same thing). This theory has the potential to be renormalised, but it is said that the required calculations were so long and difficult that no one was prepared to undertake them. Both supergravity and string theories lead to infinities, which represents failure under the criterion of elegance, as demanded at the outset.
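The “simple equation” in question is Heisenberg’s inequality Δx·Δp ≥ ħ/2. As a rough illustration (my own numbers, not the authors’): merely confining an electron to roughly atomic size already forces a sizeable spread in its velocity:

```python
hbar = 1.055e-34   # reduced Planck constant (J*s)
m_e = 9.109e-31    # electron mass (kg)

dx = 1e-10                 # confine the electron to about one angstrom
dp_min = hbar / (2 * dx)   # minimum momentum spread from dx * dp >= hbar / 2
dv_min = dp_min / m_e      # corresponding minimum velocity spread

print(f"dp >= {dp_min:.2e} kg*m/s, dv >= {dv_min:.2e} m/s")
```

The resulting velocity spread is over half a million metres per second – a vivid reminder of why “exactly zero field and exactly zero rate of change” is not an option, and why the vacuum must fluctuate.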
Yet there is hope for science, not only via exploration with the Large Hadron Collider but also through the breakthroughs of the nineties, when dualities were discovered between the competing string theories and supergravity. The five different string theories, all in ten dimensions, together with supergravity, are said to be six different ways of describing the same phenomena in our four dimensions, and are considered approximations to a more fundamental theory, called M-theory. Hawking says that it might not be possible to decipher the nature of M-theory, stating that the expectation of a single theory of nature could be untenable. M-theory is said to have eleven dimensions (ten of space and one of time, with the extra dimensions “curled up” into internal spaces so small that we do not see them). The theory contains particles, strings, two-dimensional membranes and other objects more difficult to picture, called p-branes, with p from zero to nine. M-theory is said by the authors to allow different universes with different apparent laws, depending on how the internal space is curled – about 10^500 of them, a lot of universes. The authors then ask where that leaves us, and how we ended up in this universe.
Stephen Hawking is the originator of the no-boundary proposal. His discussion of the Feynman theory starts with the many-worlds interpretation at creation, when the scale is extremely small – all quantum theories apply at the level of the very small. Insofar as the many-worlds hypothesis is another method for calculating quantum events, as distinct from the older Copenhagen interpretation, the question should be asked whether it is legitimate to use Feynman’s approach at the macro level and arrive at the concept of the multiverse. In any case, a multiverse model is clearly not testable, as we cannot observe or detect any universe apart from our own.
Before we approach the conclusions of this remarkable work, it is worth noting that the physics of the new millennium is hard to discuss without reference to Lisa Randall (Warped Passages, 2005), the American physicist who had an opera based on her work and whom the authors do not mention, and without noting the lack of a solid mathematical base for M-theory. In the words of its originator, Edward Witten, “we don’t know what the equations are” (2004), although there are plenty of published equations showing the dualities between the various string theories. Hence, perhaps, the lack of an appendix to this book, where one would wish to find the equations of M-theory.
The established theory of the big bang is reviewed by reference to Hubble’s discovery, in 1929, of the expansion of the universe, and to the work of Einstein and Friedmann, who predicted such expansion from zero size, as well as Lemaître’s contribution. The discovery, in 1965, of a faint background of microwave radiation at about 3 kelvin throughout the cosmos, together with the abundances of hydrogen and helium, established the big bang model, supplemented in the eighties by the early-inflation scenario. The faint cosmic radiation is now seen as the after-image of the big bang. For Stephen Hawking, the need for a very special initial state at the big bang is problematic, not only because of its strict requirements but also because, at the origin, there is as yet no time dimension. The authors circumvent this with the no-boundary condition – the requirement that histories be closed surfaces without boundaries – and point out that the slight variations in the background radiation can account for the inflationary scenario. They state that Feynman’s methods can be applied “to the universe as a whole”, leading to the concept of the universe appearing spontaneously, starting off in every possible way, most of which correspond to other universes, some similar to ours. This leads to the proposed top-down approach: rather than describe the evolution of the universe from its starting point, “one should trace the histories backwards from the present time”. They conclude that the described theory is testable, without stating how.
The conclusion of The Grand Design is controversial. The authors revisit the anthropic principle in its strong form, which states that our existence imposes constraints on the universe we inhabit, as to both its environment and the form and content of its laws of nature. They quote Fred Hoyle’s contribution on the stellar production of carbon, a basic requirement for life: the triple-alpha process, which involves a resonance requiring extremely fine tuning of constants – tuning that Hoyle took as evidence of intelligent design. They then give further instances of fine-tuning, such as the three dimensions of space (elliptical orbits being unstable in other dimensionalities), and the story of Einstein’s cosmological constant. This is the argument from design, sometimes known as the Goldilocks Principle – that things have to be just right. Yet the anthropic principle demands that the origin be not only fine-tuned but also unique. The authors quote a Catholic cardinal as recently asserting that “the immanent design in nature is real” and reply that “this is not the answer of modern science”. Their answer begins with the statement that the fine-tunings in the laws of nature can be explained by the existence of multiple universes (one of which would be just right), without the need for a benevolent creator, and they point to M-theory as the future of physics and cosmology.
In the final pages of The Grand Design, the authors illustrate spontaneous evolution with the Game of Life, a cellular automaton devised by John Conway, based on three simple rules and a set of initial conditions. The evolution of this program exhibits different forms and behaviours on different scales, which the authors compare to the complex features of intelligent life. Yet complex features of an evolving system do not guarantee our universe, with sentient beings, as an end result.
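The rules are easily stated: a dead cell with exactly three live neighbours comes alive, a live cell with two or three live neighbours survives, and every other cell dies. A minimal sketch (my own illustration, not code from the book) showing a “glider”, one of the simplest self-propagating patterns:

```python
from collections import Counter

def step(live):
    """Advance one generation of Conway's Game of Life.
    `live` is a set of (x, y) coordinates of live cells."""
    # count the live neighbours of every cell adjacent to a live cell
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # birth on exactly three neighbours; survival on two or three
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# a "glider": five cells that travel across the grid under the rules
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
gen = glider
for _ in range(4):
    gen = step(gen)
print(sorted(gen))   # the same shape, shifted one cell diagonally
```

After four generations the five cells reappear shifted one step diagonally – order and “motion” emerging from rules that mention neither, which is exactly the point the authors wish to make.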
Arguing from conservation of energy in our universe, the authors require the overall energy to be zero, to ensure local stability for the universe. As to the question posed at the beginning of the book – how can a whole universe be created from nothing? – the authors answer that there must be a law like gravity. Since gravitational energy is negative (work must be done to separate a gravitationally bound system), this negative energy balances the positive energy required to create matter. They state: “Because there is a law like gravity, the universe can and will create itself from nothing”. In the closing paragraph we read that M-theory is the only candidate for a complete theory of the universe – if it can be proved to be finite – and that it is the unified theory that Einstein hoped to find; if it is experimentally confirmed, “we will have found the grand design”.
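The balance the authors invoke can be checked on the back of an envelope. A rough Newtonian estimate (my own round figures for the mass and radius of the observable universe, assumed only for illustration) shows that the two energies are at least of comparable magnitude:

```python
G = 6.674e-11   # gravitational constant (m^3 kg^-1 s^-2)
c = 3.0e8       # speed of light (m/s)
M = 1e53        # rough mass of the observable universe (kg) -- assumed figure
R = 4.4e26      # rough radius of the observable universe (m) -- assumed figure

rest_energy = M * c**2                     # positive energy of the matter
binding_energy = -3 * G * M**2 / (5 * R)   # Newtonian self-gravity of a uniform sphere

print(f"rest energy: +{rest_energy:.1e} J, gravitational energy: {binding_energy:.1e} J")
```

With these round numbers the negative gravitational term comes within an order of magnitude of the positive rest energy – nothing like a proof, but it shows that the cancellation the authors rely on is not numerically absurd.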
Despite the brilliance of many expositions in this book, it is felt that the concluding part needs revision. Plato has been marginalised in the philosophical introduction, despite the fact that the forms and laws that Hawking and Mlodinow seek are platonic forms. The positivist approach demands testability; although M-theory is described as testable, there is no indication of how this could be done, and with the concept of the multiverse the authors could be accused of proposing metaphysics (as dismissed by A. J. Ayer). The multiverse is clearly not testable, as we can only observe our own universe, and only in ways limited by physical constraints such as the speed of light. Nor is the application of the top-down approach – the use of the Feynman method for macro events such as our own universe – actually demonstrated. The description of M-theory itself is sparse; it deserves at least a whole chapter in order to be taken seriously. The authors do not state, for example, that the new version of the big bang is “when branes collide”, or that the origins of string theory are related to music. There is a serious flaw in the failure to apply their own criteria to the theories put forward: infinities in equations are certainly not elegant, and model-dependent realism dictates that God and the law of gravity are synonyms for the origin of the universe, under the equivalence of descriptive systems. What is even more problematic is that their concluding speculations allow for approximately 10^500 varieties of universes with different laws of nature. This leads to a contradiction: the axiomatic laws of nature referred to in the opening chapters would be seen to vary or to be missing altogether, and without a law like gravity a universe could not arise.
Insofar as infinities are the bane of physicists, there is one instance where they could be useful. There is a calculation by Roger Penrose (The Emperor’s New Mind, 1989) – illustrated by a cartoon of the Creator with a pin, aiming for the right point in phase space – in which Penrose attempts to calculate the probability of the universe arising spontaneously. This side-steps the design argument completely and is in effect a reductio ad absurdum. Penrose invokes only two constraints: that the size of the universe be like ours (taking Eddington’s baryon number of 10^80, which is probably too small), and that the second law of thermodynamics should hold. The second constraint is crucial, as all laws of physics are time-reversible (symmetric) except the second law, which states that entropy (a measure of disorder) increases with time, and which is often referred to as the arrow of time. Under this law, the big bang must have low entropy. Using the Bekenstein-Hawking formula for the entropy of a black hole, and tensor considerations, Penrose arrives at a probability of 1 in 10^10^123. He describes the size of this last number by saying that were we to write a zero on each proton, neutron and every other particle in the entire universe, we would still fall far short, and leaves the matter there.
This last number has been dubbed Big P, and is many orders of magnitude larger than the number of possible universes cited by Hawking and Mlodinow (10^500). Insofar as the calculation shows impossible (but not zero) odds for spontaneous creation, it can be pursued further, for we can say that it shows the odds for some universe, but not necessarily ours. In asking that the chosen universe be exactly ours, we would have to continue the calculation, ending perhaps with Big P^Big P on the macro level. Yet two events can appear identical on the macro level while differing on the quantum scale, so this continuation could lead to an infinity, in which case the probability of spontaneous creation would be exactly zero, the 11th Commandment notwithstanding.
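The disparity between the two numbers is easy to understate. Working with the exponents (base-10 logarithms) – a comparison of my own, not one made in either book – even granting one attempt at creation per M-theory vacuum barely dents Penrose’s odds:

```python
# exponents (base-10 logarithms) of the two numbers in question
log_landscape = 500       # the 10**500 M-theory vacua
log_big_p = 10 ** 123     # Penrose's odds: 1 in 10**(10**123)

# odds that *some* one of the 10**500 vacua arises spontaneously:
# 10**500 / 10**(10**123) = 10**(500 - 10**123)
log_combined = log_landscape - log_big_p

# the exponent is still a 123-digit number; the landscape trims only its tail
print(len(str(-log_combined)))    # 123 digits, effectively still 10**123
print(str(-log_combined)[-4:])    # only the last few digits change
```

Dividing Big P by the entire landscape shortens its 124-digit exponent to a 123-digit one of almost the same size: the 10^500 universes are, against such odds, a rounding error.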
Stephen Hawking and Leonard Mlodinow are brave souls, venturing into uncharted territory. Let them beware that siren call from across the water, and relentlessly seek the new geometry.