The multiverse concept is often derided as “unscientific” and an example of physicists indulging in metaphysical speculation of the sort they would usually deplore. For example, commenters here at Scientia Salon have said that the multiverse is “by definition not verifiable and thus outside the bounds of empirical science,” and that “advocates of multiverses seem to be in need of serious philosophical help” [1].
Critics thus claim that the multiverse amounts to a leap of faith akin to a religious belief. Indeed, the religious often accuse atheistic scientists of inventing the multiverse purely to rebut the “fine-tuning” argument that they say points to a creator god. But the fine-tuning argument is readily refuted in several other ways [2], and anyhow physicists really don’t care enough about theology these days to let that worry them; further, the concepts leading to a multiverse were developed well before theologians started taking note of the issue.
The purpose of this article is to argue that the multiverse is an entirely scientific hypothesis, arrived at for good scientific reasons and arising out of testable and tested cosmological models. To be clear, I am not asserting that the multiverse has been proven true, even on the balance of probability, but I am asserting that it is a serious scientific concept that will eventually be accepted or rejected on scientific grounds.
Several different concepts could be labelled a “multiverse,” but I am advocating one particular multiverse concept: that arising from what cosmologists call the “eternal inflation” version of Big Bang cosmology [3]. This model was developed to explain observations of our universe, and its predictions have since been verified, putting it on a sound footing.
I am arguing that if a scientific theory predicts consequences A, B, C and D, and if we then verify that A, B and C are indeed the case, thus giving us confidence in the theory, then we have sound reasons for accepting D even if D cannot be directly verified. Indeed, we would be obliged to accept D unless we can construct another equally good explanation of A, B and C.
The Big Bang model
As long ago as the time of Isaac Newton, people realized that a static universe doesn’t actually work: stars and galaxies would fall together under gravity, and thus they can no more be static than a brick can float in mid-air. But scientists ignored this — even Einstein rigged his equations to get a static universe — until Edwin Hubble produced empirical evidence that the galaxies are moving away from us. Observations tell us that our universe is expanding, and expanding in a uniform fashion, as though space itself is expanding and carrying the galaxies along with it.
If one runs a uniform expansion backwards in time — say repeatedly halving all scale lengths and separations — eventually one arrives at a state with everything in the same place. However, we expect our understanding of physics to have broken down before any such “singularity,” and in particular we expect “quantum gravity” effects to dominate at a scale length of 10⁻³⁵ meters, called the “Planck length.”
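To get a feel for the numbers, a minimal Python sketch (assuming round order-of-magnitude figures of 10²⁶ meters for today’s observable horizon and 10⁻³⁵ meters for the Planck length) counts how many of those successive halvings it takes to run the expansion back to the Planck scale:

```python
import math

# Rough order-of-magnitude scales (meters); both are approximations,
# not precise measured values.
horizon_now = 1e26       # characteristic scale of the observable universe today
planck_length = 1e-35    # scale at which quantum-gravity effects dominate

# Number of successive halvings of all length scales needed to shrink
# the present horizon down to the Planck length.
halvings = math.log2(horizon_now / planck_length)
print(f"~{halvings:.0f} halvings")  # roughly 200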
We do not have a working theory of quantum gravity, but we know enough about quantum mechanics to suppose that our universe originated with a scale-length of 10⁻³⁵ meters as a quantum fluctuation in a pre-existing “state.” Understanding this process, and knowing about the pre-existing state in which the quantum fluctuation might have occurred, are not needed for the multiverse model that I expound below. However, where one quantum fluctuation can occur so can another, and thus it is natural to suppose that there might be many other such universes. In particular, we have no strong reason to suppose that the quantum fluctuation that originated our universe was the origin of all things or of time itself — though equally we lack arguments against those possibilities, and if you want to argue for them then go ahead [4].
The characteristic length of 10⁻³⁵ meters leads to a characteristic time: the time taken to cross that length at a speed limited by the speed of light, c, the fastest speed at which information can be transferred from one region to another. The value of c, about 3×10⁸ meters/second, then gives a “Planck time” of order 10⁻⁴³ seconds. This in turn leads to the concept of an “observable horizon,” being the furthest distance (= ct) from which light/information can have traveled to us given an age, t, of the universe. Thus we conceive of our universe, having originated as a quantum fluctuation, obeying its natural tendency for length scales to expand (that being the solution of Einstein’s equations of General Relativity), and with the observable horizon also expanding as the universe ages.
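The Planck length itself comes from combining the fundamental constants ħ, G, and c. A short Python sketch reproduces both numbers (the constant values are the standard CODATA figures; the formulas ℓ_P = √(ħG/c³) and t_P = ℓ_P/c are the usual definitions):

```python
import math

# Fundamental constants (SI units, CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J·s
G = 6.67430e-11         # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck length: the scale at which quantum-gravity effects dominate
planck_length = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m

# Planck time: the time for light to cross one Planck length
planck_time = planck_length / c             # ~5.4e-44 s

print(f"Planck length ~ {planck_length:.2e} m")
print(f"Planck time   ~ {planck_time:.2e} s")
```

Note that the precise value, ~5×10⁻⁴⁴ seconds, rounds to the order-of-magnitude 10⁻⁴³ figure quoted above.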
Conventional cosmological models presume this universe to be spatially infinite in all directions (or, at least, so much larger than the observable horizon that we don’t have to worry about the observational consequences of any “edge”). Is there an alternative? One could argue that our original quantum fluctuation would have had some finite extent, and thus that our current universe would now have a much larger but still finite extent, and that somewhere beyond the observable horizon is a boundary to another “universe” originating in a separate quantum fluctuation, or to the “pre-universe” out of which our quantum fluctuation arose. Alternatively, one could have a “wrapped-around universe,” where space has a finite extent but one can travel forever in any direction, like ants walking on the surface of a sphere.
Observationally, the “curvature” of space in our universe is known to be very small (our universe is “flat” to 1% accuracy), and thus the scale of the wrap-around would have to be much bigger than our observable horizon. Still, if you wanted to argue that the original quantum-gravity fluctuation was self-contained, rather than being in a pre-existing state, then one could argue for a finite, wrapped-around universe along these lines [5].
The multiverse model is often criticized as “unscientific” for invoking universes that can never be seen and thus making claims that can never be verified. But this applies just as much to all cosmological models, which are usually presumed to extend to infinity. All of them are thus postulating that the universe stretches well beyond the observable horizon, from where (owing to the finite speed of light) we can never obtain information to verify any hypotheses. This feature does not make the model unscientific. If we are simply using principles of parsimony to postulate more-of-the-same, beyond where we humans can personally see it, then we’re being entirely scientific.
Is the above account supported by evidence? Yes, very much so. The expansion of space is observed in the red-shift of light from distant galaxies (caused by the expansion of photons’ wavelengths as they travel through expanding space). In addition, the Big Bang model predicts exactly what primordial elements were created in the expanding fireball of the first 1000 seconds. The abundances of Hydrogen and Helium-4, and of traces of Deuterium, Helium-3, and Lithium-7 are predicted to high accuracy — and when observers measured these abundances to test the Big Bang predictions they found an excellent match.
Further, the Big Bang model predicts the existence of “cosmic microwave background” (CMB) radiation, left over from when the ionized universe cooled enough to allow neutral atoms, 380,000 years after the Big Bang. After this prediction had been made, the radiation was found, exactly as predicted, stretched out into the microwave band by the subsequent expansion of space, leading to the first of three Nobel Prizes so far awarded for work on Big Bang cosmology [6].
The study of the CMB, a snapshot of how the universe was soon after the Big Bang, is now a major part of cosmology. Indeed, cosmology is on a sound empirical footing precisely because we can directly see what the early universe was like; since that time the universe has been transparent and thus we can observe photons that last interacted with matter in the early universe. Modelling the tiny temperature fluctuations in the nearly-smooth CMB produces strong constraints on cosmological models. As measurements have improved, from the COBE satellite to NASA’s WMAP and recently ESA’s Planck satellite, at each step better data could have produced results incompatible with the Big Bang models, but at each step the concordance between theory and observation has got better and more remarkable.
[next time: Inflation and the constancy of physical constants]
Coel Hellier is a Professor of Astrophysics at Keele University in the UK. In addition to teaching physics, astrophysics, and maths he searches for exoplanets. He currently runs the WASP-South transit search, finding planets by looking for small dips in the light of stars caused when a planet transits in front of the star. Earlier in his research career Coel studied binary stars that were exchanging material, leading up to his book about Cataclysmic Variable Stars.
[1] See comments to this Scientia Salon article.
[2] See my blog post for why the “fine-tuning” argument does not lead to any deity.
[3] Thus I am ignoring here the issue of a multi-dimensional multiverse possibly arising out of M-theory.
[4] See Lawrence Krauss’s book A Universe from Nothing for an example of this argument.
[5] When dealing with general relativity and quantum gravity the notions of time and space depend on the observer, so in discussing whether the universe has a finite extent we should specify where we’re observing from. A black hole could look finite from the outside but infinite from the inside.
[6] These being the 1978 prize to Penzias and Wilson for the discovery of the CMB, the 2006 prize to Mather and Smoot for the discovery of structure in the CMB, and the 2011 prize to Perlmutter, Schmidt and Riess for the discovery of “dark energy” through the acceleration of distant supernovae. If the BICEP2 result holds up there is a likelihood of a fourth prize for the idea of inflation.