
Is There a Simple Proof For a Vast Multiverse?
Season 11 Episode 10 | 16m 18s
Is there evidence for the existence of an enormous number of other universes?
In 1987, Steven Weinberg wrote a cute little paper entitled “Anthropic Bound on the Cosmological Constant”. Weinberg was foundational in establishing the standard model of particle physics, but his little 1987 paper, though more obscure, may tell us something about how the multiverse works, and can even be thought of as evidence for the existence of an enormous number of other universes.

In 1987, Steven Weinberg wrote a cute little paper entitled “Anthropic Bound on the Cosmological Constant”.
I say cute and little because this work feels minor in comparison to, say, the electroweak unification theory that won him the Nobel Prize.
Weinberg was foundational in establishing the standard model of particle physics, which represented an enormous leap in our understanding of how this universe works.
But his little 1987 paper, although more obscure, may tell us something about how the multiverse works and could even be thought of as evidence for the existence of an enormous number of other universes.
In our last episode, we asked whether it's bad science to hypothesize a multiverse.
We concluded that it very much depends on where the particular multiverse comes from.
But in general, we concluded that if a theory predicts a multiverse but isn't itself concocted with the intention of producing a multiverse, then we can't necessarily dismiss the theory on aesthetic grounds, for example as a violation of Occam's razor.
But that doesn't help with the testability of the hypothesis.
If we can't ever travel to or receive a signal from another universe, how do we know they exist?
Falsifiability is another hallmark of a good scientific hypothesis.
And so the multiverse is in danger on that front.
But actually, there are ways to test the multiverse hypothesis without ever leaving our universe.
Our universe has several properties that seem particularly well suited or finely tuned for the formation of life.
If the binding energy of helium were a bit lower or a bit higher, then we'd have either only hydrogen or no hydrogen, respectively.
If a very specific nuclear energy level of the carbon-12 nucleus were a bit different, stars wouldn't produce an abundant supply of this life-critical element.
Another example of an apparent fine-tuning that we've talked about recently is the small mass of the Higgs boson, which for reasons we discussed and will review seems oddly tiny.
Related to this is the smallness of the strength of gravity compared to the other forces.
If either the Higgs mass or the strength of gravity were larger, matter would scrunch up into blobs too dense to form interesting structures, or just into black holes.
The separation in scale between the Higgs mass and the Planck scale, and the weakness of gravity compared to the other forces, are both examples of the hierarchy problem.
Another example is the apparently very tiny value of the cosmological constant.
The cosmological constant is a number that appears in the Einstein field equations of general relativity which if positive causes accelerating expansion of the universe.
The expansion of the universe is indeed accelerating.
And so the cosmological constant is indeed positive.
We call this effect dark energy.
And while we don't know what it really is, the most routine explanation is that empty space has a constant energy density.
This is sort of maybe expected due to the Heisenberg uncertainty principle.
Even a perfect vacuum can't be considered to have exactly zero energy, and a positive energy density of the vacuum has the effect of causing accelerating expansion.
We've now seen that our universe's expansion is accelerating, which means we know that something like this vacuum energy exists, and we know how strong it is.
In the modern universe, there's more of this dark energy than all of the other forms of energy combined, by a factor of nearly two and a half.
That sounds like a lot, but it corresponds to a minuscule energy density of empty space.
The only reason dark energy is so dominant is that after nearly 14 billion years of expansion, there's just so much empty space that the dark energy adds up.
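To put a number on how minuscule, here's a quick sketch in Python, assuming round present-day values (a Hubble constant of 70 km/s/Mpc and a dark energy fraction of about 70 percent; these inputs are assumptions for illustration, not figures quoted in the episode):

```python
import math

# Assumed round cosmological inputs (illustrative, not from the episode)
H0 = 70 * 1000 / 3.086e22   # Hubble constant, converted from km/s/Mpc to 1/s
G = 6.674e-11               # Newton's constant, m^3 kg^-1 s^-2
c = 2.998e8                 # speed of light, m/s
omega_lambda = 0.7          # dark energy's fraction of the critical density

# Critical density of a flat universe: rho_crit = 3 H0^2 / (8 pi G)
rho_crit = 3 * H0**2 / (8 * math.pi * G)   # kg/m^3

# Dark energy density, as mass density and as energy density
rho_de = omega_lambda * rho_crit           # kg/m^3
u_de = rho_de * c**2                       # J/m^3

print(f"dark energy density ~ {u_de:.1e} J/m^3")
# ~6e-10 J/m^3: about the rest energy of a few protons per cubic meter
```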
Okay, so our universe seems to have this tiny energy density of the vacuum.
If we go ahead and use admittedly naive methods to calculate what the energy of the vacuum should be based on the expected activity in the known quantum fields, we get a number that's quite a bit larger than we observe: 120 orders of magnitude larger.
While this has been called the worst prediction in physics, we don't really presume that we can predict the energy of the vacuum using current theories.
The quantum field theory used to do this calculation is incomplete.
It breaks down at very high energies.
Now, it's generally assumed that other effects at play at higher energies than we've tested work to suppress the vacuum energy, but we don't know how.
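For a sense of where the 120 comes from: the crudest version of the calculation just cuts the quantum fields off at the Planck scale, assigning one Planck energy per Planck volume. A hedged sketch (the exact order-of-magnitude count depends on the cutoff convention):

```python
import math

hbar = 1.055e-34   # reduced Planck constant, J*s
G = 6.674e-11      # Newton's constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s

# Planck length and Planck energy set the cutoff scale
l_p = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m
E_p = math.sqrt(hbar * c**5 / G)   # ~2e9 J

# Naive vacuum energy density: one Planck energy per Planck volume
u_naive = E_p / l_p**3             # J/m^3

u_observed = 6e-10                 # J/m^3, the measured dark energy density

print(f"discrepancy: 10^{math.log10(u_naive / u_observed):.0f}")
# Lands within a few orders of the famous ~120, depending on conventions
```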
Perhaps you recall the recent episode where we talked about how a naive attempt to calculate the mass of the Higgs particle using quantum field theory gives a similar problem: a very large or even an infinite Higgs mass.
It's really the same issue and perhaps it has the same solution.
One way to get both the Higgs mass and the cosmological constant down to the tiny values that we observe in this universe is for the various effects of the quantum fluctuations to cancel each other out.
We know this happens with other particles like the electron in which the symmetry connected to the existence of antimatter sort of eliminates higher energy terms in the electron's mass.
The hypothetical particles of supersymmetry were meant to do the same for the Higgs, but we're having a hard time finding them.
Absent such a perfect or near-perfect cancellation due to an underlying symmetry, it seems that cancellation has to be kind of random.
But to randomly cancel a bunch of uncorrelated effects down to one part in 10^120, in the case of dark energy, really strains belief.
Getting that result by chance has the same probability as flipping heads on a fair coin 400 times in a row.
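That coin-flip figure is easy to verify: the number of consecutive heads with the same probability as a 1-in-10^120 event is log2(10^120).

```python
import math

# One part in 10^120, expressed as consecutive heads on a fair coin
flips = 120 * math.log2(10)
print(f"equivalent coin flips: {flips:.0f}")   # ~399, roughly 400 in a row
```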
So if there was only ever one big bang and the cosmological constant could have landed anywhere in these 120 orders of magnitude, it feels lucky that it landed where it did.
Very lucky in fact because if the cosmological constant was much higher, the universe would have inflated so quickly that no stars or galaxies could ever have formed.
And if the Higgs mass had been much larger, all matter would have collapsed under its own weight.
Either way, we wouldn't have scientists to wonder why these numbers have the values that they do.
We can get around this using the anthropic principle by noting that if you have enough big bangs, eventually you'll get one with the parameters you need to form scientists.
So, at least 10^120 of them, if that's really the possible range for the cosmological constant.
Many scientists hate this use of the anthropic principle in a vast multiverse because it seems like an explanatory dead end, even if it's right.
They'd rather be able to calculate the value of the cosmological constant from first principles.
But there's no guarantee that this is even possible.
And actually, it turns out that we can at least constrain its possible value by using anthropic arguments.
So back to Steven Weinberg's paper. “Anthropic Bound on the Cosmological Constant” is aptly named.
In it, Weinberg uses anthropic arguments to calculate the value of the cosmological constant, or at least the value it should have if its value in our universe results from a selection effect in a multiverse of many different cosmological constants.
The argument goes like this.
Assume there are many universes with different cosmological constants and only a small fraction of those have values amenable for the development of life.
We are of course in one of those.
This is the standard argument of the anthropic principle.
But Weinberg goes much further.
He uses this argument to actually estimate what that constant should be for this universe.
And the key to doing this is the use of the principle of mediocrity.
It says that if an object or phenomenon is randomly selected from a collection of various categories, it is statistically more likely to originate from the most numerous categories than from the less common ones.
So we can divide up possible universes into categories based on the size of their cosmological constant.
Many multiverse-generating theories predict a fairly even distribution for this value, what we call a flat prior, meaning there are far more categories with cosmological constants much, much larger than the tiny one in our universe.
The mediocrity principle tells us that if we choose a random universe out of the entire multiverse, it should have a non-special value for that cosmological constant.
So it should have a large one.
But if we choose a universe that we know has observers in it, we add an extra constraint.
We're now selecting from a subset of the multiverse with a much smaller cosmological constant.
But beyond the anthropic constraint, the cosmological constant in this subset should be as non-special as possible within the constrained range.
So says the mediocrity principle.
In other words, it should not be a lot smaller than it needs to be to allow observers.
If the cosmological constant is much smaller than is strictly needed, that might be taken as evidence against the anthropic explanation.
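A toy simulation makes that logic concrete. With a flat prior, conditioning on the anthropic bound just means drawing uniformly below the bound, so a typical habitable universe should sit at a sizable fraction of it; values hundreds of times smaller are rare. All numbers here are illustrative, not real cosmology:

```python
import random

# Toy model: flat prior on the cosmological constant, conditioned on the
# anthropic bound. The conditional distribution is uniform on [0, bound].
random.seed(0)
LAMBDA_LIFE = 1.0   # anthropic bound, arbitrary units

draws = [random.uniform(0, LAMBDA_LIFE) for _ in range(100_000)]

median = sorted(draws)[len(draws) // 2]
far_below = sum(d < LAMBDA_LIFE / 200 for d in draws) / len(draws)

print(f"typical habitable lambda: {median:.2f} x the bound")
print(f"fraction below bound/200: {far_below:.2%}")
# The median lands near half the bound; landing 200x below it
# (as our universe seems to) happens only ~0.5% of the time.
```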
Okay.
The next step then is to calculate the maximum strength dark energy could have before life becomes impossible.
The principle of mediocrity says that dark energy shouldn't be too much smaller than that.
Weinberg did this by figuring out the maximum dark energy strength that would still allow life-supporting structures to form.
We know that stars and galaxies formed when overdense regions of gas and dark matter that filled the early universe collapsed under their own gravity.
But this gravitational inward pull had competition.
The expansion of the universe meant that over time matter was thrown further and further apart and beyond a certain distance, gravity becomes too weak to drag that material back together.
It was this balance between in-pulling gravity and out-pushing expansion that determined the largest sizes of structures that could form.
In the case of our universe, that's galaxy clusters.
In our universe, dark energy started to dominate relatively late, kicking off the accelerating expansion 6 or 7 billion years ago, but importantly, after the universe had already built galaxies and clusters.
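That timing falls out of the Friedmann equation. In a flat universe with round parameters (matter fraction 0.3, dark energy fraction 0.7, Hubble constant 70 km/s/Mpc; assumed values, not episode figures), acceleration begins once the dark energy density exceeds half the matter density. A sketch:

```python
import math
from scipy.integrate import quad

# Assumed round flat-LCDM parameters (illustrative, not from the episode)
omega_m, omega_l = 0.3, 0.7
H0 = 70 * 1000 / 3.086e22            # Hubble constant in 1/s
GYR = 3.156e16                       # seconds per billion years

# Acceleration starts when rho_lambda = rho_matter / 2, i.e. when
# omega_l = omega_m * (1 + z)^3 / 2
z_acc = (2 * omega_l / omega_m) ** (1 / 3) - 1

def H(z):
    """Hubble rate at redshift z for a flat matter + lambda universe."""
    return H0 * math.sqrt(omega_m * (1 + z) ** 3 + omega_l)

# Lookback time to z_acc: integrate dz / ((1 + z) H(z))
t_lookback, _ = quad(lambda z: 1 / ((1 + z) * H(z)), 0, z_acc)

print(f"acceleration began at z ~ {z_acc:.2f}")                 # ~0.67
print(f"lookback time ~ {t_lookback / GYR:.1f} billion years")  # ~6
```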
But if you dial up the strength of dark energy, then it starts to dominate the universe earlier.
And if you dial it up too much, then the accelerating expansion starts early enough to mess with galaxy formation.
You can calculate a pretty straightforward limit to how strong dark energy can be to even get proper galaxies.
So let's add one more factor.
Clearly, we need stars to live, but we also need galaxies.
In order to have planets, multiple generations of stars need to have existed prior to our own because those earlier stars created the elements needed to build planets.
So we need star-making factories, which means galaxies, or at least small ones; for example, a globular cluster with a million stars might be enough.
Weinberg calculated that the maximum amount of dark energy for which such systems could form is around 500 times the energy in matter, extrapolated to the modern era.
Now, the true value is that dark energy is about 2.3 times the energy in matter, and 2.3 is quite a bit smaller than 500.
So from this calculation our universe still has an unusually small cosmological constant compared to the set of all universes that could have produced life or at least galaxies.
A couple of things here.
First is that this brings the chance of our cosmological constant being so small down from 1 in 10^120 to 1 in 200, which is a huge improvement.
By adding the anthropic selection, it brings the random chance hypothesis into the realm of possibility.
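The arithmetic behind that 1 in 200 is just the ratio of the two numbers, assuming a flat prior capped at Weinberg's bound:

```python
# Flat prior capped at Weinberg's bound (~500x the matter energy density);
# our universe sits at ~2.3x, so the chance of drawing a value that small:
bound, observed = 500, 2.3
print(f"about 1 in {bound / observed:.0f}")   # ~1 in 217, roughly 1 in 200
```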
Still, the anthropic argument doesn't fully solve the problem, and it's not exactly a parsimonious solution.
The mismatch gets worse if we update some of the observations that went into Weinberg's calculation.
In 1987, he used our most distant observations of massive objects to determine the maximum time allowed to form massive structures.
Those were quasars that existed when the universe was around 10% of its current age.
But we now know that there were massive structures much earlier.
For example, the James Webb Space Telescope has found galaxies at around 2% of the universe's current age.
Those structures formed in an era when dark energy was truly negligible, indicating that dark energy could be quite a bit larger than it is in this universe while still allowing at least some galaxies to form.
But there's another refinement to Weinberg's calculation that can make sense of this.
The way Weinberg used the mediocrity principle was to say that we should be in the most typical type of universe that can form galaxies.
But a more powerful approach might be to apply the mediocrity principle to the class of observer rather than the class of universe.
This idea is sometimes called the self-sampling assumption, which is that, all other things being equal, an observer should reason as if they are randomly selected from the set of all actually existent observers, past, present and future, in their reference class.
For example, we might expect that we, humanity, are a fairly typical example of all the species across the multiverse that can produce scientists capable of anthropic reasoning.
Some universes may produce many more such species than others.
And so we might expect to find ourselves in one of those.
We could also argue that this would be in a universe that has more and bigger galaxies.
And more and bigger galaxies will form when dark energy is weaker.
In fact, if scientist-producing civilizations are rare enough, you might need a universe with lots of bigger galaxies to even have a chance of producing some.
All of this pushes the anthropic preference to smaller values of the cosmological constant.
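To see how observer-weighting shifts the expectation, here's a toy extension of the earlier flat-prior sketch. The observer-count function below is invented purely for illustration; serious versions of this calculation weight universes by the fraction of matter that collapses into galaxies:

```python
import random

# Toy self-sampling: pick a random observer, not a random universe, by
# weighting each universe with a hypothetical observer count that falls
# as lambda grows (stronger dark energy -> fewer, smaller galaxies).
random.seed(0)
LAMBDA_LIFE = 1.0   # anthropic bound, arbitrary units

def observers(lam):
    """Invented weighting: observer count shrinks as lambda nears the bound."""
    return (1 - lam / LAMBDA_LIFE) ** 3

lams = sorted(random.uniform(0, LAMBDA_LIFE) for _ in range(100_000))
weights = [observers(l) for l in lams]

# Weighted median: the lambda a randomly chosen observer would see
total, acc = sum(weights), 0.0
for lam, w in zip(lams, weights):
    acc += w
    if acc >= total / 2:
        print(f"observer-weighted median lambda: {lam:.2f} x the bound")  # ~0.16
        break
# Compare ~0.50 without weighting: sampling observers rather than
# universes pulls the expected lambda toward smaller values.
```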
To really know whether anthropic selection is the reason behind our small cosmological constant or indeed any of the other apparently finely tuned parameters of our universe, we need a much better understanding of all the contingencies behind the development of life.
But it's still remarkable that Weinberg was able to guesstimate the strength of vacuum energy to within a relatively small range based only on anthropic arguments.
And this is the real kicker.
When Weinberg made this calculation, we didn't know that dark energy even existed.
The accelerating expansion was discovered in the late '90s, a decade after Weinberg's paper.
Back then, most people thought that the cosmological constant was almost certainly zero, indicating a perfect cancellation by unknown symmetries.
The fact that Weinberg even came within the ballpark of the right number, before that number was known, is seen by some as a powerful argument for anthropic selection.
And let me be clear, the anthropic argument only works if there's a very large multiverse.
Which is why some might take this as a point in favor of the multiverse hypothesis.
Because if all of these universes are possible, then we need a lot of them to explain our not-too-rapidly expanding spacetime.
