Quantum Physics | 25 Apr

Is Information Fundamental?

What if the fundamental “stuff” of the universe isn’t matter or energy, but information?

That’s the idea some theorists are pursuing as they search for ever-more elegant and concise descriptions of the laws that govern our universe. Could our universe, in all its richness and diversity, really be just a bunch of bits?

To understand the buzz over information, we have to start at the beginning: What is information?

This is.

So is an image like this:

[Image: an information booth. Credit: Les Taylor/Flickr, under a Creative Commons license.]

And an equation like this:

[Image: an equation]

“It doesn’t matter whether something consists of equations, words, images or sounds—you can encode any of that in strings of zeroes and ones,” or bits, says Scott Aaronson, associate professor of electrical engineering and computer science at MIT. Your computer is doing it right now, using tiny magnets, capacitors, and transistors to store billions or trillions of binary digits. “These might have been hard concepts for people to understand a century ago, but because of the computer revolution, we deal with these concepts all the time,” says Aaronson. In an age when a USB drive dangles from every keychain and an iPhone strains the seams of every pocket, it isn’t such a leap to agree that anything can be expressed in information.

To some theorists, though, information is more than just a description of our universe and the stuff in it: it is the most basic currency of existence, occupying what theorist Paul Davies terms the “ontological basement” of reality.

The rules of quantum information provide the most “compact” description of physics, says Vlatko Vedral, professor of quantum information theory at the University of Oxford and the National University of Singapore. “Information, it seems to me, requires fewer assumptions about anything else we could postulate. As soon as you talk about matter and energy, you have to write down the laws that govern matter and energy.”

Does this mean that our universe is made of information, as some headlines claim?

“It strikes me as a contentless question,” says Aaronson. “To say that matter and energy are important in physics is to say something with content.” You can imagine a universe barren of matter and energy, after all; specifying that our universe is furnished with both tells you something about it and distinguishes it from other possible universes. “But I don’t know how you could even conceive of a universe” without information, he says.

Yet, as a fresh way of thinking about, well, what the universe is about, information has touched off provocative work in computer science and theoretical astrophysics, apparently disparate fields that may share a deep link manifested by that cosmic Rosetta stone, the black hole. But before we dive into the black hole, let’s step back to take a deeper look at information itself.

All messages contain information, but not all messages are created equal. “Unexpected things have high information content,” says Vedral. Take a sunrise, for example. “If the sun rises tomorrow, you won’t see any newspaper writing about it. But of course if it didn’t rise, it would be a major event.”

We sense intuitively that “surprises,” like a missed sunrise, carry more information than routine events. Claude Shannon, widely considered the father of information theory, formalized this intuition by defining a quantity that’s now known as “Shannon entropy.” The Shannon entropy of a message is the average, over every possible symbol, of the negative logarithm of that symbol’s probability: in effect, the expected “surprise” per symbol. That’s a mouthful, but, Vedral explains, it mathematically captures two important features of information: the value of surprises, and the fact that information is “additive”—that is, that the total information contained in two, three, four, or a billion unrelated events is equal to the sum of the information in each one.
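In code, Shannon’s definition is only a few lines. Here’s a minimal sketch in Python (the coin-flip probabilities are just illustrative) showing both features: rare events carry more surprise, and independent events add up.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip is maximally surprising: exactly 1 bit per flip.
fair = shannon_entropy([0.5, 0.5])

# A heavily biased coin (a "sunrise" that almost always happens)
# carries far less information per flip, roughly 0.08 bits.
biased = shannon_entropy([0.99, 0.01])

# Additivity: two independent fair flips (four equally likely
# outcomes) carry 1 + 1 = 2 bits.
two_flips = shannon_entropy([0.25] * 4)
```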

Physicists describe entropy a little differently, often speaking in terms of the “disorder” of a system. More precisely, entropy is the number of different ways you can rearrange the littlest parts of a system and still get the same big system. A bucket full of red Legos, for instance, has high entropy. Shake it up, spin it around, and you still have what you began with: a bucket of red Legos. Assemble those same blocks into a Lego castle, though, and you’ve slashed the entropy; moving a single block nets you a different “macroscopic” system.

Whichever perspective you choose, the essential result is the same. Take the paragraph you’re reading right now, for instance, with its many different letters, punctuation marks, and spaces arranged in a very particular order. It contains more information, and thus has higher entropy, than a paragraph like this one—

aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaa

—even though they contain the same number of characters. Entropy, then, provides a measure of not just the length but the information content of a message.
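You can see the contrast on a computer by estimating per-character entropy from observed character frequencies. This is a crude stand-in for Shannon’s full definition (real English letters aren’t independent), but it makes the point:

```python
import math
from collections import Counter

def empirical_entropy(text):
    """Per-character entropy (bits) from observed character frequencies."""
    counts = Counter(text)
    n = len(text)
    # "+ 0.0" normalizes -0.0 to 0.0 in the single-character case.
    return -sum((c / n) * math.log2(c / n) for c in counts.values()) + 0.0

varied = "Take the paragraph you are reading right now, for instance."
uniform = "a" * len(varied)

h_varied = empirical_entropy(varied)    # roughly 4 bits per character
h_uniform = empirical_entropy(uniform)  # 0 bits: no surprise at all
```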

Now imagine trying to build the ultimate hard drive, one that holds the maximum amount of information allowed by physics. Why should physics place a limit on the information storage capacity of this hypothetical hard drive? Thinking it over from a purely classical perspective, it seems that you could store an infinite amount of information. But when we add quantum mechanics to the mix, we introduce fundamental limits on the accuracy of our measurements. These limits cause entropy to max out at about 10^69 bits per square meter. “If you tried to pack information more densely than that,” says Aaronson, “your hard drive would collapse into a black hole.” That’s not just a whimsical footnote. Black holes, it turns out, are the universe’s very best information repositories. (Of course, Aaronson points out, they don’t make very practical hard drives, unless you’re willing to wait about 10^70 years to retrieve your data.)
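That figure can be checked on the back of an envelope, assuming the holographic bound of roughly one nat of entropy per four Planck areas of surface:

```python
import math

# Back-of-envelope check of the holographic bound: about one nat per
# 4 * (Planck length)^2 of surface area, divided by ln(2) to convert
# nats to bits.
planck_length = 1.616e-35          # meters (CODATA value)
area = 1.0                         # one square meter of "hard drive"

max_bits = area / (4 * planck_length**2 * math.log(2))
print(f"{max_bits:.2e} bits")      # on the order of 10^69 bits
```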

But there’s something odd about the way the entropy of a black hole grows. As physicists Stephen Hawking and Jacob Bekenstein discovered in the 1970s, the entropy of a black hole increases with the black hole’s two-dimensional surface area, as defined by an imaginary spherical shell at the Schwarzschild radius, R_s. This is bizarre; you would expect the amount of information you can pack into any object, like a book or a hard drive, to grow with the three-dimensional volume of the object, not its surface area.
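In symbols, the standard Bekenstein-Hawking result ties a black hole’s entropy to its horizon area A, which grows with the square of the mass, rather than to any enclosed volume:

```latex
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar},
\qquad A = 4\pi R_s^2,
\qquad R_s = \frac{2GM}{c^2}
```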

This discrepancy is more than just theoretical arcana. To physicists, it suggests that the fundamental laws of physics may have a simpler representation in two dimensions than in the traditional three. In 1997, the Argentinian physicist Juan Maldacena, now at the Institute for Advanced Study, took advantage of this idea to work out a mathematical “duality” between a universe like ours and one with one fewer spatial dimension, the same single time dimension, and no gravity. This provides a handy mathematical back door; problems that are difficult to solve in one domain may shake out easily in the other.

To some theorists, the duality isn’t just mathematical. The universe as we experience it, they say, may actually be the projection of information encoded on some distant cosmic boundary. Where this boundary lies and how the projection occurs are still open questions, but these theorists argue that our reality may be, in essence, a hologram analogous to the silvery images on museum store postcards.

We have information theory to thank for this peculiar plot twist. But if it’s hard to imagine a practical application for this “holographic principle,” it’s far easier to see how quantum information is changing computing. That’s because quantum information has different basic properties from classical information. The bits that make up classical information can be either one or zero. But the “qubits” that make up quantum information can exist in a superposition of the “one” and “zero” states; in a sense, they can take on both states at once. To maintain this superposition, though, qubits must exist in perfect isolation. As soon as that isolation is disturbed, the superposition crumbles.
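A minimal sketch of the qubit idea, using Python’s built-in complex arithmetic to stand in for a single qubit’s two amplitudes (the variable names and the measurement recipe here are illustrative, not a real quantum-computing API):

```python
import math
import random

# A qubit's state is a pair of complex amplitudes (alpha, beta) for
# the |0> and |1> outcomes, normalized so |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / math.sqrt(2), 1 / math.sqrt(2)  # equal superposition

# Measurement probabilities come from the squared amplitudes.
p_zero = abs(alpha) ** 2   # 0.5
p_one = abs(beta) ** 2     # 0.5

# Measuring disturbs the state: the superposition collapses to the
# observed outcome, and the original amplitudes are irretrievably gone.
outcome = 0 if random.random() < p_zero else 1
alpha, beta = (1, 0) if outcome == 0 else (0, 1)
```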

“Quantum information is like the information in a dream,” explained Charles Bennett, a quantum information scientist at IBM Research, in a recent talk at the annual meeting of the American Association for the Advancement of Science. “In describing it, you change your memory of it.” This may not sound like a desirable quality in a computer, but in combination with entanglement, it can be exploited to dramatically speed up certain types of calculations and to send perfectly secure encrypted messages. As Steve Girvin, a theoretical physicist at Yale University, points out, it can also be used to generate genuinely random numbers suitable for encryption keys. Quantum cryptography is already being used commercially for some bank transfers and other highly secure transmissions.

“This second quantum revolution—the revolution of information—is a complete surprise,” says Girvin. “It took decades to come to grips with the weirdness and realize that the information of quantum mechanical systems is different than the information content of classical systems, and being uncertain about something can actually be good instead of bad.”

Quantum information is useful stuff—but what is it telling us about the essential nature of our reality? Some thinkers argue that it suggests our entire universe is itself a quantum computer. “I like this image,” says Vedral, while admitting that the analogy is imperfect. “You could ask, can I treat the rest of the universe as something that I can program in the way that I program my [ordinary] computer?” No, says Vedral. “You’re still limited by the laws of physics, and you have a certain amount of resources which are finite. There are computations you will never be able to execute.” The one computation this cosmic quantum computer is uniquely suited to perform is computing its own evolution.

Whether information is a useful strategy of thought or something deeper, we still don’t know. “We’re still struggling with what our theories are really telling us,” says Vedral. “You have to take a leap of imagination.”

Go Deeper
Editor’s picks for further reading

arXiv: Computational capacity of the universe
Treating the entire universe as a computer, MIT “quantum mechanic” Seth Lloyd derives the information storage and operational limits of the universe.

FQXi: It From Bit or Bit From It?
Read prize-winning essays from the Foundational Questions Institute’s 2013 essay contest on the theme of information and its role in reality.

Information and the Nature of Reality
With contributions from physicists, philosophers, theologians, and biologists, this volume of essays looks at the meaning and significance of information from multiple perspectives.



Kate Becker

    In a parallel universe, Kate Becker is senior researcher for NOVA and NOVA scienceNOW and a blogger for Inside NOVA. In this universe, she is your host here at The Nature of Reality, and it is her mission to blow your mind with physics. Kate studied physics at Oberlin College and astronomy at Cornell University. You can also follow her on Twitter and Facebook.
