Living Bits: Information and the Origin of Life

What is life?

When Erwin Schrödinger posed this question in 1944, in a book of the same name, he was 57 years old. He had won the Nobel Prize in Physics eleven years earlier and was arguably past his glory days. Indeed, at that time he was working mostly on his ill-fated “Unitary Field Theory.” By all accounts, the publication of “What is Life?”—venturing far outside a theoretical physicist’s field of expertise—raised many eyebrows. How presumptuous for a physicist to take on one of the deepest questions in biology! But Schrödinger argued that science should not be compartmentalized:

“Some of us should venture to embark on a synthesis of facts and theories, albeit with second-hand and incomplete knowledge of some of them—and at the risk of making fools of ourselves.”

Schrödinger’s “What is Life?” has been extraordinarily influential, in part because he was one of the first who dared to ask the question seriously, and in part because it was the book read by a good number of physicists—famously both Francis Crick and James Watson, independently, but also many a member of the “Phage group,” the scientists who started the field of bacterial genetics—and steered them toward new careers in biology. The book is perhaps less famous for the answers Schrödinger suggested, as almost all of them have turned out to be wrong.

[Image: a flower. Credit: Flickr user Tau Zero, adapted under a Creative Commons license.]

In the 70 years since the book appeared, what have we learned about this question? Perhaps the greatest leap forward was provided by Watson and Crick, who by discovering the structure of DNA ushered in the age of information in biology. Indeed, a glib contemporary answer to Schrödinger’s question is simply: “Life is information that can copy itself.” But this statement offers little insight without a more profound analysis of the concept of information in the context of life. So instead of forging ahead, let’s take a step back and first ask: What is information?

The meaning of information

Information is a buzzword used in the press and in everyday conversation all the time, but it also has a very precise meaning in science. The theory of information was crafted by another great scientist, the mathematician and engineer Claude Shannon. Without going into the mathematical details, we can say that information is that which allows its holder to make predictions with accuracy better than chance.

There are three important concepts in this definition. First: prediction. The colloquial use of “information” suggests “knowing.” But more precisely, information implies the ability to use that knowledge to predict something. The second important aspect is the focus on “something other”: information is always about something other than the system that holds it. The third and last part concerns the accuracy of prediction. I can easily make predictions about another system (say, the stock market), but if those predictions are only as good as random guessing, then I did not make them using information.
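
Shannon’s formalism makes this quantitative. Here is a minimal sketch in Python (the forecast/weather toy data and the function names are my own illustration, not anything from Shannon or from this article): the information one variable carries about another can be measured as the mutual information I(X;Y) = H(X) + H(Y) − H(X,Y), which is zero exactly when predictions are no better than chance.

    from collections import Counter
    from math import log2

    def entropy(symbols):
        """Shannon entropy H = -sum(p * log2(p)) of a list of outcomes."""
        n = len(symbols)
        return -sum((c / n) * log2(c / n) for c in Counter(symbols).values())

    # Toy joint observations of (forecast, actual weather); purely illustrative.
    observations = [
        ("rain", "rain"), ("rain", "rain"), ("rain", "sun"), ("sun", "sun"),
        ("sun", "sun"), ("sun", "rain"), ("sun", "sun"), ("rain", "rain"),
    ]
    forecasts = [f for f, _ in observations]
    weather = [w for _, w in observations]

    # Mutual information I(F;W) = H(F) + H(W) - H(F,W): how many bits the
    # forecast tells you about the weather. Zero would mean "no better than chance."
    mi = entropy(forecasts) + entropy(weather) - entropy(observations)
    print(f"I(forecast; weather) = {mi:.3f} bits")

With this toy data the forecast carries about 0.19 bits about the weather; a forecaster who guessed at random would score zero.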

One thing the stock market example immediately suggests is that information is valuable. It is also valuable for survival: knowledge that lets you predict the trajectory of a predator, so that you can escape it, is extremely valuable. Indeed, it is possible to think of the entirety of the information stored in our genes in terms of the predictions it makes about the world in which we find ourselves: how to make a body that uses the information so that it can be replicated, how to acquire the energy to keep the body going, and how to survive in the world up until replication is accomplished. And while it is gratifying to know that our existence can succinctly be described as “information that can replicate itself,” the immediate follow-up question is, “Where did this information come from?”

The hardest question in science

Through decades of work by legions of scientists, we now know that the process of Darwinian evolution tends to increase the information coded in genes. That this must happen on average is not difficult to see. Imagine I start out with a genome encoding n bits of information. In an evolutionary process, mutations occur on the many representatives of this information in a population. A mutation can change the amount of information or leave it unchanged, and if the information changes, it can increase or decrease. But very different fates befall these two kinds of change. A mutation that decreases information will generally lead to lower fitness, because the information stored in our genes is used to build the organism and survive. If you know less than your competitors about how to do this, you are unlikely to thrive as well as they do. If, on the other hand, you mutate towards more information—meaning better prediction—you are likely to use that information to gain an edge in survival. So, in the long run, more information is preferred to less, and the amount of information in our genes will tend to increase over time.
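
This selection argument is easy to watch in a toy simulation. The sketch below is my own illustration, not a model from the article: each genome is a string of bits, its “information” is simply the number of bits that correctly predict a fixed environment string, and fitness-proportional selection with mutation drives the population’s average information upward.

    import random

    random.seed(1)
    N_BITS, POP, GENS, MU = 50, 100, 200, 0.01
    ENV = [random.randint(0, 1) for _ in range(N_BITS)]  # the "world" to be predicted

    def info(genome):
        """Toy stand-in for information: bits that correctly 'predict' the environment."""
        return sum(g == e for g, e in zip(genome, ENV))

    population = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]

    for _ in range(GENS):
        # Fitness-proportional selection: more informed genomes leave more offspring.
        weights = [info(g) + 1 for g in population]
        parents = random.choices(population, weights=weights, k=POP)
        # Mutations can create or destroy information; selection keeps the gains.
        population = [[b ^ (random.random() < MU) for b in p] for p in parents]

    print(f"mean information: {sum(map(info, population)) / POP:.1f} of {N_BITS} bits")

Starting from about 25 correct bits (pure chance), the population’s average climbs steadily: mutations that lose information are weeded out, while those that gain it spread.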

However, this insight does not tell us where the first self-replicating piece of information came from. Did it arise spontaneously? Now we find ourselves faced with the question that some have called “The hardest question in science.”

At first glance it might appear that this question cannot possibly be answered, unless the class of molecules that gave rise to the first information replicator has left some traces in today’s biochemistry. Different scientists have different opinions about what these molecules might have been. But there are some things we can say about the probability of spontaneous emergence without knowing anything about the chemistry involved, using the tools of information theory. Indeed, information does not change whether it is encoded in bits, encoded in nucleotides, or scratched on a rock: Information is substrate-independent.

But information is also, mathematically speaking, extremely rare. The probability of finding a sequence encoding a sizable chunk of information by chance is so small that for practical purposes it is zero. For example, the probability that the information (not the exact sequence) of the HIV virus’s protease (a molecule that cuts proteins to size and is crucial for the virus’s self-replication) would arise by chance is less than 1 in 10^96. There just aren’t enough particles in the universe (about 10^80), and not enough time since the Big Bang, to try out all these different sequences. Of course, the information in the protease did not have to emerge by chance; it evolved. But before evolution, we have to rely on chance or assume that the information “fell from the sky” (an alternative hypothesis, known as panspermia, which assumes that life first occurred somewhere else and hitched a ride to Earth on a meteorite).
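
Where does a number like 1 in 10^96 come from? If a molecule’s functional information amounts to K bits, the chance of blindly guessing an equivalent sequence is roughly 2^-K. A back-of-envelope sketch (the ~320-bit figure for the protease is my own assumption, chosen because it reproduces the order of magnitude above; it is not a number from this article):

    from math import log10

    def decimal_exponent(bits):
        """Express a chance of 2^-bits as a power of ten."""
        return bits * log10(2)

    # ~320 bits: assumed information content of the HIV protease (illustrative only).
    # 84 bits: the RNA-enzyme replicator discussed below; 2^-84 is about 1 in 10^25,
    # within an order of magnitude of the figure quoted in the text.
    for bits in (320, 84):
        print(f"{bits} bits -> about 1 in 10^{decimal_exponent(bits):.0f}")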

It turns out that scientists have been able to construct self-replicating molecules (based on RNA enzymes) that encode just 84 bits of information, but even such a seemingly small piece of information is still extremely unlikely to emerge by chance (about one chance in 10^24). Fortunately, information theory can tell us that there are some circumstances (particular environments) that can very substantially increase these probabilities, so a spontaneous emergence of life on this planet is by no means ruled out by these arguments.

Unfortunately, while we can estimate the probability of spontaneous emergence in any given environment, we know very little about the specifics of such environments on the ancient Earth. So while we can be more confident that spontaneous emergence is a possibility, the likelihood that the early Earth harbored just such environments is impossible to ascertain.

The chances that life emerged beyond Earth are at least as good as the chances that it emerged here. Indeed, many meteorites that made it to Earth’s surface carry organic molecules with them, and information-theoretic considerations suggest that the environments in which they arose are precisely those that are conducive to life.

Even though so many uncertainties about life and information remain, the information-theoretical analysis convincingly highlights the extraordinary power of life: While information is both enormously valuable and exceptionally rare, the simple act of copying (possibly with small modifications) can create information seemingly for free. So, from an information perspective, only the first step in life is difficult. The rest is just a matter of time.

Go Deeper
Author’s picks for further reading

arXiv: Information-theoretic considerations concerning the origins of life
In this pre-print, Chris Adami provides a more technical look at information and life.

Nature Reviews Genetics: Digital genetics: Unravelling the genetic basis of evolution
In this 2006 paper, Chris Adami reviews the emerging science of digital genetics. (Requires subscription.)

PLoS Biology: Bit by bit: the Darwinian basis of life
Gerald Joyce, who studies the origin of life and the role of RNA in Earth’s earliest life, discusses how information theory can be applied to astrobiology and “alternative life.”

Journal of Molecular Evolution: Monomer abundance distribution patterns as a universal biosignature: examples from terrestrial and digital life
In this paper, Chris Adami and his colleagues show how an “evolving digital life system” produces an analog for the chemical signatures of life. (Requires subscription.)

Editor’s picks for further reading

The Nature of Reality: Is Information Fundamental?
Discover why some theorists think that the fundamental “stuff” of the universe isn’t matter or energy, but information.

FQXi: It From Bit or Bit From It?
Read prize-winning essays from the Foundational Questions Institute’s 2013 essay contest on the theme of information and its role in reality.


Chris Adami

    Chris is a computational biologist with a focus on theoretical, experimental and computational Darwinian evolution, studying how biological systems evolve from the simplest molecules to the most complex structures such as the human brain. A professor of microbiology and molecular genetics as well as physics and astronomy at Michigan State University, he uses mathematics and computation to understand how simple rules can give rise to the most complex systems and behaviors. Chris blogs at Spherical Harmonics and tweets at @ChristophAdami about trying to understand how the universe works, including people and animals. And plants and microbes. So, pretty much everything.

    • MarquisDeMoo

      Thank you for so succinctly encapsulating my long-held understanding of these issues. As a retired communications engineer I was familiar with Shannon, so it is nice to see it decompartmentalised in this way.

    • Frank DiMeglio

      This is as important as important gets in the discussion of the fundamentals of life, being, physics, and experience. This is what is sorely missing in this discussion.

      Dream experience is REAL/TRUE quantum gravity. Dreams involve fundamentally and ultimately equivalent and balanced gravity, inertia, and electromagnetism. Here is the clear and definitive proof:

      Dreams balance being AND experience. In dreams, we are conscious and alive in conjunction with the fundamental experience of our growth and becoming other than we are. Dreams involve visible AND invisible space in FUNDAMENTAL equilibrium and balance. Dreams involve physics/physical experience.

      Fundamentally and ultimately, the physics of dream experience is better understood than the physics of waking experience. Here’s why: Dreams make thought MORE LIKE sensory experience in general, thereby improving upon memory and understanding. Moreover, the ability of thought to DESCRIBE or RECONFIGURE sensory experience is ULTIMATELY dependent upon the extent to which thought is similar to sensory experience.

      Dreams are a fundamental and important experience. There is a physics of dreams.

      by Frank Martin DiMeglio

    • wally63

      No matter where LIFE started, it was CREATED by Yahweh, the one and only true living God. Make all the fancy conclusions and calculations you like, but Yahweh the Creator is the bottom line.

      • Arcanek

        The story of creation precedes Yahweh by a considerable amount of time. Look into Maurice Bigglio’s version. He was tasked by the Vatican to translate the old biblical texts. Yahweh was not the creator. More like a demiurge.

        • wally63

          I would say the name may have changed, but God the Creator Himself is WAY older than any Earthly account. Names take on changes such as Christ was named Yahshua but called Iesous by the Greeks which became Jesus hundreds of years later. Whatever name might be given, there is only one way we have such a rich, wonderful, huge, universe, and that’s by deliberate Creation, not by chance. That’s my belief and I was ONCE a staunch Evolutionist! Haha.

    • Raf

      I’m a little confused… You reference Shannon, but I’m not sure that you are using his definition of information. For Shannon, information doesn’t have anything to do with usefulness.

      To be explicit: do you mean that information is the negative of the logarithm of a sum of probabilities, or the logarithm of a sum of probabilities? Or something else?