
Entropy and Life

Entropy is a measure of how many configurations could yield the same macrostate, and thus how probable the macrostate is. It can be a measure of information, or a measure of disorder in a physical system. But what about the entropy of biological systems?

Relatively few configurations yield life, compared to the many that don’t. Life is highly ordered, so living organisms should have much lower entropy than their non-living constituents. In fact, using energy to create and maintain order is one of the key signatures of life! One implication of life having low entropy is that life is improbable, which so far seems to be true based on the limited observations we have of other planets. But another implication is that when living things act to reduce entropy locally, in the organism, there must be a corresponding increase in entropy somewhere else to offset that reduction. This is required by the second law of thermodynamics, which says that by far the most statistically probable outcome is an entropy increase. But the second law applies to ‘isolated systems’: systems that cannot exchange heat, energy, or matter with their surroundings. An organism that can interact with its surroundings can expel entropy via heat, gaining local order and reducing local entropy. Global disorder still increases, but for that organism, the ability to locally reduce entropy is literally a matter of life and death.
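To make that bookkeeping concrete, here is a tiny Python sketch with made-up numbers; the entropy drop, heat output, and temperature are purely illustrative, not measurements of any real organism.

```python
# Entropy bookkeeping for an organism that exports heat.
# Every number here is invented purely for illustration.

delta_S_organism = -100.0   # J/K: local entropy drop from building order
heat_expelled = 50_000.0    # J: heat dumped into the surroundings
T_surroundings = 300.0      # K: temperature of the environment

# Entropy gained by the surroundings from absorbing that heat (Q / T).
delta_S_surroundings = heat_expelled / T_surroundings

net = delta_S_organism + delta_S_surroundings
print(f"Surroundings gain {delta_S_surroundings:.1f} J/K")
print(f"Net change: {net:+.1f} J/K (positive, so the second law is satisfied)")
```

The organism gets more ordered, the surroundings get more disordered, and the total entropy still goes up.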

An obvious example of this principle is humans. Our bodies are very highly ordered compared to inanimate things like air and water. Even compared to dirt, which hosts a whole ecosystem of microbiota and larger organisms like worms, a human represents many times more order. But this doesn’t contradict the second law, because the way we maintain life is to take in food and expel heat and very disordered waste products. We extract the chemical energy in that food and use it to maintain or decrease local entropy, and thus stay alive. Other animals obviously do this too, though they may eat different things than we do or digest them in different ways. And the plants that we and other animals eat have done something similar, except that instead of getting energy by metabolizing food, they capture it directly from the sun’s light. Plants maintain their low entropy by releasing heat and high-entropy waste products, and anything that eats plants (or eats something that eats plants) is converting solar energy into local order as well as expelled heat.

If you’re really feeling clever, you might ask: what about the planet Earth? If we’re receiving all this sunlight, and making life from it, shouldn’t there be a corresponding buildup of entropy on the planet, in the form of waste heat or some other disordering? Isn’t the Earth an isolated system, alone in space, whose order is constantly increasing?

But an isolated system is one that exchanges no heat or energy with its surroundings, and the Earth doesn’t qualify, because it is obviously exchanging energy with its surroundings! The Earth receives a massive number of photons from the sun, which is where plants, and by extension the rest of us, get the energy to create order. But the Earth is also radiating energy and heat into space, as all warm objects do. The incident energy from the sun arrives as directional, high-energy (mostly visible) photons and is highly ordered, while the energy the Earth radiates leaves in all directions as low-energy (infrared) photons and is very disordered. That’s where the excess entropy is going!
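A rough back-of-the-envelope estimate shows just how lopsided this exchange is. The sketch below uses standard approximate values for the solar constant, albedo, and effective temperatures, and a crude entropy-flux estimate of power divided by temperature (it ignores the finer points of radiation thermodynamics), so treat the output as illustrative.

```python
import math

# Rough estimate of entropy carried in by sunlight vs. carried out by
# Earth's thermal radiation. Approximate textbook values; illustrative only.
solar_constant = 1361.0   # W/m^2 at Earth's distance from the sun
albedo = 0.30             # fraction of sunlight reflected without being absorbed
R_earth = 6.371e6         # m
T_sun = 5778.0            # K, effective temperature of incoming sunlight
T_earth = 255.0           # K, Earth's effective radiating temperature

# Power absorbed over Earth's cross-section (and, in steady state, re-radiated).
power = solar_constant * (1 - albedo) * math.pi * R_earth**2  # watts

# Crude entropy flux: (energy per second) / (temperature of the radiation).
entropy_in = power / T_sun     # W/K carried by incoming sunlight
entropy_out = power / T_earth  # W/K carried by outgoing infrared

print(f"Entropy in:  {entropy_in:.2e} W/K")
print(f"Entropy out: {entropy_out:.2e} W/K (~{entropy_out / entropy_in:.0f}x more)")
```

The same amount of energy leaves as arrives, but because it leaves at a much lower temperature, it carries away far more entropy than it brought in.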

Thus, life on our dear planet is not a violation of the second law of thermodynamics at all, because living organisms and even huge ecosystems are not isolated systems. What’s more, the creation of order from chaos actually requires a net increase in entropy: it requires a reconfiguration of atoms and microstates, and the most likely outcome of any reconfiguration is an increase in entropy. Many of the chemical reactions necessary for life are entropically driven: the products have many more available states than the reactants, so the reaction is statistically favored to occur. Organisms that do work to create order must also create entropy, and the organisms most likely to survive are often those with the most clever control of entropy generation. So the proliferation of life is not threatened by entropy, as in the popular conception, but actually depends on entropy generation!
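For a concrete (and deliberately oversimplified) picture of an entropically driven reaction, here is a quick Gibbs free energy check; the enthalpy and entropy values are invented for illustration and don’t describe any specific reaction.

```python
# Gibbs free energy: delta_G = delta_H - T * delta_S.
# A reaction with delta_G < 0 is thermodynamically favored, even one that
# absorbs heat (delta_H > 0), provided the entropy gain T * delta_S is large.
# The numbers below are purely illustrative.

T = 310.0            # K, roughly body temperature
delta_H = 20_000.0   # J/mol: the reaction absorbs heat
delta_S = 100.0      # J/(mol K): the products have many more available states

delta_G = delta_H - T * delta_S
print(f"delta_G = {delta_G:.0f} J/mol ->", "favored" if delta_G < 0 else "not favored")
```

Even though this hypothetical reaction costs energy, the entropy gain makes it statistically favored to happen.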


What is Entropy?

Last time we talked about information-entropy, which is a way of quantifying the information in a given string of symbols. Each symbol has a certain uncertainty associated with it, which may be low if the symbol can be one of two things, or high if the symbol can be one of a hundred things. Summing over all the possible outcomes, weighted by their probabilities, gives the entropy, which is larger for more uncertain situations and smaller for more certain ones. So the more you stand to learn from the string, the higher its uncertainty and entropy, and the larger its information content.
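In symbols, that weighted sum is Shannon’s entropy, where each outcome contributes its probability times its “surprise.” A minimal sketch of the definition:

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits: each outcome contributes p * log2(1/p), its
    probability times its 'surprise'. Zero-probability outcomes are skipped."""
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

# A symbol with two equally likely values carries one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```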

But let’s get less abstract and think about examples. Say that we flip a coin and see whether it lands with the head or tail side up. An evenly weighted coin has an equal probability of being heads or tails. But perhaps an unevenly weighted coin is more likely to land heads, or maybe gnomes have swapped in a coin that’s heads on both sides. The heads-only coin toss has zero entropy, because there is only one possible result and no information is gained by examining it. The weighted coin has higher entropy, because two results are possible, even if one is more likely than the other. And the evenly weighted coin not only has more possible results than the heads-only coin, those results are also maximally unpredictable, so its entropy is the highest of the three. If we do a series of coin tosses, we get a string of coin-toss results, and a longer string will have more entropy than a shorter one, except in the case where the coin is heads-only. Now if we replace the binary coin toss with choosing a letter from the alphabet, which has 26 possibilities instead of two, we have significantly increased the information-entropy!
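Plugging the three coins and the 26-letter alphabet into that definition gives numbers like these; the 70/30 split for the weighted coin is just an example choice.

```python
import math

def H(probs):
    """Shannon entropy in bits: sum of p * log2(1/p) over possible outcomes."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(f"heads-only coin: {H([1.0]):.2f} bits")          # 0.00 -- nothing to learn
print(f"weighted coin:   {H([0.7, 0.3]):.2f} bits")     # ~0.88 -- somewhat predictable
print(f"fair coin:       {H([0.5, 0.5]):.2f} bits")     # 1.00 -- maximally uncertain
print(f"random letter:   {H([1 / 26] * 26):.2f} bits")  # ~4.70 -- 26 equal choices
```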

Of course, sometimes it is useful to impose some rules on a string of symbols, for example the rules associated with a specific language. Doing so will reduce the uncertainty, and thus the entropy and information content, of the string. This is another way of saying that a string of letters that spells out words in English has a lower entropy than a string of random letters, because in English you know that not all the letters are equally probable, that one letter affects the probability of the letters following it, and other things like that. It’s the equivalent of weighting the coin! In fact, the trick of data compression is to reduce the number of symbols used in a string without reducing the entropy (and thus the information content) of the message. Data compression is not possible when each symbol in a message is maximally surprising, which explains the difficulty of compressing things like white noise.
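One quick way to see this is to hand both kinds of data to an off-the-shelf compressor. Here zlib stands in for “a general-purpose compressor,” and the repeated sentence stands in for patterned, English-like text; random bytes play the role of white noise.

```python
import os
import zlib

# Highly patterned (low-entropy) bytes vs. random (high-entropy) bytes.
patterned = b"the quick brown fox jumps over the lazy dog " * 250
random_bytes = os.urandom(len(patterned))

print("patterned:", len(patterned), "->", len(zlib.compress(patterned)), "bytes")
print("random:   ", len(random_bytes), "->", len(zlib.compress(random_bytes)), "bytes")
# The patterned text shrinks dramatically; the random bytes barely shrink
# (they may even grow slightly), because every byte is maximally surprising.
```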

Now, what if instead of a sequence of coin tosses or a string of letters, you had a collection of atoms that could be in different states? Consider a box filled with a gas, where each atom of the gas can be described by its position in the box and its momentum. The entropy of any given configuration of the gas would then come from counting up the possible states of all the atoms, the same way the entropy of a string came from summing over its possible symbols (weighted by probability). Entropy is still a measure of uncertainty, but in this physical example the question is how many arrangements of atoms in specific states can make a configuration that has the same measurable properties, such as pressure, temperature, and volume. For example, if the gas is evenly distributed throughout the box, we can make a wide variety of changes to the individual atom positions and velocities without changing the measurable properties of the gas. Thus the entropy is high because of the large number of atomic arrangements that could yield the same result, which means there is a high uncertainty in what any individual atom is doing. In contrast, if the gas atoms are confined to a very small region of the box, there are fewer positions and momenta available to the atoms, and thus a smaller number of indistinguishable arrangements. So the entropy is lower, because there are fewer ways to have the same number of atoms all in a corner of the box.
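Here is a toy version of that counting argument, with the box coarsely divided into cells so the arrangements can be counted directly. The number of cells, the size of the “corner,” and the number of atoms are arbitrary choices for illustration, and momenta are ignored to keep the counting simple.

```python
import math

# Toy model: the box is split into discrete cells, and a 'microstate' is a
# choice of which cells the (indistinguishable) atoms occupy.
n_atoms = 20
cells_full_box = 1000   # cells available when the gas can spread through the box
cells_corner = 125      # cells available when the gas is confined to one corner

spread_states = math.comb(cells_full_box, n_atoms)
corner_states = math.comb(cells_corner, n_atoms)

# Entropy goes like the logarithm of the number of microstates, so compare logs.
print(f"log(#arrangements), spread out: {math.log(spread_states):.1f}")
print(f"log(#arrangements), in corner:  {math.log(corner_states):.1f}")
```

The spread-out gas has vastly more arrangements that look the same from the outside, so it has the higher entropy.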

The technical way to describe this formulation of entropy is that each atom has a number of microstates available to it, and all the atoms together have measurable properties (pressure, volume, temperature, etc.) that define the macrostate. The entropy of any given macrostate is set by the number of microstate configurations that could produce that macrostate (strictly, it is proportional to the logarithm of that count), which means it’s still about uncertainty. But you can also see that entropy is a form of state-counting: higher-entropy macrostates can be attained in a larger number of ways than lower-entropy macrostates. This means that in general, higher-entropy states are more probable. If there is one way to pack all the atoms into the corner of a box, but there are a million ways to evenly distribute the atoms in the box, then the chances of just finding the atoms in the corner are one in a million. And since those atoms are constantly moving and exploring new microstates, over time they will tend toward the highest-entropy macrostates. This is where the Second Law of Thermodynamics comes from, which says that in any isolated system, total entropy increases over time toward a maximum value.
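A toy model makes the lopsidedness vivid: if each of N atoms can independently sit in the left or right half of the box, the number of microstates behind the macrostate “n atoms on the left” is a binomial coefficient, which is overwhelmingly concentrated near an even split. (This left/right split is an illustrative simplification, not the gas model above.)

```python
import math

# Each of N atoms is independently in the left or right half of the box.
# The macrostate is 'how many atoms are on the left'; its multiplicity is C(N, n).
N = 100
total_microstates = 2 ** N

for n_left in (0, 25, 50):
    multiplicity = math.comb(N, n_left)
    probability = multiplicity / total_microstates
    print(f"{n_left:>3} atoms on the left: {multiplicity:.3e} microstates "
          f"(probability ~ {probability:.1e})")
# All 100 atoms crowded on one side corresponds to a single microstate out of
# 2^100, so the gas is essentially never observed in that macrostate.
```

With only a hundred atoms the even split already dominates by dozens of orders of magnitude; with the ~10^23 atoms in a real box of gas, the second law becomes a statistical certainty.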

The idea of entropy as state-counting came from Ludwig Boltzmann, more than fifty years before information theory was developed. Shannon called his measure information-entropy because of the resemblance to entropy as defined in collections of atoms, which is the basis of statistical mechanics. Entropy is a measure of information and uncertainty, but also a way to count the number of states, and a measure of the relative ordering of a system.