One can quantify this relationship; Haldane (1957) and Kimura (1961) established that information can accumulate in a sexual population at a rate no higher than -log(1 - s) bits per generation, where s is the selective load (roughly, the fraction of the population lost to selection). Recent analyses of evolution in fluctuating environments (Bergstrom and Lachmann, 2004; Kussell and Leibler, 2005) further draw out the relation between information-theoretic measures of genomic information and the concept of Darwinian fitness. These analyses hint that the two ways of measuring information, the Shannon framework and the decision-theoretic framework, may be closely related under special circumstances. Kelly (1956) characterized one such set of circumstances: he established a relationship between the value of side information to a gambler and the entropy rate of the process being wagered upon. Evolution by natural selection appears to provide another example. However, further work is needed to reach a thorough understanding of these relations.
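The quantities above can be illustrated numerically. The sketch below, a minimal Python illustration rather than anything from the original sources, computes the Haldane/Kimura bound -log2(1 - s) for a given selective load, and checks Kelly's result in its simplest setting: a gambler placing even-odds bets on a biased coin. The log-optimal growth rate without side information is 1 - H(p) bits per round, and perfect side information raises it to 1 bit per round, so the value of the side information equals the entropy H(p). The function names and the even-odds coin setting are assumptions made for this example.

```python
import math

def haldane_kimura_bound(s):
    """Upper bound (bits per generation) on the rate of information
    accumulation under selective load s (Haldane 1957; Kimura 1961)."""
    return -math.log2(1 - s)

def entropy(p):
    """Binary entropy H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def kelly_growth(p):
    """Log-growth rate (bits per round) of a Kelly gambler making
    even-odds bets on a coin that comes up heads with probability p
    (assumes p >= 0.5, so betting on heads is favorable)."""
    f = 2 * p - 1  # Kelly fraction of wealth to wager at even odds
    return p * math.log2(1 + f) + (1 - p) * math.log2(1 - f)

p = 0.75
# Without side information the gambler earns 1 - H(p) bits per round;
# with perfect side information the rate is 1 bit per round, so the
# side information is worth H(p) bits per round -- Kelly's result.
print(haldane_kimura_bound(0.1))       # about 0.152 bits/generation
print(kelly_growth(p), 1 - entropy(p))  # these two agree
print(1.0 - kelly_growth(p), entropy(p))  # value of side info = H(p)
```

The agreement between the gambler's growth rate and 1 - H(p) is the simplest instance of the correspondence between the gambling and entropy pictures that the passage alludes to.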
An attempt to characterize living systems by citing just two essential properties would probably include, first, that they are thermodynamically far from equilibrium, and second, that they store, accumulate, and transmit large amounts of information. While shaping these concepts into forms that are rigorous and useful for biology remains a struggle, biologists can recognize that information is indeed a valuable way to describe many life processes. Many nonbiological disciplines, including mathematics, computer science, and statistics, grapple with problems similar to some of those that biologists face. This rich vein of ideas and methods will likely aid the effort to understand biological information and to develop fruitful theoretical ideas and useful tools.