Removing the entropy from the definition of entropy: clarifying the relationship between evolution, entropy, and the second law of thermodynamics

Abstract

Misinterpretations of entropy and conflation with additional misunderstandings of the second law of thermodynamics are ubiquitous among scientists and non-scientists alike and have been used by creationists as the basis of unfounded arguments against evolutionary theory. Entropy is not disorder or chaos or complexity or progress towards those states. Entropy is a metric, a measure of the number of different ways that a set of objects can be arranged. Herein, we review the history of the concept of entropy from its conception by Clausius in 1867 to its more recent application to macroevolutionary theory. We provide teachable examples of (correctly defined) entropy that are appropriate for high school or introductory college level courses in biology and evolution. Finally, we discuss the association of these traditionally physics-related concepts with evolution. Clarification of the interactions between entropy, the second law of thermodynamics, and evolution has the potential for immediate benefit to both students and teachers.

Background

It is perhaps apropos that the concept of entropy has continuously picked up misunderstandings and misinterpretations that have left the concept bloodied, beaten, and unrecognizable. Although not immediately obvious, the inaccurate characterization of entropy weighs heavily in current events involving education, especially in national and international debates involving the teaching of evolution. As with the concept of entropy, the theory of evolution by natural selection was originally proposed and has been refined primarily during the preceding two centuries (Figure 1). A common misinterpretation of the concept of entropy, that all systems progress towards more disorder, has been leveraged by those challenging evolutionary science and the teaching of evolution in schools by asserting that thermodynamics falsifies evolution (for example, Morris 2000, Ross 2004, Yahya 2005 (atlasofcreation.com)). Yet, much like the commonplace misinterpretation of Darwin's theory of natural selection as 'survival of the fittest', entropy is not 'a progression from order to disorder or chaos'. Rather, entropy is a measure of the number of ways in which a system can be arranged. The conflation of inaccurate interpretations of entropy with common misconceptions about evolution has resulted in a host of unfounded arguments that purport to challenge evidence for evolution and the teaching of evolution in classrooms.

Figure 1. Timeline depicting major events in the development of the theories of entropy and evolution (all images used are part of the public domain).

The current misuse of entropy and thermodynamics in arguments against evolution presents teachable moments to clarify the concept of entropy and the related misconceptions about evolution that are often used hand-in-hand with misinterpretations of entropy. With this goal in mind, we present a historical review of the concept of entropy in the context of thermodynamics, provide an easily understood definition of entropy, including examples that can be used in a classroom setting, and discuss how entropy actually relates to and supports the theory of evolution.

What is entropy? An historical perspective

Rudolf Clausius (1867) coined the term entropy from the Greek word Entropein, for transformation and change (Greven et al. 2003). This transformative idea arose from studying the interaction of refrigeration and a heat engine and the transfer of heat, Q, between the two. Clausius's conclusion was that the heat changed while the ratio of the heat to the temperature, T, remained the same. This conclusion led to the first definition that described the change of entropy (dS) as:

dS = dQ / T
(1)

which Clausius stated thusly (Klein 1910, p. 33):

Every bodily system possesses in every state a particular entropy, and this entropy designates the preference of nature for the state in question; in all the processes which occur in the system, entropy can only grow, never diminish.

Clausius's definition of entropy led directly to a mathematical statement of the second law of thermodynamics (Klein 1910):

dS/dt ≥ dQ/T
(2)

In words, Equation 2 states that the change in entropy with time will always be greater than or equal to the change of heat divided by the temperature or, put more simply using Equation 1, the change of entropy with time will always be greater than or equal to the starting entropy. But the question still remained: what was this mysterious quantity known as entropy? Clausius's definition did not state what entropy was but only how entropy changed as a function of heat and temperature. It was clear that entropy had an innate tie with the degradation of usable energy into an unusable form. From an engineering perspective, this meant the generation of unrecoverable heat from any work done by an engine or person (Swinburne 1903).

This idea prompted intense re-examination of the concept of entropy by some of the most well-respected scientists of the 1800s, including James Clerk Maxwell and John Strutt (a.k.a. Lord Rayleigh). Maxwell postulated that entropy had to be a distinct physical property of a body and must be zero when the body is completely deprived of heat (Maxwell and Rayleigh 1904). This meant that as heat from one body was transferred to another, the entropy of the system of two bodies must increase. This idea persisted until most of the literature of the day stated: 'The entropy, like the energy, is, therefore, determinable only as regards its changes and not in absolute value' (Buckingham 1900, p. 111).

Eventually, the idea of entropy as just an inherent property of a system was replaced by a more detailed microscopic understanding championed by Josiah Willard Gibbs, Max Planck, and Ludwig Boltzmann. This gestalt shift in understanding did not happen easily and, as Greven et al. (2003, p. 24) put it, 'Polemic was rife and good sense in abeyance.' This argument eventually resulted in the adoption of the following simple and elegant formula:

S = k_B ln Ω
(3)

known as Boltzmann's equation. It was later carved on Boltzmann's tombstone in the Zentralfriedhof (central cemetery) in Vienna as his final words on the subject to the scientific community (Dill and Bromberg 2003). The two important terms in the equation are Ω, the multiplicity, and k_B, Boltzmann's constant. Boltzmann's constant has the value of 1.38*10^-23 J K^-1 and connects the macroscopic thermodynamics of Clausius to the microscopic perspective of Boltzmann by allowing the conversion of temperature into units of energy (Dill and Bromberg 2003). The multiplicity, Ω, is the number of microstates that describe the macrostate. The multiplicity of a system is a quantitative measure of the ways of arranging a system. This made entropy a counting problem: how many ways can we distribute energy?
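To make Boltzmann's equation concrete, the short Python sketch below (our illustration, not part of the original article) evaluates S = k_B ln Ω for an arbitrary multiplicity; the value of Ω passed in at the end is just a placeholder.

```python
# A minimal sketch of Boltzmann's equation, S = k_B * ln(Omega).
# The multiplicity in the example call is an arbitrary placeholder.
import math

K_B = 1.38e-23  # Boltzmann's constant in J/K, as quoted in the text


def boltzmann_entropy(multiplicity: int) -> float:
    """Return S = k_B * ln(Omega) for a positive integer multiplicity."""
    if multiplicity < 1:
        raise ValueError("multiplicity must be a positive integer")
    return K_B * math.log(multiplicity)


print(boltzmann_entropy(14_400))  # ~1.3e-22 J/K, using the book-shelf multiplicity derived below
```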

Teachable examples of entropy

The idea of multiplicity is at the crux of the definition of entropy. Despite Schrodinger's claim that a 'brief explanation in non-technical terms is well-nigh impossible' (Schrodinger 1944, p. 25), we shall endeavor to clarify this concept through two examples. For the first example, we shall examine a macroscopic system of 10 books that is not subjected to thermodynamic effects. With this intuition for multiplicity, we shall turn our attention to the thermodynamic system of a dye diffusing in solution. Both of these systems can be used to illuminate the idea of multiplicity, which has a direct relationship to entropy.

A macroscopic example

On a macroscopic scale, multiplicity can be thought of as the number of ways of arranging a set of items, such as the 10 books shown in Figure 2. These 10 books were chosen by the authors to represent a selection of important books, ranging from Darwin's (1859) On the Origin of Species to Hawking's (1998) popular science book A Brief History of Time. Each book has a set of properties: the title, the author, the broad subject, the year it was published, the number of pages, and so on. One of the natural ways to arrange these books on a shelf is alphabetically by author (Figure 2A). There is only one way to do this arrangement: Copernicus, Darwin, Einstein, Gould, Hawking, Hubble, Linnaeus, Newton, Schrodinger, and Simpson. Since there is only a single way to arrange these books alphabetically, this arrangement has a multiplicity of one (Figure 2A).

Figure 2. A macroscopic example of calculating the multiplicity of 10 science books. These 10 books consist of five biology books (Darwin 1859; Gould 1989; Von Linnaeus 1758; Schrodinger 1944; and Simpson 1951) and five physics books (Copernicus 1543; Einstein 1916; Hawking 1998; Hubble 1982; and Newton 1686). (A) Since there is only one way to arrange these 10 books alphabetically, the multiplicity is one. (B) If the books are arranged by subject, the multiplicity increases because there are many ways to arrange the books within each subject group. (C) If the books are arranged by year, the multiplicity changes because a different constraint applies to the organization. (D) The maximum multiplicity occurs when there is no organization, because any book can be in any position.

What if we do not care whether the books are arranged alphabetically but specify that they must be arranged by subject? If each book can be broadly categorized as either a biology book or a physics book, our set includes an equal number of biology books (Darwin, Gould, Linnaeus, Schrodinger, and Simpson) and physics books (Copernicus, Einstein, Hawking, Hubble, and Newton). The books in each set can be arranged in any order (for example, Copernicus, Einstein, Hawking, Hubble, and Newton; Copernicus, Newton, Einstein, Hubble, and Hawking; Newton, Hubble, Hawking, Einstein, and Copernicus; and so on). For the set of physics books, we can choose from five books for the first position, four books for the second, three for the third, two for the fourth, and one book for the last, so we have 120 (5*4*3*2*1 = 5! = 120) different ways to arrange these five books. That is, this set of five books has a multiplicity of 120. We can treat the five biology books in the same way, with the same multiplicity. Since for every one of the 120 different ways to arrange the biology books there are still 120 different ways to arrange the physics books, the multiplicity of the two sets together is given as the product of the individual pieces, 120*120 = 14,400.

However, if we did not care how the books were arranged at all, the multiplicity would be much higher (Figure 2D). By the same reasoning used for arranging the books by subject, the total number of ways is given by 10! or 3,628,800 because any book can be first, second, third, and so on. Within all these possible arrangements there are some that would result in the books being grouped by subject, leading to the question: what would the multiplicity of this set of books be if they were not ordered by subject? The solution is to calculate all possible arrangements (10!) and subtract all the ways of arranging the books by subject with biology first (5!5!) and all the ways of arranging the books by subject with physics first (5!5!), for a result of 10!-2(5!5!) = 3,600,000. Likewise, the number of energetic microstates (and hence the entropy, through Boltzmann's equation) is directly related to the number of ways of spreading energy (books) across molecules (spaces on the shelf) under different thermodynamic constraints.
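The arithmetic above is easy to check by hand, but for classroom use the short Python sketch below (ours, not part of the original article) reproduces the quoted multiplicities from factorials alone.

```python
# A sketch reproducing the book-shelf multiplicities quoted in the text.
from math import factorial

alphabetical = 1                              # exactly one alphabetical ordering
by_subject = factorial(5) * factorial(5)      # 5! orderings within each subject block (block order fixed) = 14,400
unconstrained = factorial(10)                 # any book in any position = 3,628,800
not_grouped = unconstrained - 2 * by_subject  # remove biology-first and physics-first groupings

print(by_subject, unconstrained, not_grouped)  # 14400 3628800 3600000
```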

A microscopic example

With the intuition of multiplicity we built from looking at the macroscopic example of books on a bookshelf, we can now turn our attention to a microscopic example of diffusing particles (Figure 3). Why does a drop of food coloring spread out (that is, diffuse) when introduced to a glass of water? The answer is that entropy is higher when the particles making up the dye are interspersed among the molecules of water.

Figure 3. A microscopic example of the link between multiplicity and entropy, consisting of a simplified system of diffusing dye molecules. (A) When the five molecules of dye (black circles) are introduced to the solution, they can only inhabit a very small area (grey circles). This limits how many ways the dye molecules can be arranged, resulting in a lower multiplicity and lower entropy. As time progresses, the dye molecules have access to more spaces in solution. (B) At time t = 1, the possible number of spaces the dye can inhabit is 25, resulting in a multiplicity of 53,130. (C) The next time step (t = 2) increases the accessible spaces even more. (D) Once the dye molecules have diffused and are able to inhabit all the spaces, the solution has reached its maximum multiplicity and entropy.

To visualize this system, let us make a few simplifying assumptions:

  1. All molecules can be treated as simple spheres.

  2. There are no interactions among molecules.

  3. Molecules can only inhabit discrete x,y positions on a two-dimensional Cartesian plane.

  4. Only one molecule can inhabit any given position at a given time.

When we first introduce the dye to our system, the molecules of the dye are located close together. In our simple system of 225 particles - 220 water molecules and five black dye molecules - this means the dye molecules are found in the middle 3 x 3 square of our system (Figure 3A). At any given time, the dye molecules can only inhabit the positions shown in grey. This makes the multiplicity when the dye is introduced (t = 0) the total number of ways of placing five dye molecules in nine different spaces. This can be calculated using the mathematics of combinatorics and results in a multiplicity of 126 different ways of placing the molecules.

At the next moment in time (t = 1), the dye molecules have the ability to access a larger area of our grid through diffusion. This increases the number of possible places for the dye from nine to 25 (Figure 3B), which results in more ways to arrange the dye molecules. As more time goes by, the dye molecules gain access to more sites: at t = 2, 49 sites (Figure 3C); at t = 3, 81 sites; at t = 4, 121 sites; until finally at t = ∞ the dye can access all 225 possible spaces on the grid (Figure 3D). With each increase in the accessible spaces, the multiplicity - the number of ways to arrange the dye molecules on the grid - increases. This increase in multiplicity, just like in the previous example using books, results in an increase of the entropy. Entropy does not cause the dye to diffuse; rather, it simply characterizes the number of ways of arranging the molecules.
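Because the dye molecules are indistinguishable, each multiplicity above is simply a binomial coefficient, 'accessible sites choose five'. The Python sketch below (our illustration, assuming the grid sizes described in the text) computes these values for each time step.

```python
# A sketch computing the diffusion multiplicities: the number of ways to place
# five indistinguishable dye molecules among the accessible lattice sites.
from math import comb

accessible_sites = [(0, 9), (1, 25), (2, 49), (3, 81), (4, 121), ("infinity", 225)]

for t, sites in accessible_sites:
    multiplicity = comb(sites, 5)  # "sites choose 5" placements of the dye
    print(f"t = {t}: {sites} sites, multiplicity = {multiplicity}")
# t = 0 gives 126 and t = 1 gives 53,130, matching the values in Figure 3.
```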

Misinterpretations of entropy and the second law of thermodynamics

On the first page of Joseph Frederic Klein's book Physical Significance of Entropy or of the Second Law (1910, p. 1) he wrote:

This term (Entropy) has long been known as a sort of property of the state of the body, has long been surmised to be of essentially a statistical nature, but with it all there was a sense that it was a sort of mathematical fiction, that it was somehow unreal and elusive, so it is no wonder that in certain engineering quarters it was dubbed the 'ghostly quantity'.

This was written in 1910 and despite ongoing improvements in communications, the idea of entropy is still misinterpreted throughout the literature.

Entropy as disorder

The popular consciousness and many biological authors have linked the ideas of entropy to disorder and chaos. Many authors use the concept of a messy room compared to a tidy room to explain disorder (Alberts et al. 2002; Peterson 2012). Although the concepts of entropy and disorder are inherently linked, disorder is only a metaphor for entropy, not the definition. For example, a tidy room is said to have low entropy while a messy room is said to have high entropy because it has more disorder. Yet we are never told what disorder is, which is critical in clarifying the concept of entropy.

Before cleaning up the ideas associated with a messy room, let us turn our attention to a random sequence of heads and tails generated by flipping a coin 100 times. If we cannot discern any clear pattern or order in the sequence of flips, we would call the sequence disordered. If we did discern a pattern, say 50 heads followed by 50 tails, or an alternating sequence of heads and tails, we would call the sequence ordered. Yet every single individual sequence of coin flips has the same probability (1/2^100) and the same multiplicity (one) and, therefore, must have the same value of entropy. Looking closer at our coin example, we notice that sequences of heads and tails can have different numbers of heads and tails; one sequence may have 60 heads to 40 tails while another may have 50 heads and 50 tails. We can calculate the probability of getting any ratio of heads to tails. There is only a single way to get all heads or all tails, yet there are roughly 10^29 ways of getting an equal number of heads and tails. Because the multiplicity is higher for an equal split of heads and tails (10^29 > 1), it has higher entropy.
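These coin-flip figures can be checked with the Python sketch below (ours, not part of the original article): every specific sequence is equally likely, but the number of sequences with a given head count, the multiplicity of that macrostate, varies enormously.

```python
# A sketch verifying the coin-flip numbers quoted above.
from math import comb

p_specific_sequence = 0.5 ** 100  # ~7.9e-31, identical for every particular sequence
ways_all_heads = comb(100, 100)   # 1: only a single sequence is all heads
ways_fifty_fifty = comb(100, 50)  # ~1.01e29, the "roughly 10^29" quoted in the text

print(p_specific_sequence, ways_all_heads, ways_fifty_fifty)
```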

Getting back to our poor example of a messy room, why does a messy or disordered room have higher entropy? Any specific configuration of items in the room is unique and therefore has the same multiplicity of one. As in our coin example, the actual placement of items is less important than how we define ordered versus disordered. A tidy or ordered room is a room where the items inhabit a small set of possible places - the books on the bookshelf, the clothes in the dresser, and so on - while a messy or disordered room is the set of all other configurations. The number of configurations we consider disordered is far higher than the number of configurations we consider ordered. Thus, a messy room does not have higher entropy because it is messy or disordered, but rather because there are many more configurations that count as messy than as tidy. That is, its multiplicity is higher.

Negative entropy

Another popular misconception is the idea of negative entropy. Entropy itself can never be negative, but it is possible for entropy to decrease. This misunderstanding seems to arise from two sources: the historical development of entropy and Schrodinger's use of the idea to describe life. The latter is a mathematical trick used to describe order, while the former is a holdover from missteps in the development of thermodynamics and statistical physics.

Before Boltzmann and his micro-canonical understanding of thermodynamics, most scientists were concerned only with the change in entropy. According to Maxwell, the zero point for entropy is defined when a body is totally deprived of heat (Maxwell and Rayleigh 1904). Planck stated that 'Entropy of a body in a given state, like the internal energy, is completely determined up to an additive constant, whose value depends on the zero state' (Planck 1903, p. 97). At the time it was convenient to set the zero point of entropy at a standard temperature and pressure (Maxwell and Rayleigh 1904), which allowed negative entropy because entropy could be decreased below this standard value.

After Boltzmann, this changed. The introduction of the multiplicity made temperature largely tangential and forced an absolute minimum value of zero for entropy. The only possible way to get a negative value for entropy would be to have a multiplicity of less than one. What does it mean to have 0.5 ways of arranging books on a bookshelf or 0.3 ways of arranging a sequence of coin flips? Multiplicity is a quantized quantity, which can only have values of positive integers and therefore entropy can only have values equal to or greater than zero.

The major thrust of Schrodinger's argument in his 1944 book What is Life? is that organisms produce entropy by living and therefore must feed upon negative entropy to remain alive. Yet, as we have discussed, negative entropy is a nonsensical concept because multiplicity cannot have fractional values. Schrodinger himself clarifies that by 'negative entropy' he means 'entropy, taken with the negative sign'. This mathematical trick turns Boltzmann's equation into the following:

-S = -k_B ln Ω = k_B ln(1/Ω)
(4)

It is important to note that Ω has the same meaning as before, that of the multiplicity. But if we can relate Ω to disorder, what meaning does its inverse have? As the number of states increases, the value of 1/Ω decreases, so if disorder is a metaphor for multiplicity, then its inverse can be thought of as a measure of order. This is Schrodinger's point and, despite being an intriguing concept, it remains an unnecessary mathematical trick.

Entropy and the second law of thermodynamics

The second law of thermodynamics is usually understood to be the tendency for the entropy of a system to increase. But like calling entropy disorder, this is an oversimplification. Richard Feynman describes the second law as the tendency for entropy to increase with time as a system moves to equilibrium (Feynman 1998). An alternative statement of the second law states that an isolated system at equilibrium has the macro-state that maximizes the entropy (Garcia et al. 2008, Garcia et al. 2011). The simplest description comes from Philip Nelson: the entropy of a system increases spontaneously when a constraint is removed (Nelson 2008). All of these definitions are correct and they all say the same thing.

To better understand this concept, let us revisit our example of books on a bookshelf. We know that there is only one way to arrange the books by publication date, but what happens when we start to loosen that constraint? If the books are constrained only by the decade in which they were published, the number of arrangements goes up because there are now more ways of rearranging the books. If we loosen the constraint further to the first 50 years and the last 50 years of a century (Figure 2C), the possible number of arrangements increases again (1!1!1!1!4!2! = 48). If we remove the constraint entirely, we have access to all 10! possible arrangements. A major drawback of this example is that the books are unaffected by thermodynamic forces; they will not spontaneously change order as we change the constraints.
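The count of 48 arrangements under the half-century constraint can be checked the same way, as in the sketch below (ours; the group sizes 1, 1, 1, 1, 2, and 4 follow from the product 1!1!1!1!4!2! stated above).

```python
# A sketch of the 'loosening constraints' idea: books grouped by half-century of
# publication can be reshuffled only within their own group.
from math import factorial, prod

group_sizes = [1, 1, 1, 1, 2, 4]  # books per half-century bin (Figure 2C)
within_groups = prod(factorial(n) for n in group_sizes)  # 1*1*1*1*2*24 = 48
unconstrained = factorial(10)     # constraint removed entirely = 3,628,800

print(within_groups, unconstrained)
```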

At first glance, a cooling cup of coffee appears to violate the second law - the entropy of the coffee is decreasing! The key to resolving this apparent aberration is to look closer at what all three definitions refer to as a system. The coffee cools down and loses entropy because it is part of a system involving the cup and the air. The heat energy from the coffee imperceptibly warms the air. The entropy of the coffee goes down while the entropy of the air increases, and the entropy of the combined system of coffee and air becomes greater than the entropy of the system before the coffee started to cool. If our coffee were a perfectly isolated system, it would remain at the same temperature. The second law of thermodynamics has no problem with pieces of a system losing entropy, only the system as a whole.

Entropy and evolution

Using the concept of entropy to argue against evolution may initially seem nonsensical because the two concepts describe very different processes and originate in very different fields (Figure 1). Yet the marriage of these two concepts is one of the most strongly held, albeit pseudo-scientific, arguments that creationists and supporters of intelligent design commonly subscribe to. In this viewpoint, evolution is a drive towards more complexity and more order (for example, Morris 2000, Chick 2000), while the second law of thermodynamics drives systems towards less complexity and less order. Thus, their argument is that because the second law drives systems towards less order, evolution (towards more complexity) is falsified. However, both of these interpretations are patently incorrect and are couched in misunderstandings and misconceptions (that is, inherently biased conclusions always result from false assumptions and/or incorrect data).

Entropy and the origin of life

Baseless and factually incorrect arguments against evolution frequently rely upon evolution's lack of explanatory power with regard to the origin of life (for example, Yahya 2005). This resistance towards evolutionary theory is misplaced. The theory of evolution via natural selection (Darwin 1859) was never intended to, and still does not, address the question 'Where does life come from?' Evolutionary theory also does not address the question 'Why did life arise?' The purpose of the theory of evolution by natural selection is solely to describe a mechanism by which organisms change and diversify as a function of time and selection.

Instead, how and why life came about is more appropriately addressed by theories related to thermodynamics. Small, simple molecules form larger, more complex molecules because doing so is more energetically and entropically favorable (Dill and Bromberg 2003). Models suggest that these same molecules self-assembled into collections of molecules similar to drops of oil in water (Gruber and Konig 2013). As these 'pockets' of lower entropy continued to collect, they eventually developed into single-celled organisms. These single-celled organisms could then take advantage of the energy from the sun to maintain order and life (Nelson 2008). Only at this point can evolution take over to explain how H. sapiens and millions of other species arose from such single-celled organisms.

Evolution as a simplifying force

A common misconception of evolution is the assumption that, over time, organisms universally evolve from simple forms into more complex life forms (for example, the single-celled organism becomes multicellular; monkeys gave way to apes, which evolved into humans; Jakobi 2010). Yet that Lamarckian view, that organisms only evolve towards more advanced forms with humans at the top of the evolutionary ladder, is one of the most common misconceptions about evolution (Gould 1989). Indeed, the theory of natural selection makes no statement about whether organisms should become more or less complex or advanced, but rather holds that an organism's form will be shaped by its environment through the selective favoring of individuals with characteristics associated with higher reproductive success (Darwin 1859). This means that, depending on the environmental context, successive generations of organisms can become less complex, less ordered, or may not change at all.

This lack of increase, or even reduction, in complexity can be seen in many examples at both the molecular and morphological levels. At the genomic level, many morphologically simple organisms, such as maize (Zea mays) and the water flea (Daphnia pulex), either have larger genomes (that is, a greater number of base pairs) or a larger number of genes than morphologically more complex organisms, including H. sapiens (Gregory et al. 2006). The fossil records of horseshoe crabs, crocodiles, and sharks are all examples of organisms that have maintained relative stasis with respect to their morphological characteristics over millions of years. In contrast, natural selection has favored fewer toes in horses (for example, from five to one) over approximately the past 50 million years (Simpson 1951). Moreover, relative to their ancestors that were capable of flight, selection in ostriches has led to a reduction in the size of their wings, a reduction in the number of their toes from four to two, and the loss of the bony keel of the sternum that provided attachment points for flight muscles (Pycraft 1900). These examples clearly demonstrate that evolution does not always result in an increase in complexity. Or, put another way, stasis and 'simpler' forms arising over time are not in conflict with evolutionary theory (contra Yahya 2005).

The application of entropy to evolution

Despite many examples illustrating that natural selection optimizes the form of organisms to their environment via the loss of characters, some lineages have become increasingly complex throughout their evolutionary history. Basically, it is a question of scale, as the same lineage might be viewed as static, or as evolving towards increased or decreased complexity, depending on the level of biological organization (that is, genome, cellular, tissue) or timeframe (deep time versus recent) one examines. Creationists might argue that this complexity also implies more order, which would decrease the entropy of an organism and, therefore, violate the second law of thermodynamics (for example, Morris 2000, Yahya 2005). The resulting assertion is that because thermodynamics is so well accepted and understood, evolution must be wrong.

Yet this line of reasoning against evolution hinges on the common misconceptions of thermodynamics and entropy that we have outlined above. The argument is analogous to suggesting that a cooling cup of coffee violates the second law of thermodynamics. The key concept for our coffee is that it is not an isolated system; it is in contact with the air. Similarly, the Earth is not an isolated system because it constantly radiates energy into space and receives energy from the sun. Likewise, no species is an isolated system. Individuals of a species interact with other members of their species' population, with other species, and with the environment (that is, well-studied and well-established ecological interactions). The second law of thermodynamics says that the entropy of an isolated system reaches a maximum, not that every individual piece of the system will. Likewise, energy is absorbed and expended by all living organisms and, like the cooling cup of coffee, this exchange alters the environment that the organism shares with other individuals. As with a species, a single organism is also not an isolated system.

Concluding remarks

Entropy is an essential and fundamental idea of thermodynamics, yet many people, scientists and non-scientists alike, profoundly misunderstand the concept despite the actual definition of entropy being quite simple: it is the natural log of the number of microstates that describe the macrostate, multiplied by Boltzmann's constant. It is a counting problem along the lines of: how many ways can we place books? How many ways can we arrange molecules? How many ways can we distribute the energy?

Entropy has been misunderstood and misinterpreted since Rudolf Clausius introduced the term, and these misunderstandings and misinterpretations have only grown since Clausius's time. Currently, the most common misconceptions include equating disorder with entropy, believing it is possible to have negative entropy, and misunderstanding entropy's role in the second law of thermodynamics. We have addressed each of these misconceptions in turn and hope to have shed light on how they arose and how to address them in a classroom setting.

From a biological perspective, clarifying the concept of entropy accomplishes two major goals. The first is to foster a correct and deeper understanding of the second law of thermodynamics, which plays a major role in all cellular systems. The second goal is to address the misconceptions that underlie arguments against important concepts including evolution (for example, Morris 2000; Chick 2000; Yahya 2005). Using entropy to argue against evolution carries its own problems because of the misconceptions associated with both entropy and evolution. Yet the misunderstandings associated with both concepts present a teachable moment from which any classroom can emerge with a deeper insight into how the seemingly disparate disciplines of physics and biology are linked.

References

  • Alberts B, Johnson L, Raff R, Walter P: Molecular Biology of the Cell. 4th edition. New York, NY: Garland Science; 2002.

  • Buckingham E: An outline of the theory of thermodynamics. New York, NY: Macmillan; 1900.

  • Chick JT: Big Daddy? Ontario, CA: Chick Publishing; 2000.

  • Clausius R: The mechanical theory of heat. London: John van Voorst; 1867.

  • Copernicus N: De Revolutionibus Orbium Coelestium. Nuremberg: Johannes Petreius; 1543.

  • Darwin C: The Origin of Species. London: John Murray; 1859.

  • Dill KA, Bromberg S: Molecular driving forces: statistical thermodynamics in chemistry and biology. New York, NY: Garland Science; 2003.

  • Einstein A: Relativity, the special and the general theory. London: Methuen & Co; 1916.

  • Feynman RP: Statistical mechanics: a set of lectures. Reading, MA: Addison-Wesley; 1998.

  • Garcia HG, Kondev J, Orme N, Theriot JA, Phillips R: A first exposure to statistical mechanics for life scientists. 2008:1–27. arXiv:0708.1900v1.

  • Garcia HG, Kondev J, Orme N, Theriot JA, Phillips R: Thermodynamics of biological processes. Methods in Enzymology 2011, 492:27–59.

  • Gould SJ: Wonderful life: the Burgess Shale and the nature of history. New York, NY: WW Norton and Company; 1989.

  • Gregory TR, Nicol JA, Tamm H, Kullman B, Kullman K, Leitch IJ, Murray BG, Kapraun DF, Greilhuber J, Bennett MD: Eukaryotic genome size databases. Nucleic Acids Research 2006, 35(Database issue):D332–D338.

  • Greven A, Keller G, Warnecke G: Entropy. Princeton, NJ: Princeton University Press; 2003.

  • Gruber B, Konig B: Self-assembled vesicles with functionalized membranes. Chemistry 2013, 19(2):438–448. doi:10.1002/chem.201202982.

  • Hawking S: A brief history of time. New York, NY: Bantam; 1998.

  • Hubble E: The realm of the nebulae. New Haven, CT: Yale University Press; 1982.

  • Jakobi SR: 'Little monkeys on the grass': how people for and against evolution fail to understand the theory of evolution. Evolution: Education and Outreach 2010, 3(3):416–419. doi:10.1007/s12052-010-0214-4.

  • Klein JF: Physical significance of entropy or of the second law. New York, NY: D Van Nostrand Company; 1910.

  • Maxwell JC, Rayleigh L: Theory of heat. London: Longmans, Green & Co; 1904.

  • Morris HM: The Long War Against God. Green Forest, AR: Master Books; 2000.

  • Nelson PC: Biological physics: energy, information, life. New York, NY: WH Freeman and Co; 2008.

  • Newton I: Philosophiae naturalis principia mathematica. London: S. Pepys, Reg. Soc. Praeses; 1686.

  • Peterson J: Understanding the thermodynamics of biological order. American Biology Teacher 2012, 74(1):22–24. doi:10.1525/abt.2012.74.1.6.

  • Planck M: Treatise on thermodynamics. London: Longmans, Green & Co; 1903.

  • Pycraft WP: On the morphology and phylogeny of the Paleognathae (Ratitae and Crypturi) and Neognathae (Carinatae). Transactions of the Zoological Society of London 1900, 15:149–290.

  • Ross H: A matter of days. Colorado Springs, CO: NavPress; 2004.

  • Schrodinger E: What is Life? Cambridge: Cambridge University Press; 1944.

  • Simpson GG: Horses. Oxford: Oxford University Press; 1951.

  • Swinburne J: Entropy: or, thermodynamics from an engineer's standpoint, and the reversibility of thermodynamics. New York, NY: Dutton; 1903.

  • Von Linnaeus C: Systema naturae per regna tria naturae, secundum classes, ordines, genera, species, cum characteribus, differentiis, synonymis, locis. Editio decima, reformata. Holmiae: Laurentii Salvii; 1758.

  • Yahya H: Atlas of creation, volume I. Istanbul: Global Publishing; 2005.

Acknowledgements

We thank E. Evarts and J. Weintraub for helpful suggestions and discussions in the preparation of this manuscript. This work was supported by the National Science Foundation through the National Evolutionary Synthesis Center (NESCent) under grant number NSF #EF-0905606.

Author information

Correspondence to Joshua S Martin.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite this article

Martin, J.S., Smith, N.A. & Francis, C.D. Removing the entropy from the definition of entropy: clarifying the relationship between evolution, entropy, and the second law of thermodynamics. Evo Edu Outreach 6, 30 (2013). https://doi.org/10.1186/1936-6434-6-30
