Why thermodynamic entropy is not disorder

Likewise, cans of soup in the grocery store and files in a file cabinet are in order when each is resting in its proper place. In other words, order can be dynamic or static.

Pop Quiz

So what is entropy? Probably the most common answer you hear is that entropy is a kind of measure of disorder.

This is misleading. Equating entropy with disorder creates unnecessary confusion in evaluating the entropy of different systems. Consider the following comparisons. Which has more entropy? If you think of entropy as disorder, then the answers to these questions may trouble you.

Entropy According to Classical Thermodynamics

Let's take a look at where the idea of entropy actually came from. The concept of entropy originated around the mid-19th century, from the study of heat, temperature, work and energy, known as thermodynamics. This was the era of the steam locomotive. The study of how heat could be most efficiently converted to mechanical work was of prime interest.

It was understood that there was a relationship between heat and temperature. Generally speaking, the more heat you applied to an object, the hotter it got. It was also understood that heat and work represented different forms of energy and that under the right circumstances, you could convert one into the other.

Furthermore, it was observed that the only time heat would spontaneously flow out of one body was when it was in contact with another, colder, body. That is, heat always flowed from hot to cold. The challenge was to find the most efficient way to harness heat flowing out of a hot reservoir toward a cold reservoir and use it to do mechanical work. One of the difficulties was knowing how much heat energy was stored in the hot reservoir.

What was the maximum heat that you could theoretically withdraw from the reservoir? You couldn't measure the heat content directly. What you could measure was the reservoir's temperature.

If you knew the relationship between the temperature and the heat content for that reservoir, you could use the temperature to calculate the heat content.

Furthermore, if you used a temperature scale that decreased to zero as the heat content decreased to zero, then the relationship between temperature and heat content could be represented as a simple ratio. This became the operational definition of a newly conceived property of systems, a property which came to be known as entropy.

The term was coined in 1865 by Rudolf Clausius, who thought of it as representing a kind of "internal work of transformation".

Simply stated, entropy is the relationship between the temperature of a body and its heat content (more precisely, its kinetic heat energy). Entropy, S, is the heat content, Q, divided by the body's temperature, T.

The definition of entropy, as originally conceived in classical thermodynamics, had nothing to do with order or disorder. It had everything to do with how much heat energy was stored or trapped in a body at a given temperature. Think of it this way. If you removed all the heat energy possible from an object by cooling it down as far as possible (down to absolute zero), and then kept track of the heat you had to put back into it to bring it back to a given state, that amount of heat supplied divided by the final temperature in kelvin would be the entropy of that object in that state.
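Written out with the symbols used here (S for entropy, Q for heat content, T for absolute temperature), this is the classical Clausius relation; the integral form is a sketch that assumes the heat is added reversibly, step by step, from absolute zero:

```latex
S = \frac{Q}{T}
\qquad\text{or, tracking the heat added from absolute zero,}\qquad
S = \int_{0}^{T} \frac{\delta Q_{\mathrm{rev}}}{T'}
```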

The entropy of a system is, in effect, the heat capacity of the system averaged over its absolute temperature.

The Significance of Entropy in Classical Thermodynamics

The significance of entropy in the study of heat engines and chemical reactions is that, for a given temperature, a system can hold only a certain amount of heat energy - no more and no less - depending on the entropy of the system.

If the entropy of the system changes, some energy will be released or absorbed in one form or another, like a sponge that suddenly changes how much liquid it can hold.

For heat engines, that meant that if you wanted to convert heat into mechanical work, you needed to make sure that more heat flowed out of the hot reservoir than could "fit" into the cold reservoir. You did this by not letting the cold reservoir heat up as heat flowed in and by not letting the hot reservoir cool down as heat flowed out.

As long as you maintained a temperature difference, more heat would flow out of the hot body than could be absorbed by ("fit into") the cold body. The surplus heat flow could be used to do mechanical work. In chemistry, entropy meant that calculating the change in chemical energy, the energy represented by the making and breaking of chemical bonds, was not enough to predict how much useful energy would be released during a reaction.
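As a rough sketch of this bookkeeping (my notation, not a derivation given in the text): if heat Q_h leaves the hot reservoir at temperature T_h and heat Q_c must be dumped into the cold reservoir at T_c, the entropy delivered to the cold reservoir has to at least match the entropy withdrawn from the hot one, which caps the surplus available as work:

```latex
\frac{Q_c}{T_c} \ge \frac{Q_h}{T_h}
\quad\Longrightarrow\quad
W = Q_h - Q_c \le Q_h\left(1 - \frac{T_c}{T_h}\right)
```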

The amount of energy "freed" by a reaction was the energy generated by the chemical reaction minus any additional energy trapped by changes in the system's entropy. The additional energy trapped was just the change in entropy, delta S, times the temperature of the system, T. J. Willard Gibbs named this useful released energy "free energy" and provided the formula to calculate it.

The free energy, delta G, was the change in chemical energy, delta H, minus the trapped thermal energy, T times delta S. With time, more was learned about the role of molecules in determining the classical thermodynamic variables such as pressure, temperature, and heat.
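In symbols, the relation described in words above is the familiar Gibbs expression (at constant temperature):

```latex
\Delta G = \Delta H - T\,\Delta S
```

Here delta G is the useful (free) energy released, delta H the change in chemical energy, and T delta S the thermal energy trapped or released by the change in entropy.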

Pressure, it turned out, was just the total force exerted by individual molecules, colliding with one another and with the walls of the container, averaged over the surface area of the container. Temperature was determined to be the average kinetic energy of all the different ways the molecules could move, tumble or vibrate. This more detailed, molecular, perspective of thermodynamics and the mathematics associated with it became known as statistical thermodynamics.

The person most responsible for working out the mathematical relationship between entropy and molecular movement was Ludwig Boltzmann. From the molecular description of heat content and temperature, Boltzmann showed that entropy must represent the total number of different ways the molecules could move, tumble or vibrate. The idea was that heat was just kinetic energy on a scale that could not be observed directly but that manifested itself in the aggregate as the thermodynamic properties that could be observed.

Heat flowed from a hot body to a cold body as kinetic energy was transferred through molecular collisions occurring at the boundary between the two bodies and further distributed throughout the body as molecules collided with each other within the body. At each collision, kinetic energy was exchanged. On average, molecules with more kinetic energy lost kinetic energy as they collided and molecules with less kinetic energy gained kinetic energy as they collided, until, on average, the kinetic energy was optimally distributed among all the molecules and their various modes of movement.

The net result was that the more ways a system could move internally, the more molecular kinetic energy the system could hold for a given temperature. This was because temperature was just the average kinetic energy per mode of movement. You could think of these modes of movement as "pockets" that can hold kinetic energy. You could also think of them in more technical terms as molecular oscillators or modes of thermal oscillation.

If each pocket, on average, could hold the same amount of kinetic energy, then the more pockets a system had, the more total kinetic energy the system contained. The greater the number of kinetic energy pockets a system had, the greater its entropy.
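A rough way to put the "pockets" picture in symbols (this is the classical equipartition estimate, my addition rather than a formula stated in the text): if a system has N pockets (modes) and each holds, on average, half of k_B T of kinetic energy, then

```latex
Q_{\mathrm{thermal}} \approx N \cdot \tfrac{1}{2} k_B T
```

so at a fixed temperature the heat content, and with it the entropy, grows with the number of pockets N.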

So, on the molecular level, entropy was just a measure of the total number of molecular kinetic energy pockets contained in the system.

Entropy As Disorder

It was Boltzmann who advocated the idea that entropy was related to disorder. A common illustration counts the ways books can be arranged on a shelf: a constraint such as alphabetical order permits only one arrangement, looser constraints permit more, and if we remove the constraint entirely, we now have access to all possible states. A major drawback to this example is that the books are unaffected by thermodynamic forces; they will not spontaneously change order as we change the constraints.

At first glance, a cooling cup of coffee violates the second law - the entropy is decreasing! The key point in resolving this apparent aberration is to look closer at what all three definitions refer to as a system. The coffee cools down and loses entropy because it is part of a system involving the cup and the air.

The heat energy from the coffee imperceptibly warms the air. The entropy of the coffee goes down while the entropy of the air increases, and the combined entropy of the coffee and the air ends up greater than it was before the coffee started to cool. If our coffee were a perfectly isolated system, it would remain at the same temperature.
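A minimal sketch of that bookkeeping, under the simplifying assumption (mine, not the text's) that a small amount of heat Q passes while the coffee temperature T_coffee and the air temperature T_air stay roughly constant:

```latex
\Delta S_{\mathrm{total}}
  = -\frac{Q}{T_{\mathrm{coffee}}} + \frac{Q}{T_{\mathrm{air}}} > 0
\quad\text{since } T_{\mathrm{air}} < T_{\mathrm{coffee}}.
```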

The second law of thermodynamics has no problem with pieces of a system losing entropy; it constrains only the system as a whole. Using the concept of entropy to argue against evolution may seem nonsensical initially because the two concepts describe very different processes and originate in very different fields (Figure 1).

Yet the marriage of these two concepts is one of the most strongly held, albeit pseudo-scientific, arguments that creationists and supporters of intelligent design commonly subscribe to. In this viewpoint, evolution is the drive towards more complexity, more order (for example, Morris, Chick), and the second law of thermodynamics drives systems to less complexity, less order. Thus their argument is that because the second law drives systems towards less order, evolution towards more complexity is falsified.

Baseless and factually incorrect arguments against evolution frequently rely upon evolution's lack of explanatory power with regard to the origin of life (for example, Yahya). This resistance towards evolutionary theory is misplaced. The theory of evolution via natural selection (Darwin) was never intended to, and still does not, address the question 'where does life come from?'.

Evolutionary theory also does not address the question 'why did life arise?'. The purpose of the theory of evolution by natural selection is solely to describe a mechanism by which organisms change and diversify as a function of time and selection. Instead, how and why life came about is more appropriately addressed by theories related to thermodynamics. Small, simple molecules form larger, more complex molecules because they are more energetically and entropically favorable (Dill and Bromberg). Models suggest that these same molecules self-assembled into collections of molecules similar to drops of oil in water (Gruber and Konig). As these 'pockets' of lower entropy continued to collect, they eventually developed into single-celled organisms.

These single-celled organisms can then take advantage of the energy from the sun to maintain order and life (Nelson). Only at this point can evolution take over to provide an explanation of how H. sapiens came to be. A common misconception of evolution is the assumption that, over time, organisms universally evolve from simplistic forms into more complex life forms (for example, the single-celled organism becomes multicellular; monkeys gave way to apes, which evolved into humans; Jakobi). Yet that Lamarckian view, that organisms only evolve towards more advanced forms with humans at the top of the evolutionary ladder, is one of the most common misconceptions about evolution (Gould). This means that, depending on the environmental context, successive generations of organisms can become less complex, less ordered, or may not change at all.

This lack of increase or even reduction in complexity can be seen in many examples at both the molecular and morphological levels. At the genomic level, many morphologically simple organisms, such as maize (Zea mays) and the water flea (Daphnia pulex), either have larger genomes (that is, a greater number of base pairs) or a larger number of genes than morphologically more complex organisms, including H. sapiens. The fossil records of horseshoe crabs, crocodiles, and sharks are all examples of organisms that have maintained relative stasis with respect to their respective morphological characteristics over millions of years.

In contrast, natural selection has selected for fewer toes in horses (for example, from five to one over approximately the past 50 million years; Simpson). Moreover, relative to their ancestors that were capable of flight, selection in ostriches has led to a reduction in the size of their wings, a reduction in the number of their toes from four to two, and the loss of the bony keel of the sternum that provided attachment points for flight muscles (Pycraft). These examples clearly demonstrate that evolution does not always result in an increase in complexity.

Or, put another way, stasis and 'simpler' forms arising over time are not in conflict with evolutionary theory (contra Yahya). Despite many examples illustrating that natural selection optimizes the form of organisms to their environment via the loss of characters, some lineages have become increasingly complex throughout their evolutionary history. Basically it is a question of scale, as the same lineage might be viewed as static or as evolving towards increased or decreased complexity depending on the level of biological organization (that is, genome, cellular, tissue) or the timeframe (deep time versus recent) one examines.

Creationists might argue that this complexity also suggests more order, which would decrease the entropy of an organism and, therefore, violate the second law of thermodynamics (for example, Morris, Yahya). The resulting assertion is that because thermodynamics is so well accepted and understood, evolution must be wrong.

Yet this line of reasoning against evolution hinges on the common misconceptions of thermodynamics and entropy that we have outlined above. The argument is analogous to suggesting that a cooling cup of coffee violates the second law of thermodynamics. The key concept for our coffee is that it is not an isolated system; it is in contact with the air.

Similarly, the Earth is not an isolated system because it constantly radiates energy into space and receives energy from the sun. Likewise, no species is an isolated system. Individuals of a species interact with other members of their species' population, with other species and with the environment (that is, well-studied and well-established ecological interactions). The second law of thermodynamics says that the entropy of a closed system reaches a maximum, not that the individual pieces of the system will.

Likewise, energy is absorbed and expended by all living organisms and, like the cooling cup of coffee, an organism can alter the environment it shares with other individuals. As with a species, a single organism is also not a closed system. Entropy is an essential and fundamental idea of thermodynamics, yet many people, scientists and non-scientists alike, have a major misunderstanding of the concept, despite the actual definition of entropy being quite simple: it is the natural log of the number of microstates that describe the macrostate, multiplied by Boltzmann's constant.
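In symbols, that definition is Boltzmann's relation, where Omega is the number of microstates compatible with the macrostate and k_B is Boltzmann's constant:

```latex
S = k_B \ln \Omega, \qquad k_B \approx 1.38 \times 10^{-23}\ \mathrm{J\,K^{-1}}
```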

It is a counting problem along the lines of: how many ways can we place books? Entropy has been misunderstood and misinterpreted since Rudolf Clausius introduced the term, and these misunderstandings and misinterpretations have only increased since Clausius's time. Currently, the most common misconceptions include equating disorder with entropy, believing it is possible to have negative entropy, and, finally, misunderstanding entropy's role in the second law of thermodynamics.
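Returning to the book-counting picture, here is a small illustrative script (the numbers and function names are mine; and, as noted earlier, books are not a thermodynamic system, so the "entropy" computed here is only an analogy for the counting involved):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def count_arrangements(n_books: int) -> int:
    """Number of distinct orderings of n distinguishable books on one shelf."""
    return math.factorial(n_books)

for n in (5, 10, 20):
    omega = count_arrangements(n)   # number of "microstates"
    s = K_B * math.log(omega)       # k_B ln(omega), by analogy only
    print(f"{n:2d} books: {omega} arrangements, k_B ln(omega) = {s:.2e} J/K")
```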

We have addressed each of these misconceptions in turn and hopefully shed some light on how they arose and how to address them in a classroom setting. From a biological perspective, clarifying the concept of entropy accomplishes two major goals. The first is to foster a correct and deeper understanding of the second law of thermodynamics, which plays a major role in all cellular systems.

The second goal is to address the misconceptions that underlie arguments against important concepts, including evolution (for example, Morris; Chick; Yahya). Using entropy to argue against evolution carries its own problems because of the misconceptions associated with both entropy and evolution.

Yet the misunderstandings associated with both concepts present a teachable moment from which any classroom can emerge with a deeper insight into how the seemingly disparate disciplines of physics and biology are linked.

References

Buckingham E: An outline of the theory of thermodynamics. New York, NY: Macmillan.
Chick JT: Big Daddy? Ontario, CA: Chick Publishing.
Clausius R: The mechanical theory of heat. London: John van Voorst.
Darwin C: The Origin of Species. London: John Murray.
Einstein A: Relativity, the special and the general theory.
Feynman RP: Statistical mechanics: a set of lectures. Reading, MA: Addison-Wesley.
Gould SJ: Wonderful life, the Burgess Shale and the nature of history.
Gruber B, Konig B: Self-assembled vesicles with functionalized membranes. Chemistry, 19(2).
Hawking S: A brief history of time. New York, NY: Bantam.
Hubble E: The realm of the nebulae.
Jakobi SR: 'Little monkeys on the grass': how people for and against evolution fail to understand the theory of evolution. Evolution: Education and Outreach, 3(3).
Klein JF: Physical significance of entropy or of the second law.
Nelson PC: Biological physics: energy, information, life.
Newton I: Philosophiae naturalis principia mathematica. London: S. Pepys, Reg. Soc. Praeses.
Peterson J: Understanding the thermodynamics of biological order. Am Biol Teach, 74(1).

But so does the two-phase gas: it has a certain symmetry and not others, deriving from the underlying environmental process.

So far so good. Where is the "disorder" exactly, and with respect to what and to whom is this a "disorder"? Clearly there is a very subjective (to say the least) concept of disorder in use here, which is not explained anywhere; it is just stated as fact, which it is not. Some take this further, equating entropy with death versus life, which is even more absurd.

One can have a series of cages perfectly ordered, yet one will not have life in them. Please consider this before you just accept anything thrown at you that sounds scientific while it is not. If you want the full scientific version of this answer, check especially the works of I.

Other schools of thermodynamics also have similar approaches and hard facts to consider. Refer to "What is the second law of thermodynamics and are there any limits to its validity?".

In the scientific and engineering literature, the second law of thermodynamics is expressed in terms of the behavior of entropy in reversible and irreversible processes. According to the prevailing statistical mechanics interpretation, entropy is viewed as a nonphysical statistical attribute, a measure of either disorder in a system, or lack of information about the system, or erasure of information collected about the system, and a plethora of analytic expressions are proposed for the various measures.

Here entropy emerges as a microscopic nonstatistical property of matter. Entropy is one of the most basic and yet least understood and least analysed facts, related directly to causality, the arrow of time, quantum mechanics and evolution. In fact, most if not all time-reversible equations are wrong, or at least crude approximations, rather than entropy and the arrow of time themselves. To quote the cosmologist Arthur Eddington:

If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations - then so much the worse for Maxwell's equations. If it is found to be contradicted by observation - well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.

Entropy is a tricky concept and hard to understand. When we hear that systems tend to increase entropy, we are saying there are dynamical laws driving them towards states of higher entropy.

But this comes from our knowledge that for simple systems with elementary microscopic behavior, like ideal gases or ideal liquids, when comparing two states of equilibrium, the one with higher entropy is more stable. This might be misunderstood as evidence that systems in general evolve by increasing entropy, which can be proven wrong.

In fact, the universe evolves in such a way that, instead of tending to be homogeneous, it is highly organised (galaxies, stars, planets, living beings). My approach to this would be twofold: first, microscopic dynamics is not elementary, which means that molecules have more degrees of freedom than we conceive of when we tend to think only in terms of entropy to predict the behavior of the system.

This is the same idea as Gibbs's when he extended classical thermodynamics by allowing the number of molecules to change, which accounts for systems in which reactions may occur. But we can think of other types of "qualitative changes" (as I like to call them), as did Terrell Hill in his conception of the Thermodynamics of Small Systems.

Secondly, I think we should not forget that the dynamics of the evolution of physical systems are fundamentally different from what we expect when we say that systems tend to increase their entropy; this is simply not verified and is, in my opinion, misleading.

A final note: temperature, like entropy, refers to equilibrated states and is also wrongly believed to behave the way energy does. But this is not the case: the dynamics of a system does not depend on temperature, but on the relative energies of the involved parties. Microscopically speaking, the collision dynamics depends on the relative energies or momenta, rather than on their average.

Also, in a non-equilibrated system, temperature (understood as mean kinetic energy) will fluctuate widely in space before the whole system reaches equilibrium. The entropy law can be comically reinterpreted as "equilibrium is a state of maximum possible disorder under the given physical constraints".

Intuitively, large entropy means that things look more or less the same macroscopically for many different microscopic realizations. When the system evolves, it is statistically easy to find yourself in one of the many high-entropy states, but only very rarely can you randomly stumble upon an ordered state. Imagine trying to shake a box of coins: what is the probability that you will get all tails? The equilibrium state (you keep shaking the box as a simulation of thermal motion) will be somewhere around half tails, half heads, plus or minus the standard deviation typical for this system, following the binomial distribution.
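A short numerical sketch of the coin analogy (the coin count and variable names are my own choices for illustration): the single all-tails microstate is astronomically outnumbered by the near-50/50 macrostates.

```python
import math

N = 100  # number of coins in the box

p_all_tails = 1 / 2**N                 # exactly one microstate out of 2^N
p_half = math.comb(N, N // 2) / 2**N   # the most probable macrostate (50 heads)
sigma = math.sqrt(N) / 2               # standard deviation of Binomial(N, 1/2)

print(f"P(all tails)     = {p_all_tails:.3e}")
print(f"P(exactly 50/50) = {p_half:.3e}")
print(f"typical outcome  = 50 +/- {sigma:.1f} heads")
```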

In another comparison, parents all over the world know that a child's room only gets messier until it reaches a state of chaos, this being the equilibrium state. You must put in work to make it tidy again, and it does not stay that way for very long.

I'm giving a common sense illustration because the physics has already been covered by other posts. People keep saying entropy is a difficult concept to grasp, but that's only if you don't explain it right.

Terms are conventions. From a human point of view, we are the order. Collecting things and arranging them on shelves is order. But I agree with you that ordering something requires energy and that this leads to disorder, and this could be a possible convention too. But it is not.
