What is entropy?
Entropy is simply a way to measure quantitatively what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Qualitatively, entropy is not a complicated concept. Most certainly, entropy is not disorder nor a measure of chaos, even though it is erroneously defined that way in dictionaries and pre-2002 sources. Because entropy is an index of the second law's predictions about energy, the short word "entropy" is often used interchangeably for the cumbersome phrase "the second law of thermodynamics". A concise summary of entropy's nature is: entropy change measures the dispersal of energy, that is, how much energy is spread out in a particular process, or how widely spread out it becomes (at a specific temperature). You can see now how hot pans cooling and chemical reactions belong to the 'how much' category, where energy is being transferred. Coffee in cream, gas expansion, and perfume in air are 'how widely' processes, where the initial energy of the molecules simply becomes more widely spread out in space.
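To make the two categories a little more concrete, here is a minimal sketch (my own illustration, not from the quoted source; the heat, temperatures, and gas amounts are made-up values). The 'how much' case is approximated by heat q leaving a hot object and entering cooler surroundings, the 'how widely' case by an ideal gas spreading into a larger volume at constant temperature:

```python
from math import log

R = 8.314  # gas constant, J/(mol*K)

def dS_heat_transfer(q, T_hot, T_cold):
    """'How much' case: entropy change when heat q (J) leaves a hot object
    at T_hot (K) and is absorbed by cooler surroundings at T_cold (K).
    Treats both temperatures as roughly constant during the transfer."""
    return q / T_cold - q / T_hot   # J/K; positive whenever T_hot > T_cold

def dS_isothermal_expansion(n, V1, V2):
    """'How widely' case: entropy change when n mol of ideal gas spreads
    from volume V1 to V2 at constant temperature (its energy is unchanged,
    only more widely spread out in space)."""
    return n * R * log(V2 / V1)     # J/K; positive whenever V2 > V1

# Illustrative numbers only (not from the quoted text):
print(dS_heat_transfer(q=1000.0, T_hot=350.0, T_cold=300.0))   # ~ +0.48 J/K
print(dS_isothermal_expansion(n=1.0, V1=1.0, V2=2.0))          # ~ +5.76 J/K
```

In both cases the number comes out positive, which is exactly what the second law predicts for these spontaneous processes.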
Some say entropy is a measure of chaos and others that it is a measure of the dispersal of energy (the two are related to each other through the "number of microstates", which causes confusion). The present definition of entropy is essentially identical with the formulation of the Second Law of Thermodynamics, basically saying that energy always tends to disperse, and when it does, the total entropy of the system AND its environment increases. This again has to do with reversible and irreversible processes, environments and systems, and the confusion becomes total. In any case, the dimension of entropy S is given by a relationship between energy Q and temperature T, as S = Q/T, thus in joule/kelvin (or Btu/Rankine, a medieval system of units dating back to when the difference between mass and weight was not known, unfortunately still in use …
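A small worked example may help connect the S = Q/T relation to the "system AND environment" claim above (the numbers are my own illustrative values, not from the quoted text): let heat Q flow irreversibly from a hot body at temperature T_h to cooler surroundings at T_c.

```latex
% The hot body (system) loses heat Q, the cooler environment gains it:
\Delta S_{\text{system}} = -\frac{Q}{T_h}, \qquad
\Delta S_{\text{environment}} = +\frac{Q}{T_c}
% The total change is positive whenever T_h > T_c:
\Delta S_{\text{total}} = Q\left(\frac{1}{T_c} - \frac{1}{T_h}\right) > 0
% Example: Q = 1000\,\mathrm{J},\; T_h = 500\,\mathrm{K},\; T_c = 300\,\mathrm{K}
% gives \Delta S_{\text{total}} = 1000\,(1/300 - 1/500) \approx +1.3\,\mathrm{J/K}.
```

The system's entropy goes down, the environment's goes up by more, and the sum increases, which is the content of the second law as stated in the quote.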
” was the only thing they were interested in. For that introductory Web page, I thought some practical examples like forest fires and rusting iron (or breaking surfboards and bones) would be a great introduction to the second law before talking about how it's all measured by "entropy". Wasn't that gradual approach OK?
S: Yeah. I think I understand everything pretty well, but I didn't want to take time to read any more in that secondlaw.com after what you called page six. What's new about entropy that you're going to talk about here that wasn't back there?
P: [[Just a sidenote to you who are reading this: sometime, click on www.secondlaw.com/ten.html, i.e., the "Last Page" of secondlaw.com. It's based on chemistry, but it's not textbook stuff. It could profoundly change your attitude about life, and the troubles that will hit you sooner or later.]] Now, back to entropy. Remember, some people are logging on to this site without having read secondlaw.com. To bring them up to speed, if you …
Entropy is a measure of energy degradation; entropy increases as the quality of an energy source degrades. The entropy of a system can be received from or given up to external sources, or it can be produced internally. The entropy received from an external source can be positive or negative. The internally produced entropy, which is also called internally generated entropy, must always be positive. Entropy received from an external source is essentially equal to the heat received or given up divided by the absolute temperature at which that heat is received or given up. The yellow reflected solar radiation on the left of the energy budget figure of section 8.1 has the same wavelength as the incoming solar radiation, and there is no change in its entropy. The long-wave infrared radiation on the right is emitted at a longer wavelength than the incoming short-wave solar radiation and therefore has more entropy than the incoming solar radiation.
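A minimal sketch of that heat-divided-by-temperature bookkeeping, using the simple S = Q/T relation from this passage (the emission temperatures below are assumed, typical textbook values, and are not taken from the section 8.1 figure): the same amount of energy carries far more entropy when it leaves at the low terrestrial emission temperature than when it arrived as short-wave solar radiation.

```python
def entropy_flux(Q, T):
    """Entropy carried by heat Q (J) exchanged at absolute temperature T (K),
    using the simple S = Q/T relation described in the text."""
    return Q / T

Q = 1000.0          # J of radiant energy (illustrative value)
T_solar = 5800.0    # K, approximate effective temperature of sunlight (assumed)
T_infrared = 255.0  # K, approximate effective emission temperature of Earth (assumed)

S_in = entropy_flux(Q, T_solar)       # ~ 0.17 J/K carried in by short-wave radiation
S_out = entropy_flux(Q, T_infrared)   # ~ 3.9  J/K carried out by long-wave radiation

# Per joule, the outgoing long-wave radiation carries far more entropy,
# which is the sense in which it "has more entropy" than the incoming sunlight.
print(S_in, S_out)
```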
Entropy, or more precisely "information entropy", is the measure of randomness. An intuitive understanding of information entropy relates to the amount of uncertainty about picking a password, i.e. an object that could be translated into a string of bits. "If you have a 32-bit word that is completely random, then it has 32 bits of entropy. If the 32-bit word takes only four different values, and each value has a 25% chance of occurring, then the word has 2 bits of entropy." (Practical Cryptography, B. Schneier and N. Ferguson, p. …)
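The quoted numbers can be reproduced with the standard Shannon formula H = -sum(p * log2(p)); the formula itself is not spelled out in the quote, so the short sketch below is my own illustration of it:

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A 32-bit word that takes only four values, each with a 25% chance -> 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

# A completely random 32-bit word has 2**32 equally likely values; for a uniform
# distribution the entropy reduces to log2(number of values) -> 32 bits.
print(log2(2 ** 32))                                # 32.0
```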