Thermodynamics: Explaining Entropy for KiltedEngineer

  • Thread starter: KiltedEngineer
  • Tags: Entropy
In summary, entropy is a measure of the number of possible states a system can be in and is related to the probability distribution of the system's configurations. The concept of "disorder" is not entirely accurate and is better understood through the relationship between microstates and macrostates. Entropy is also affected by thermodynamic processes, such as heat transfer, and the second law of thermodynamics states that the total entropy of an isolated system can never decrease.
  • #1
KiltedEngineer
I am a mechanical engineering student, but have yet to take thermodynamics. For months now I have been reading up on thermo and am actually quite interested in it. However, I am still quite confused by the concept of entropy. It's pretty much the consensus that the "disorder" explanation is not accurate, and that entropy is instead a measure of the energy that is wasted or unavailable to do useful work in a system. Can someone explain how this concept of disorder arises from the mathematical definition, and whether there is any truth to it?

Thank you!
KiltedEngineer
 
  • #2
One way to mathematically define entropy S is to relate it to the probability distribution describing a system: [tex]S=-k \sum_{i=1}^{N} p_i \log(p_i)[/tex]
The index i labels the different possible configurations of the system, of which there are N in total, k is Boltzmann's constant, and p_i is the probability that the system is in state i. [The p_i are all numbers between 0 and 1, so their logarithms lie between -∞ and 0; each term -k p_i log(p_i) is therefore non-negative, and the entropy is always greater than or equal to zero.]
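A minimal sketch of this formula in Python (the probability distributions are hypothetical, and Boltzmann's constant is set to 1 for readability):

[code]
import math

def entropy(probs, k=1.0):
    """Gibbs/Shannon entropy S = -k * sum(p_i * log(p_i)).
    Terms with p_i == 0 are skipped, since p*log(p) -> 0 as p -> 0."""
    return k * sum(-p * math.log(p) for p in probs if p > 0)

print(entropy([1.0]))                      # one certain state: 0.0
print(entropy([0.25] * 4))                 # four equally likely states: ~1.386
print(entropy([0.85, 0.05, 0.05, 0.05]))   # one dominant state: ~0.588
[/code]

The peaked distribution has lower entropy than the uniform one: the more we know about which state the system is in, the lower the entropy.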

If a system has only one possible state, then N=1 and p_1=1, and thus S=0. So if we know exactly the configuration of a system, it has zero entropy, and we can describe this state as "ordered." For example, if we have a crystal which is so cold that all the atoms sit exactly in their lattice sites, then we know the exact state of the crystal, so its entropy is zero. The regular pattern of the atoms' locations is what we refer to as the crystal's order.

On the other hand, if the system could be in many different states (i.e. N is big and many of the p_i are non-negligible compared to the rest), then the entropy will be large. As an example, when we heat up a crystal, the atoms start wiggling around their lattice sites, so if we were to look at the crystal at various times, the atoms could be in many different possible positions. Thus it has a higher entropy than a cold crystal. There is somewhat less order when the atoms do not conform exactly to the lattice pattern.
 
  • #3
"Disorder" is really an imprecise word to describe entropy because it makes you define the word "order". We're saying that ordered means there are relatively few microstates that correspond to the macrostate. Disorder just means the opposite, that a larger number of microstates correspond to the macrostate.

An example of how microstates and macrostates are related is rolling a pair of dice. There are 11 macrostates of the system, the sums 2 through 12. There are 36 microstates, the combinations of ways the two dice can land. Six microstates correspond to the macrostate "7", but only one microstate corresponds to the macrostate "2".
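A quick sketch that counts these for itself (Python; this uses Boltzmann's S = k ln Ω, the equal-probability special case of the formula in post #2, with k again set to 1 and Ω the number of microstates per macrostate):

[code]
from collections import Counter
from math import log

# Each ordered pair (d1, d2) is a microstate; their sum is the macrostate.
micro = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

for total in sorted(micro):
    omega = micro[total]   # number of microstates for this macrostate
    print(f"sum={total:2d}  microstates={omega}  S=ln(omega)={log(omega):.2f}")
[/code]

The macrostate "7" gets the highest entropy (ln 6 ≈ 1.79), while "2" and "12" get zero, exactly the "few microstates = ordered" picture above.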

We can talk about the entropy of a substance, like a gas, without referring to a thermodynamic process. You can think of it as a property, like temperature or pressure. When we measure the temperature of a system, it's really a kind of average over the system's energy. Ignoring units for a moment, say the average energy per particle is 10 and we have 3 particles in our system. All 3 particles can have an energy of 10, making the average 10. Alternatively, 1 of them can have energy 28 and the other 2 can have energy 1, making the average 10 again. There are a number of ways we can distribute the energy among the particles (microstates) and still get the same average (macrostate), just like in the dice example. Classically, the change in entropy is defined as the heat transferred reversibly divided by the temperature, dS = δQ/T.
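A small sketch of that counting (a hypothetical toy system: 3 particles with integer energy units and the total fixed at 30, so the average is 10):

[code]
from itertools import product

TOTAL, N = 30, 3   # total energy 30 shared by 3 particles -> average 10

# Each assignment of integer energies is a microstate; the fixed total
# (equivalently, the fixed average) is the macrostate.
microstates = [e for e in product(range(TOTAL + 1), repeat=N)
               if sum(e) == TOTAL]

print(len(microstates))             # 496 distinct ways to share the energy
print((10, 10, 10) in microstates)  # True
print((28, 1, 1) in microstates)    # True
[/code]

Even this tiny system has hundreds of microstates for one macrostate; for realistic particle numbers the count grows astronomically, which is one reason entropy is defined through a logarithm.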

Now, we also talk about entropy when we refer to a thermodynamic process, like heat transfer. Heat is always transferred from a higher temperature to a lower temperature, and as the energy is transferred, the entropy of the "hotter" source decreases and the entropy of the "colder" source increases. This happens because the number of accessible states is tied to the temperature of the substance. The second law of thermodynamics states that the combined entropy of the total system ("hot" and "cold" sources together) can never decrease as a result of the heat transfer. Since "disorder" is linked to the number of microstates that correspond to the macrostate, the "disorder" either stays the same or increases.
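As a worked example (the numbers are chosen for convenience, and the reservoirs are assumed large enough that their temperatures stay fixed): if Q = 100 J of heat flows from a reservoir at 500 K to one at 250 K, the total entropy change is

[tex]\Delta S = -\frac{Q}{T_{hot}} + \frac{Q}{T_{cold}} = -\frac{100\,\mathrm{J}}{500\,\mathrm{K}} + \frac{100\,\mathrm{J}}{250\,\mathrm{K}} = +0.2\,\mathrm{J/K}[/tex]

The colder reservoir gains more entropy than the hotter one loses, so the total increases, consistent with the second law.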
 

Related to Thermodynamics: Explaining Entropy for KiltedEngineer

1. What is thermodynamics?

Thermodynamics is the study of how energy is transferred and transformed within a system, and how this affects the physical properties of matter.

2. What is entropy?

Entropy is often described as a measure of the disorder or randomness of a system; more precisely, it measures the number of microstates consistent with a system's macrostate. It is a fundamental concept in thermodynamics and is used to describe the tendency of systems to move towards equilibrium.

3. How is entropy related to thermodynamics?

In thermodynamics, entropy is closely related to the second law, which states that the total entropy of an isolated system will never decrease over time. This means that heat will naturally flow from hotter regions to colder ones, increasing the total number of accessible microstates and hence the "disorder."

4. Can you explain entropy in simple terms?

Entropy can be thought of as a measure of the amount of energy that is unavailable for work in a system. As energy is transferred and transformed, some of it becomes unusable, leading to an increase in entropy.

5. What is the role of entropy in the universe?

The second law of thermodynamics, which is based on the concept of entropy, has implications for the entire universe. It predicts that the universe will eventually reach a state of maximum entropy, where all energy is evenly distributed and no useful work can be done. This is known as the heat death of the universe.
