# OpenStax-CNX

Inside Collection (Textbook): *College Physics* by OpenStax College. Content endorsed by OpenStax College.

# Statistical Interpretation of Entropy and the Second Law of Thermodynamics: The Underlying Explanation

Module by: OpenStax College

Summary:

• Identify probabilities in entropy.
• Analyze statistical probabilities in entropic systems.

The various ways of formulating the second law of thermodynamics tell what happens rather than why it happens. Why should heat transfer occur only from hot to cold? Why should energy become ever less available to do work? Why should the universe become increasingly disorderly? The answer is that it is a matter of overwhelming probability. Disorder is simply vastly more likely than order.

When you watch an emerging rain storm begin to wet the ground, you will notice that the drops fall in a disorganized manner both in time and in space. Some fall close together, some far apart, but they never fall in straight, orderly rows. It is not impossible for rain to fall in an orderly pattern, just highly unlikely, because there are many more disorderly ways than orderly ones. To illustrate this fact, we will examine some random processes, starting with coin tosses.

## Coin Tosses

What are the possible outcomes of tossing 5 coins? Each coin can land either heads or tails. On the large scale, we are concerned only with the total heads and tails and not with the order in which heads and tails appear. The following possibilities exist:

5 heads, 0 tails
4 heads, 1 tail
3 heads, 2 tails
2 heads, 3 tails
1 head, 4 tails
0 heads, 5 tails
(1)

These are what we call macrostates. A macrostate is an overall property of a system. It does not specify the details of the system, such as the order in which heads and tails occur or which coins are heads or tails.

Using this nomenclature, a system of 5 coins has the 6 possible macrostates just listed. Some macrostates are more likely to occur than others. For instance, there is only one way to get 5 heads, but there are several ways to get 3 heads and 2 tails, making the latter macrostate more probable. Table 1 lists all the ways in which 5 coins can be tossed, taking into account the order in which heads and tails occur. Each sequence is called a microstate—a detailed description of every element of a system.

Table 1: 5-Coin Toss

| Macrostate | Individual microstates | Number of microstates |
| --- | --- | --- |
| 5 heads, 0 tails | HHHHH | 1 |
| 4 heads, 1 tail | HHHHT, HHHTH, HHTHH, HTHHH, THHHH | 5 |
| 3 heads, 2 tails | HHHTT, HHTHT, HHTTH, HTHHT, HTHTH, HTTHH, THHHT, THHTH, THTHH, TTHHH | 10 |
| 2 heads, 3 tails | TTTHH, TTHHT, THHTT, HHTTT, TTHTH, THTHT, HTHTT, THTTH, HTTHT, HTTTH | 10 |
| 1 head, 4 tails | TTTTH, TTTHT, TTHTT, THTTT, HTTTT | 5 |
| 0 heads, 5 tails | TTTTT | 1 |
| **Total** | | **32** |
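As a quick check (an illustration, not part of the original module), the counts in Table 1 can be reproduced by brute-force enumeration: list every 5-coin sequence and tally the sequences by macrostate.

```python
# Enumerate every microstate of a 5-coin toss and group them into
# macrostates (total number of heads), reproducing the counts in Table 1.
from itertools import product
from collections import Counter

microstates = list(product("HT", repeat=5))          # all 2^5 = 32 sequences
by_macrostate = Counter(s.count("H") for s in microstates)

for heads in range(5, -1, -1):
    print(f"{heads} heads, {5 - heads} tails: {by_macrostate[heads]} microstates")
print("Total:", len(microstates))                    # 32
```

The tally gives 1, 5, 10, 10, 5, 1, matching the table row by row.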

The macrostate of 3 heads and 2 tails can be achieved in 10 ways and is thus 10 times more probable than the one having 5 heads. Not surprisingly, it is equally probable to have the reverse, 2 heads and 3 tails. Similarly, it is equally probable to get 5 tails as it is to get 5 heads. Note that all of these conclusions are based on the crucial assumption that each microstate is equally probable. With coin tosses, this requires that the coins not be asymmetric in a way that favors one side over the other, as with loaded dice. With any system, the assumption that all microstates are equally probable must be valid, or the analysis will be erroneous.

The two most orderly possibilities are 5 heads or 5 tails. (They are more structured than the others.) They are also the least likely, only 2 out of 32 possibilities. The most disorderly possibilities are 3 heads and 2 tails and its reverse. (They are the least structured.) The most disorderly possibilities are also the most likely, with 20 out of 32 possibilities for the 3 heads and 2 tails and its reverse. If we start with an orderly array like 5 heads and toss the coins, it is very likely that we will get a less orderly array as a result, since 30 out of the 32 possibilities are less orderly. So even if you start with an orderly state, there is a strong tendency to go from order to disorder, from low entropy to high entropy. The reverse can happen, but it is unlikely.

Table 2: 100-Coin Toss

| Heads | Tails | Number of microstates |
| --- | --- | --- |
| 100 | 0 | 1 |
| 99 | 1 | $1.0\times10^{2}$ |
| 95 | 5 | $7.5\times10^{7}$ |
| 90 | 10 | $1.7\times10^{13}$ |
| 75 | 25 | $2.4\times10^{23}$ |
| 60 | 40 | $1.4\times10^{28}$ |
| 55 | 45 | $6.1\times10^{28}$ |
| 51 | 49 | $9.9\times10^{28}$ |
| 50 | 50 | $1.0\times10^{29}$ |
| 49 | 51 | $9.9\times10^{28}$ |
| 45 | 55 | $6.1\times10^{28}$ |
| 40 | 60 | $1.4\times10^{28}$ |
| 25 | 75 | $2.4\times10^{23}$ |
| 10 | 90 | $1.7\times10^{13}$ |
| 5 | 95 | $7.5\times10^{7}$ |
| 1 | 99 | $1.0\times10^{2}$ |
| 0 | 100 | 1 |
| **Total** | | $1.27\times10^{30}$ |
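The entries in Table 2 are binomial coefficients, $\binom{100}{\text{heads}}$. A short Python sketch (illustrative, not part of the original text) reproduces the tabulated values directly:

```python
# Each Table 2 entry is the binomial coefficient C(100, heads): the number
# of distinct head/tail sequences with that many heads.
from math import comb

for heads in (100, 99, 95, 90, 75, 60, 55, 51, 50):
    print(f"{heads} heads, {100 - heads} tails: {comb(100, heads):.1e}")
print(f"Total: {2**100:.2e}")   # all microstates: 2^100 ≈ 1.27e+30
```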

## Disorder in a Gas

The fantastic growth in the odds favoring disorder that we see in going from 5 to 100 coins continues as the number of entities in the system increases. Let us now imagine applying this approach to a small sample of gas. Because counting microstates and macrostates involves statistics, this is called statistical analysis. The macrostates of a gas correspond to its macroscopic properties, such as volume, temperature, and pressure; its microstates correspond to the detailed description of the positions and velocities of its atoms. Even a small amount of gas has a huge number of atoms: $1.0\text{ cm}^3$ of an ideal gas at 1.0 atm and $0^\circ\text{C}$ has $2.7\times10^{19}$ atoms. So each macrostate has an immense number of microstates. In plain language, this means that there are an immense number of ways in which the atoms in a gas can be arranged while still having the same pressure, temperature, and so on.

The most likely conditions (or macrostates) for a gas are those we see all the time—a random distribution of atoms in space with a Maxwell-Boltzmann distribution of speeds in random directions, as predicted by kinetic theory. This is the most disorderly and least structured condition we can imagine. In contrast, one type of very orderly and structured macrostate has all of the atoms in one corner of a container with identical velocities. There are very few ways to accomplish this (very few microstates corresponding to it), and so it is exceedingly unlikely ever to occur. (See Figure 2(b).) Indeed, it is so unlikely that we have a law saying that it is impossible, which has never been observed to be violated—the second law of thermodynamics.

The disordered condition is one of high entropy, and the ordered one has low entropy. With a transfer of energy from another system, we could force all of the atoms into one corner and have a local decrease in entropy, but at the cost of an overall increase in entropy of the universe. If the atoms start out in one corner, they will quickly disperse and become uniformly distributed and will never return to the orderly original state (Figure 2(b)). Entropy will increase. With such a large sample of atoms, it is possible—but unimaginably unlikely—for entropy to decrease. Disorder is vastly more likely than order.

The arguments that disorder and high entropy are the most probable states are quite convincing. The great Austrian physicist Ludwig Boltzmann (1844–1906)—who, along with Maxwell, made so many contributions to kinetic theory—proved that the entropy of a system in a given state (a macrostate) can be written as

$$S = k \ln W,$$
(2)

where $k = 1.38\times10^{-23}\text{ J/K}$ is Boltzmann's constant, and $\ln W$ is the natural logarithm of the number of microstates $W$ corresponding to the given macrostate. $W$ is proportional to the probability that the macrostate will occur. Thus entropy is directly related to the probability of a state—the more likely the state, the greater its entropy. Boltzmann proved that this expression for $S$ is equivalent to the definition $\Delta S = Q/T$, which we have used extensively.
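As an illustrative sketch (the value of $W$ is taken from Table 2; this snippet is not part of the original module), Boltzmann's formula can be evaluated for the most likely 100-coin macrostate:

```python
# Entropy of the 50 heads / 50 tails macrostate from S = k ln W,
# using W ≈ 1.0e29 from Table 2.
from math import log

k = 1.38e-23          # Boltzmann's constant, J/K
W = 1.0e29            # microstates for 50 heads, 50 tails
S = k * log(W)        # natural log, per S = k ln W
print(f"S = {S:.1e} J/K")   # ~9.2e-22 J/K
```

Even for the most probable macrostate of 100 coins, the entropy is minuscule on everyday scales, which is why entropy changes only become macroscopically meaningful for systems with enormous numbers of particles.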

Thus the second law of thermodynamics is explained on a very basic level: entropy either remains the same or increases in every process. This phenomenon is due to the extraordinarily small probability of a decrease, based on the extraordinarily larger number of microstates in systems with greater entropy. Entropy can decrease, but for any macroscopic system, this outcome is so unlikely that it will never be observed.

## Example 1: Entropy Increases in a Coin Toss

Suppose you toss 100 coins starting with 60 heads and 40 tails, and you get the most likely result, 50 heads and 50 tails. What is the change in entropy?

Strategy

Noting that the number of microstates is labeled $W$ in Table 2 for the 100-coin toss, we can use $\Delta S = S_f - S_i = k \ln W_f - k \ln W_i$ to calculate the change in entropy.

Solution

The change in entropy is

$$\Delta S = S_f - S_i = k \ln W_f - k \ln W_i,$$
(3)

where the subscript i stands for the initial 60 heads and 40 tails state, and the subscript f for the final 50 heads and 50 tails state. Substituting the values for WW size 12{W} {} from Table 2 gives

$$\Delta S = \left(1.38\times10^{-23}\text{ J/K}\right)\left[\ln\left(1.0\times10^{29}\right) - \ln\left(1.4\times10^{28}\right)\right] = 2.7\times10^{-23}\text{ J/K}.$$
(4)

Discussion

This increase in entropy means we have moved to a less orderly situation. It is not impossible for further tosses to produce the initial state of 60 heads and 40 tails, but it is less likely. There is about a 1 in 90 chance for that decrease in entropy ($-2.7\times10^{-23}\text{ J/K}$) to occur. If we calculate the decrease in entropy to move to the most orderly state, we get $\Delta S = -92\times10^{-23}\text{ J/K}$. There is about a 1 in $10^{30}$ chance of this change occurring. So while very small decreases in entropy are unlikely, slightly greater decreases are impossibly unlikely. These probabilities imply, again, that for a macroscopic system, a decrease in entropy is impossible. For example, for heat transfer to occur spontaneously from 1.00 kg of $0^\circ\text{C}$ ice to its $0^\circ\text{C}$ environment, there would be a decrease in entropy of $1.22\times10^{3}\text{ J/K}$. Given that a $\Delta S$ of $10^{-21}\text{ J/K}$ corresponds to about a 1 in $10^{30}$ chance, a decrease of this size ($10^{3}\text{ J/K}$) is an utter impossibility. Even for a milligram of melted ice to spontaneously refreeze is impossible.
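The numbers in this example can be checked with a short Python sketch (an illustration using the same Table 2 values, not part of the original module):

```python
# Reproduce Example 1: ΔS = k(ln Wf − ln Wi) for 60/40 → 50/50 heads/tails,
# plus the odds of the reverse fluctuation back to 60/40.
from math import log

k = 1.38e-23          # Boltzmann's constant, J/K
W_i = 1.4e28          # microstates for 60 heads, 40 tails (Table 2)
W_f = 1.0e29          # microstates for 50 heads, 50 tails
total = 1.27e30       # all microstates of 100 coins

dS = k * (log(W_f) - log(W_i))
print(f"ΔS = {dS:.1e} J/K")                     # ~2.7e-23 J/K
print(f"odds of 60/40: 1 in {total / W_i:.0f}")  # roughly 1 in 90
```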

## Problem-Solving Strategies for Entropy

1. Examine the situation to determine if entropy is involved.
2. Identify the system of interest and draw a labeled diagram of the system showing energy flow.
3. Identify exactly what needs to be determined in the problem (identify the unknowns). A written list is useful.
4. Make a list of what is given or can be inferred from the problem as stated (identify the knowns). You must carefully identify the heat transfer, if any, and the temperature at which the process takes place. It is also important to identify the initial and final states.
5. Solve the appropriate equation for the quantity to be determined (the unknown). Note that the change in entropy can be determined between any states by calculating it for a reversible process.
6. Substitute the known values along with their units into the appropriate equation, and obtain numerical solutions complete with units.
7. To see if it is reasonable: Does it make sense? For example, total entropy should increase for any real process or be constant for a reversible process. Disordered states should be more probable and have greater entropy than ordered states.

## Section Summary

• Disorder is far more likely than order, which can be seen statistically.
• The entropy of a system in a given state (a macrostate) can be written as
$$S = k \ln W,$$
(5)
where $k = 1.38\times10^{-23}\text{ J/K}$ is Boltzmann's constant, and $\ln W$ is the natural logarithm of the number of microstates $W$ corresponding to the given macrostate.

## Conceptual Questions

### Exercise 1

Explain why a building made of bricks has smaller entropy than the same bricks in a disorganized pile. Do this by considering the number of ways that each could be formed (the number of microstates in each macrostate).

## Problem Exercises

### Exercise 1

Using Table 2, verify the contention that if you toss 100 coins each second, you can expect to get 100 heads or 100 tails once in $2\times10^{22}$ years; calculate the time to two-digit accuracy.

#### Solution

It should happen twice in every $1.27\times10^{30}$ s, or once in every $6.35\times10^{29}$ s:

$$\left(6.35\times10^{29}\text{ s}\right)\left(\frac{1\text{ h}}{3600\text{ s}}\right)\left(\frac{1\text{ d}}{24\text{ h}}\right)\left(\frac{1\text{ y}}{365.25\text{ d}}\right) = 2.0\times10^{22}\text{ y}$$

### Exercise 2

What percent of the time will you get something in the range from 60 heads and 40 tails through 40 heads and 60 tails when tossing 100 coins? The total number of microstates in that range is $1.22\times10^{30}$. (Consult Table 2.)

### Exercise 3

(a) If tossing 100 coins, how many ways (microstates) are there to get the three most likely macrostates of 49 heads and 51 tails, 50 heads and 50 tails, and 51 heads and 49 tails? (b) What percent of the total possibilities is this? (Consult Table 2.)

#### Solution

(a) $3.0\times10^{29}$

(b) 24%

### Exercise 4

(a) What is the change in entropy if you start with 100 coins in the 45 heads and 55 tails macrostate, toss them, and get 51 heads and 49 tails? (b) What if you get 75 heads and 25 tails? (c) How much more likely is 51 heads and 49 tails than 75 heads and 25 tails? (d) Does either outcome violate the second law of thermodynamics?

### Exercise 5

(a) What is the change in entropy if you start with 10 coins in the 5 heads and 5 tails macrostate, toss them, and get 2 heads and 8 tails? (b) How much more likely is 5 heads and 5 tails than 2 heads and 8 tails? (Take the ratio of the number of microstates to find out.) (c) If you were betting on 2 heads and 8 tails would you accept odds of 252 to 45? Explain why or why not.

#### Solution

(a) $-2.38\times10^{-23}\text{ J/K}$

(b) 5.6 times more likely

(c) If you were betting on 2 heads and 8 tails, the odds of breaking even are 252 to 45, so on average you would break even. So, no, you wouldn't bet on odds of 252 to 45.

Table 3: 10-Coin Toss

| Heads | Tails | Number of microstates (W) |
| --- | --- | --- |
| 10 | 0 | 1 |
| 9 | 1 | 10 |
| 8 | 2 | 45 |
| 7 | 3 | 120 |
| 6 | 4 | 210 |
| 5 | 5 | 252 |
| 4 | 6 | 210 |
| 3 | 7 | 120 |
| 2 | 8 | 45 |
| 1 | 9 | 10 |
| 0 | 10 | 1 |
| **Total** | | **1024** |
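Since Table 3's counts are $\binom{10}{\text{heads}}$, a brief Python sketch (illustrative only, not part of the original module) can confirm both the table and the Exercise 5 answers:

```python
# Verify Table 3 via binomial coefficients, then check the Exercise 5
# answers: ΔS for 5H5T → 2H8T and the ratio of the two macrostates.
from math import comb, log

W = {heads: comb(10, heads) for heads in range(11)}
assert W[5] == 252 and W[2] == 45 and sum(W.values()) == 1024

k = 1.38e-23                          # Boltzmann's constant, J/K
dS = k * (log(W[2]) - log(W[5]))      # 5 heads, 5 tails → 2 heads, 8 tails
print(f"ΔS = {dS:.2e} J/K")           # ~ -2.38e-23 J/K
print(f"ratio: {W[5] / W[2]:.1f}")    # 5.6 times more likely
```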

### Exercise 6

(a) If you toss 10 coins, what percent of the time will you get the three most likely macrostates (6 heads and 4 tails, 5 heads and 5 tails, 4 heads and 6 tails)? (b) You can realistically toss 10 coins and count the number of heads and tails about twice a minute. At that rate, how long will it take on average to get either 10 heads and 0 tails or 0 heads and 10 tails?

### Exercise 7

(a) Construct a table showing the macrostates and all of the individual microstates for tossing 6 coins. (Use Table 3 as a guide.) (b) How many macrostates are there? (c) What is the total number of microstates? (d) What percent chance is there of tossing 5 heads and 1 tail? (e) How much more likely are you to toss 3 heads and 3 tails than 5 heads and 1 tail? (Take the ratio of the number of microstates to find out.)

#### Solution

(b) 7

(c) 64

(d) 9.38%

(e) 3.33 times more likely (20 to 6)

### Exercise 8

In an air conditioner, 12.65 MJ of heat transfer occurs from a cold environment in 1.00 h. (a) What mass of ice melting would involve the same heat transfer? (b) How many hours of operation would be equivalent to melting 900 kg of ice? (c) If ice costs 20 cents per kg, do you think the air conditioner could be operated more cheaply than by simply using ice? Describe in detail how you evaluate the relative costs.

## Glossary

macrostate:
an overall property of a system
microstate:
each sequence within a larger macrostate
statistical analysis:
using statistics to examine data, such as counting microstates and macrostates
