Hey guys. In this video, we're going to introduce the concepts of a macrostate and a microstate of a system, and how they pertain to the entropy of the system. Alright, let's get to it. So far, whenever we've talked about the state of a system, we've simply been saying the word "state." That state is more properly referred to as a macrostate, macro meaning macroscopic. A macrostate is defined by the system's measurable thermodynamic properties: pressure, volume, temperature, internal energy, entropy, and so on. Any of these macroscopic measurements that you can make on a system help define its macrostate. For ideal gases, the pressure and the volume, and therefore the temperature, since the three are related by the ideal gas law, define the system's macrostate. Something important to realize is that a particular macrostate is not actually unique to a system: the system can rearrange itself internally in multiple ways that all produce the same macrostate. Each of these microscopic arrangements is called a microstate. For instance, two samples of a gas can have the same temperature even though the positions of their gas particles differ; in fact, the gas particles are constantly changing position even when the gas is in thermal equilibrium. A macrostate can typically be realized by many microstates, but it has to have at least one. If no microscopic arrangement existed for a macrostate, there would be no point in even considering it, because the system could never arrange itself in that manner.
The number of microstates for a particular macrostate is called its multiplicity, which is given by the capital Greek letter Ω. A multiplicity is particular to a macrostate: macrostate 1, macrostate 2, macrostate 3, and macrostate 4 could all have different multiplicities, that is, different numbers of microstates available to them. Let's do an example. Consider four coins. A particular macrostate for the system could be two coins heads-up. How many different microstates are there for this macrostate? Let's just list them: up, up, down, down; up, down, up, down; up, down, down, up; down, up, down, up; down, up, up, down; and down, down, up, up. Those are all the possible arrangements for this macrostate, and each of them is a microstate that still results in two coins heads-up. There are 6, so the multiplicity of this particular macrostate is Ω = 6. Mathematically, entropy is best defined in terms of how many microstates are available to a particular macrostate, that is, in terms of the multiplicity: the entropy is S = k_B ln(Ω), where k_B is the Boltzmann constant we've seen before, 1.38×10⁻²³ J/K, and ln is the natural logarithm. That's the entropy of a particular macrostate; different macrostates will in general have different entropies. Entropy remains a thermodynamic quantity measurable at the macroscopic level. What's the entropy of the system of coins in the previous problem, in the two-heads-up macrostate? Remember that the multiplicity Ω is 6, so the entropy is just 1.38×10⁻²³ × ln(6), which, if you plug it into a calculator, ends up being 2.47×10⁻²³ J/K. Based on this definition, the entropy is never going to be negative, because you can never have fewer microstates than one, and ln(1) = 0.
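The coin count above can be brute-forced in a few lines of Python. This is not part of the video, just a sketch that enumerates every arrangement of four coins and keeps the ones in the two-heads-up macrostate:

```python
from itertools import product
from math import log

k_B = 1.38e-23  # Boltzmann constant, J/K

# Enumerate all 2^4 arrangements of 4 coins (True = heads-up) and keep
# the ones belonging to the "two coins heads-up" macrostate.
microstates = [s for s in product([True, False], repeat=4) if sum(s) == 2]

multiplicity = len(microstates)    # Ω = 6
entropy = k_B * log(multiplicity)  # S = k_B ln Ω ≈ 2.47e-23 J/K

print(multiplicity, entropy)
```

The list comprehension is the multiplicity count done by hand in the transcript; swapping the `== 2` condition selects a different macrostate.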
Remember, all macrostates have to have at least one microstate, so this is mathematically the best possible way to define entropy. As the disorder of a system increases, the number of microstates available for a particular macrostate increases, and so the entropy also increases: as the multiplicity goes up, the entropy goes up. That's why systems tend to move toward more disorder. More available microstates means more disorder and more entropy, and that's where the second law of thermodynamics says the system wants to go. Okay, guys, that wraps up this discussion of microstates and macrostates, and specifically how we can define entropy in terms of them. Thanks for watching, guys.
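To see how entropy tracks multiplicity, here is a small sketch (my own illustration, not from the video) that tabulates S = k_B ln(Ω) for every macrostate of the four-coin system, using the binomial coefficient C(4, k) as the multiplicity of the "k coins heads-up" macrostate:

```python
from math import comb, log

k_B = 1.38e-23  # Boltzmann constant, J/K

# One (k, multiplicity, entropy) triple per macrostate "k coins heads-up".
# Ω = C(4, k) counts the arrangements, and S = k_B ln Ω is the entropy.
table = [(k, comb(4, k), k_B * log(comb(4, k))) for k in range(5)]

for k, omega, S in table:
    print(f"{k} heads-up: Ω = {omega}, S = {S:.3e} J/K")
```

The all-heads and all-tails macrostates have Ω = 1 and therefore S = 0, while the most "disordered" macrostate, two heads-up, has the largest multiplicity (Ω = 6) and the largest entropy, which is the point the transcript is making.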

23. The Second Law of Thermodynamics



# Statistical Interpretation of Entropy


### Microstates and Macrostates of a System

Video duration: 6m

#### Video transcript


Problem

The macrostate of a set of coins is given by the number of coins that are heads-up. If you have 100 coins, initially with 20 heads-up, what is ΔS when the system is changed to have 50 heads-up? Note that the multiplicity of k coins which are heads-up, out of N total coins, is $\Omega=\frac{N!}{k!\left(N-k\right)!}$. Does this change in macrostate satisfy the second law of thermodynamics?

A. −2.63 × 10^{-22} J/K

B. 7.24 × 10^{-25} J/K

C. 2.63 × 10^{-22} J/K

D. 1.58 × 10^{-21} J/K
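To check the arithmetic for a problem like this, the given multiplicity formula can be evaluated directly with Python's standard-library binomial coefficient, `math.comb`. This is just a numerical sketch, assuming the k_B = 1.38×10⁻²³ J/K value used in the video:

```python
from math import comb, log

k_B = 1.38e-23  # Boltzmann constant, J/K

# Multiplicities Ω = N! / (k! (N-k)!) for 20 and 50 heads-up out of 100 coins.
omega_20 = comb(100, 20)
omega_50 = comb(100, 50)

# ΔS = S_final − S_initial = k_B ln(Ω_50) − k_B ln(Ω_20)
delta_S = k_B * (log(omega_50) - log(omega_20))
print(f"{delta_S:.2e} J/K")
```

Taking logarithms before subtracting keeps the enormous factorials manageable; ΔS comes out positive, consistent with the second law's requirement that the entropy of an isolated system not decrease.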


Additional resources for Statistical Interpretation of Entropy

PRACTICE PROBLEMS AND ACTIVITIES (11)

- CALC A lonely party balloon with a volume of 2.40 L and containing 0.100 mol of air is left behind to drift in...
- A box is separated by a partition into two parts of equal volume. The left side of the box contains 500 molecu...
- Your calculator can't handle enormous exponents, but we can make sense of large powers of e by converting them...
- (II) Suppose that you repeatedly shake six coins in your hand and drop them on the floor. Construct a table sh...
- (II) Suppose that you repeatedly shake six coins in your hand and drop them on the floor. Construct a table sh...
- (II) Calculate the probabilities, when you throw two dice, of obtaining(a) a 7
- (II) Calculate the probabilities, when you throw two dice, of obtaining(b) an 11
- A bowl contains many red, orange, and green jelly beans, in equal numbers. You are to make a line of 3 jelly b...
- A bowl contains many red, orange, and green jelly beans, in equal numbers. You are to make a line of 3 jelly b...
- Rank the following five-card hands in order of increasing probability: (a) four aces and a king; (b) six of he...
- (I) Use Eq. 20–14 to determine the entropy of each of the five macrostates listed in Table 20–1 on page 595.&l...