Statistical Interpretation of Entropy - Video Tutorials & Practice Problems


1

concept

Microstates and Macrostates of a System

Video duration: 6m


Hey guys, in this video we're going to introduce the concepts of a macrostate and a microstate of a system, and how they pertain to the entropy of the system. All right, let's get to it. So far, whenever we've been talking about the state of a system, we've simply been saying "state." That state is more properly referred to as a macrostate, macro meaning macroscopic. This means pressure, volume, temperature, internal energy, entropy, etc. Any of these macroscopic measurements you can make on a system define its macrostate. The macrostate is defined by the system's measurable thermodynamic properties: for an ideal gas, the pressure and the volume (and therefore the temperature, since they're related by the ideal gas law) define the system's macrostate. Something important to realize is that a particular macrostate is not actually unique to a system. A system can rearrange itself internally in multiple different ways that produce the same macrostate; it can have many, many different microscopic arrangements that produce the same macrostate. For instance, two samples of a gas can have the same temperature but different positions of the gas particles that make up each gas. That's a very easy example: in fact, a gas's particles are constantly changing position even when the gas is in thermal equilibrium and its temperature isn't changing. A microstate of a system is a single microscopic arrangement of the system that leads to a particular macrostate. Macrostates typically have multiple microstates, but there has to be at least one.
At least one microscopic arrangement has to exist for a macrostate; otherwise, what's the point of even considering it a macrostate? It wouldn't be possible for the system to arrange itself in that manner. The number of microstates for a particular macrostate is called its multiplicity, which is denoted by the capital Greek letter omega, Ω. A multiplicity is particular to a macrostate: you could have macrostate one, macrostate two, macrostate three, macrostate four, and they could all have different multiplicities, different numbers of microstates available to them. Let's do an example. Consider four coins. A particular macrostate for the system could be two coins heads-up and two coins heads-down. How many different microstates are there for this particular macrostate? Let's just list them: up-up-down-down, up-down-up-down, up-down-down-up, down-up-down-up, down-up-up-down, and down-down-up-up. Those are all the possible arrangements for this macrostate. Each of these is a microstate that still results in two coins heads-up, so the multiplicity of this particular macrostate is six: there are six different microstates available to the system in this macrostate. Mathematically, entropy is best defined in terms of how many microstates are available to a particular macrostate, that is, in terms of the multiplicity: the entropy is the Boltzmann constant, which we've seen before, 1.38 × 10⁻²³ J/K, times the natural logarithm (ln) of the multiplicity, S = k_B ln Ω. That's the entropy of a particular macrostate. Remember, different microstates of one macrostate all have the same entropy: entropy is a thermodynamic quantity measurable at the macroscopic level.
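The six-microstate count from the four-coin example can be checked by brute force. Here's a short sketch (the function name `count_microstates` is just illustrative) that enumerates all 2⁴ heads/tails configurations and counts those with exactly two heads-up:

```python
from itertools import product

def count_microstates(n_coins, n_heads):
    """Count the arrangements of n_coins that have exactly n_heads heads-up."""
    # Represent each coin as 1 (heads) or 0 (tails) and enumerate
    # all 2**n_coins possible configurations of the system.
    return sum(1 for state in product((0, 1), repeat=n_coins)
               if sum(state) == n_heads)

omega = count_microstates(4, 2)
print(omega)  # 6 — the multiplicity of the "two heads-up" macrostate
```

This direct enumeration only works for small systems, since the number of configurations grows as 2^N; for large N you'd use the binomial coefficient instead.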
All right, what's the entropy of the system of coins from the previous example, in the two-heads-up macrostate? Remember that the multiplicity of that macrostate is six. So the entropy of that macrostate, which is k_B ln Ω, is just 1.38 × 10⁻²³ times ln(6), which, if you plug it into a calculator, ends up being 2.47 × 10⁻²³ joules per kelvin. Now, based on this definition, the entropy is never going to be negative, because you can never have fewer microstates than one; remember, every macrostate has to have at least one microstate, and ln(1) = 0. So this is mathematically the best possible way to define entropy. As the disorder of a system increases, the number of microstates available for a particular macrostate increases, and so the entropy also increases. As the multiplicity goes up, as the number of available microstates goes up, the entropy goes up. That's why systems tend to move toward more disorder: more available microstates means more disorder and more entropy, and that's where the second law of thermodynamics says the system wants to go. Okay guys, that wraps up this discussion on microstates and macrostates and, specifically, how we can define entropy in terms of them. Thanks for watching, guys.
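The worked calculation above (S = k_B ln Ω with Ω = 6) can be reproduced in a couple of lines. This is a minimal sketch using the rounded Boltzmann constant quoted in the video:

```python
import math

K_B = 1.38e-23  # Boltzmann constant in J/K (rounded value used in the video)

def entropy(multiplicity):
    """Entropy of a macrostate with the given number of microstates, S = k_B ln(omega)."""
    return K_B * math.log(multiplicity)

S = entropy(6)          # the two-heads-up macrostate of four coins has omega = 6
print(f"{S:.3e} J/K")   # about 2.47e-23 J/K, matching the hand calculation
```

Note that `entropy(1)` returns exactly 0, which is the floor of this definition: a macrostate with a single microstate has zero entropy, and no macrostate can have negative entropy.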

2

Problem

The macrostate of a set of coins is given by the number of coins that are heads-up. If you have 100 coins, initially with 20 heads-up, what is ΔS when the system is changed to have 50 heads-up? Note that the multiplicity of k coins which are heads-up, out of N total coins, is $\Omega=\frac{N!}{k!\left(N-k\right)!}$. Does this change in macrostate satisfy the second law of thermodynamics?

A

-2.63 × 10^{-22} J/K

B

7.24 × 10^{-25} J/K

C

2.63 × 10^{-22} J/K

D

1.58 × 10^{-21} J/K
