This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.

From the course by University of Minnesota

Statistical Molecular Thermodynamics

From the lesson

Module 6

This module introduces a new state function, entropy, that is in many respects more conceptually challenging than energy. The relationship of entropy to extent of disorder is established, and its governance by the Second Law of Thermodynamics is described. The role of entropy in dictating spontaneity in isolated systems is explored. The statistical underpinnings of entropy are established, including equations relating it to disorder, degeneracy, and probability. We derive the relationship between entropy and the partition function and establish the nature of the constant β in Boltzmann's famous equation for entropy. Finally, we consider the role of entropy in dictating the maximum efficiency that can be achieved by a heat engine based on consideration of the Carnot cycle. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. CramerDistinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

Chemistry

Alright, well, I think it's time for us to try to connect Entropy with the

Partition Function. So remember that we've derived partition

functions for monatomic, diatomic, polyatomic ideal gases.

And we've shown that there are many properties of ideal gases that we can compute directly

from these partition functions. Let's make a connection between entropy

and partition function. And so, let me remind you of something from an

earlier video this week: we established that the entropy of an

ensemble was the Boltzmann constant times capital A log A, minus the sum over j of little a sub j log a sub j.

Where that capital A was how many systems there are in the ensemble, and the little

a sub j was the population of state j among those systems.

So, those being the definitions, you can also talk about the average entropy

of a given system. And that is just going to be the total

entropy of the ensemble divided by the number of systems.

You can also talk about the probability p sub j of choosing a system in state j,

and that is, how many are there in state j divided by how many there are total.

So, in that case, I could just say aj is equal to pj times capital A.

So let me substitute those expressions into the ensemble entropy.

That is, I'll go from little a's to little p's times capital A.

So I get A log A minus this expression. I'll expand this out a bit, so multiply

through. Boltzmann's constant times A log A, minus k times A, and here's

this sum. This is a log of a product, so I'll just

keep playing this game: the log of a product is a sum of logarithms.

So I'll get a p log p term and I'll get a log A term, A is just a number it's a

constant. So what comes out is k A log A and then a

sum over j of all the probabilities. But that sum is just the number 1.

If I consider the probability over all of the systems that adds up to 1.

I will pick a system. So the first term, k A log A, and the

last term k A log A, those drop out. And I'm left with the entropy of the

ensemble is minus k times A, times the sum over j of p log p.

Okay, and that just expresses, yes indeed, the sum of all those

probabilities is one. Now if I were to divide both sides by A,

this A drops out on this term. And the entropy of the ensemble divided

by the total number of systems in the ensemble, that's the average system

entropy. And so, the system entropy is equal to

minus k, sum over the individual states, p log p.
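As a quick numerical sketch of this probability form (the probabilities below are made up purely for illustration; k is Boltzmann's constant in SI units):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum_j p_j ln p_j over the states of a system.

    Terms with p_j = 0 are skipped, since x ln x -> 0 as x -> 0."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Four equally probable states: S = k ln 4
s = gibbs_entropy([0.25, 0.25, 0.25, 0.25])
print(s)  # about 1.91e-23 J/K
```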

So this is another way to write entropy. We've seen a lot of ways to write

entropy: k log W, k log omega, and here we have minus k times a sum of

p log p. This is the probability form of the

entropy. And a couple of things if you're worried

about the fact that the probability could go to zero.

And the log of zero is negative infinity and that doesn't seem very good.

You can actually use L'Hopital's rule to establish that in the limit as x goes to

zero, x times log x is equal to 0; it does not

go to negative infinity, so that's nice. You'll also see that if all the

probabilities are 0, except for one, then for that single one, it'll be 1 times the

log of 1, log of 1 is 0, so I'll get the entropy is 0.

And that's what I expect, right? There's no disorder if everything is one

thing. In addition, you can show, and you would

have to use calculus to show this,

along with a special little trick: only n minus one of the probabilities are

independent. That last probability depends on all the

others. But if you play around with that you

might be able to prove to yourself, that the entropy is maximized when all the

probabilities are equal for all possible states.
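Both of these properties, zero entropy for a single certain state and maximum entropy for equal probabilities, are easy to spot-check numerically (a sketch in units of k, with the distributions chosen arbitrarily):

```python
import math

def entropy_over_k(probs):
    """Dimensionless entropy S/k = -sum_j p_j ln p_j,
    with the convention 0 ln 0 = 0 (the L'Hopital limit)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# One state certain, the rest impossible: no disorder at all
print(entropy_over_k([1.0, 0.0, 0.0]))   # 0.0

# The uniform distribution over 3 states beats a skewed one
print(entropy_over_k([1/3, 1/3, 1/3]))   # ln 3, about 1.0986
print(entropy_over_k([0.5, 0.3, 0.2]))   # about 1.0297
```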

But what I want to focus on now is that, remember, in the N, V, beta ensemble, or N, V, T,

since beta is just one over kT. In that ensemble, we had a way to define

the probability. It's e to the minus beta times the energy

sub j divided by the partition function, which is the sum over all possible

exponentials, all possible energies that is.

And so if I now swap that in for p, I get entropy is equal to minus Boltzmann's

Constant, sum over j. Here's my probability, here's the log of

my probability. And this is the log of a quotient.

So I'll take a difference of logs. So I get minus kb, here's this prefactor

term, log of an exponential. That just annihilates both those

functions. I'm just left with the argument of the

exponential, minus beta E sub j. And then meanwhile, I've got a minus log

q here, minus log q. So given this expression for the entropy.

I can manipulate it a little bit more, so recall that here I've got a probability.

Here I've got a beta, so that's a one over kt.

So if I pull all this out front, the ks will cancel and the negative signs

cancel, and I'm left with a one over T. And here's this energy term; meanwhile,

I've got a k times a log Q, times a sum of e to the minus beta E sub j over Q from this term.

So you can do the algebra yourself. But, what is this?

What is the sum of the probability weighted energies.

That is the internal energy. That defines the internal energy.

Meanwhile, what's this? The sum of e to the minus beta Ej over all possible

j's, that's the partition function Q. So this Q cancels this Q.

So, I have this relatively simple expression S is equal to U over T plus k

log Q. Now let me write that in a slightly more

traditional form, which recognizes that U depends on the partition function that

we've already derived. So we get S is equal to kT partial log Q

partial T plus k log Q. So probability weighted energy is the

internal energy that was a key step we used.

This sum is equal to the partition function, a key step we used.

And the take-home message, which is particularly important, is that entropy

can be computed directly from the partition function.
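One can verify that claim numerically for a toy set of levels (the three energies below are invented just to be on the order of kT at 300 K): the probability form, minus k sum of p log p, and the partition-function form, U over T plus k log Q, agree.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def two_forms_of_entropy(energies, T):
    """Boltzmann populations p_j = exp(-E_j / kT) / Q, then entropy
    computed two ways: -k sum p ln p versus U/T + k ln Q."""
    beta = 1.0 / (K_B * T)
    weights = [math.exp(-beta * E) for E in energies]
    Q = sum(weights)                                   # canonical partition function
    probs = [w / Q for w in weights]
    U = sum(p * E for p, E in zip(probs, energies))    # internal energy
    s_prob = -K_B * sum(p * math.log(p) for p in probs)
    s_partition = U / T + K_B * math.log(Q)
    return s_prob, s_partition

s1, s2 = two_forms_of_entropy([0.0, 2.0e-21, 5.0e-21], 300.0)
print(s1, s2)  # identical to within floating-point roundoff
```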

Just as we have been successful with internal energy, with pressure, with heat

capacity. So let me consider then the entropy of a

monatomic ideal gas. So remember this is Q, capital Q, for a

monatomic, ideal gas. It's got something coming from

translation, it's got something coming from the electronic degeneracy of the ground

state, and it's got an N factorial term. So, if I take the partial derivative of

the log of Q, with respect to T, well when I take the log, all these things

will separate out. Because logs take products, and

quotients, and make individual terms. The only thing that'll be left with a T is

the log of T, and there's a 3N over 2 power.

So 3N over 2 will come out. I'll get derivative of log T with respect

to T, that's 1 over T. So there's the log term I need to worry

about. Meanwhile, log Q itself, that takes a

little longer to work with. When I take this log it's convenient to

remember that there's a 1 over N factorial term here.

So I'll put this over here as minus log N factorial.

I'm going to take this N out of these two exponents and multiply the logarithm, and

just leave behind this argument. I'll use Sterling's approximation to

simplify the log factorial term, and then I will take this log of N.

It's minus N log N. Well, I've got a log minus another log.

So I can divide by N. They're both already multiplied by N.

So that's why N appears here in the denominator.

I've put it underneath the volume. So this is a convenient way to have the

log expressed. Because now, I can work with this

expression for the entropy. Here's my partial log of Q partial T that

I'm going to need. Here's my log of Q, that I'm going to

need on this side. I've run out of space on this slide, so

let me try to pack all that back in on another slide to finish the

derivation. If I take the molar entropy, that is, my

N values here are going to be Avogadro's number.

Well, then I will get a k times log Q. Well, here's N, Avogadro's number, so

that's going to introduce some R's. So, if you carry the multiplication all

the way out, here's the R times the log of all this quantity.

I'll get a k times Avogadro's number. Another factor of R, so that's just a

plain old R sitting off by itself. What do I get here?

I get 1 over T multiplies kT, the T's go away.

So I get Boltzmann's constant, times 3, times Avogadro's number over 2.

Well, the Avogadro's number times Boltzmann's constant is R, so I get three

halves R. We get three halves R from this term, and

another factor of R that came from this term.

That's where this five halves R comes from.

And meanwhile, the remaining piece here is this part of log Q being

multiplied times k, using Avogadro's number.

So Avogadro's number is now what appears here in the denominator.
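As a sanity check on the resulting molar entropy expression (a sketch assuming argon at 298.15 K and 1 bar, with a nondegenerate electronic ground state), it reproduces the tabulated standard molar entropy of roughly 155 J per mole per kelvin:

```python
import math

K_B = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34    # Planck constant, J s
N_A = 6.02214076e23   # Avogadro's number, 1/mol
R = K_B * N_A         # gas constant, about 8.314 J/(mol K)

def molar_entropy_monatomic(m, T, P, g_e1=1):
    """Molar entropy of a monatomic ideal gas (Sackur-Tetrode form):
    S = 5/2 R + R ln[(2 pi m k T / h^2)^(3/2) V / N_A] + R ln g_e1."""
    V = N_A * K_B * T / P                              # molar volume from the ideal gas law
    trans = (2.0 * math.pi * m * K_B * T / H**2) ** 1.5
    return 2.5 * R + R * math.log(trans * V / N_A) + R * math.log(g_e1)

m_Ar = 39.948e-3 / N_A                                 # mass of one argon atom, kg
S = molar_entropy_monatomic(m_Ar, 298.15, 1.0e5)
print(S)  # about 154.8 J/(mol K)
```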

So a couple things to notice about this expression.

One is, look, so let's pull all the way back to chemistry again and think about

concepts. What dictates whether the entropy is

large, lots of disorder, or small, not very much disorder?

Okay, well, let's just look at some of the terms that can be variable, one term

is the mass. So this is the mass of the gas.

And what we see here is that if it gets larger, the entropy will be larger.

And is that consistent with what we expect?

Well, this actually, if you recall, derives from the translational partition

function. As the mass gets larger, the density of

translational levels becomes greater. The levels get closer and closer

together. So they're more accessible.

There are more ways to distribute the gas in, the individual molecules that is, to

their translational levels, that is greater disorder.

So that's consistent with the way we should think about entropy.

What else can we control? We can control temperature, so as we

raise the temperature the entropy will increase.

And once again, the way to think about that is population of levels.

Now I haven't changed the spacing between the levels, they are whatever they are

for the gas, but by using a higher temperature, I can access more of those

levels. Right, the e to the minus something over

kT. As T gets bigger, the probability goes up

of getting into those levels, more disorder.

Volume, if we have a larger volume, the entropy goes up.

And so, that again makes sense. The spacing between the levels in

the particle-in-a-box solution depends on the volume those levels are in.

The bigger the volume, the denser the spacing, all right?

So this is all consistent. And then finally, a given gas, that may

have a larger electronic ground state degeneracy that will also influence

things. And that one, in a sense, is a little

bit more trivial to see. You know, if I've got a ground state that

can be spin up or spin down, let's say. Maybe it's the hydrogen atom as an ideal

gas, a slightly unusual ideal gas, but you can imagine it.

So up, down, same energy. There will be two possibilities and

that's greater disorder. So the entropy will increase by a term

of R log 2, as opposed to R log 1, which is zero if there is no degeneracy in the ground state.
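That degeneracy contribution is a fixed additive term, easy to evaluate on its own (R ln g, here with g = 2 for a doubly degenerate ground state):

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

# Extra molar entropy from a doubly degenerate electronic ground state;
# a nondegenerate ground state would contribute R ln 1 = 0.
delta_S = R * math.log(2)
print(delta_S)  # about 5.76 J/(mol K)
```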

Well, okay, so that is a, look at the monatomic ideal gas.

Let's pause here for a moment. I'm going to let you think about

implications for a diatomic ideal gas. Alright, hopefully, the concepts we've

talked about tie the partition function and the entropy

together, but moreover weave in the molecular

concepts and the molecular behavior of a gas,

so that's a little more clear. We always should approach these things and ask sort

of sanity questions. Is the formula consistent with

what I expect? Understanding the physics of the

molecules and the way they interact and their chemistry.

Next, what I want to do is come back to beta.

So we introduced beta some time ago and I told you to just accept on faith

essentially that it was 1 over Boltzmann's constant times the

temperature. But next I want to effectively prove that

finally.
