This introductory physical chemistry course examines the connections between molecular properties and the behavior of macroscopic chemical systems.

From the course by University of Minnesota

Statistical Molecular Thermodynamics

From the lesson

Module 7

This module is relatively light, so if you've fallen a bit behind, you may have the opportunity to catch up. We examine the concept of the standard entropy made possible by the Third Law of Thermodynamics. The measurement of third-law entropies from constant-pressure heat capacities is explained, and, for gases, measured values are compared to values computed directly from molecular partition functions. The additivity of standard entropies is exploited to compute entropy changes for general chemical changes. Homework problems will provide you the opportunity to demonstrate mastery in the application of the above concepts.

- Dr. Christopher J. Cramer, Distinguished McKnight and University Teaching Professor of Chemistry and Chemical Physics

So, at this point, we've armed ourselves with the first and second laws of thermodynamics. And it's time to add the third and final arrow to our quiver: the Third Law of Thermodynamics.

Let me recall for you how I closed the last lecture.

And that is to note that an entropy change as you go from temperature 1 to temperature 2 can be determined by integrating the constant-pressure heat capacity divided by T over that temperature range. So if we start at 0 kelvin, that says we can assign an absolute entropy at a given temperature as the entropy at 0 plus the integral now ranging from the lower limit of 0 kelvin up to T2. Well, what about 0 kelvin?
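Written out in symbols (just restating the two relations above, with C_P the constant-pressure heat capacity):

```latex
\Delta S = S(T_2) - S(T_1) = \int_{T_1}^{T_2} \frac{C_P(T)}{T}\,dT,
\qquad
S(T_2) = S(0\,\mathrm{K}) + \int_{0}^{T_2} \frac{C_P(T)}{T}\,dT .
```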

And so here we have a picture of Walther Nernst, who was awarded the Nobel Prize in Chemistry in 1920, in part for his work on thermodynamics. And Nernst made the suggestion, based on a number of experimental studies, that the change in entropy for chemical reactions approaches 0 as the absolute temperature approaches 0. So that's the change in entropy of reactants going to products as the temperature goes to absolute zero. And Max Planck, who you'll recall from all the way back in the first week when we discussed quantum mechanics, had a further refinement on that suggestion. He said that the entropy of a pure substance approaches 0 at 0 kelvin. And so a statement of the third law of

thermodynamics is, every substance has a finite positive entropy, but at 0 kelvin

the entropy may become 0. And it does so in the case of a perfectly

crystalline substance, and in some sense that is the definition of a perfectly

crystalline substance. It is one where the entropy goes to zero,

as the temperature goes to zero. So, let's pause for a moment, because

we're going to be dealing with entropy. I'd like to give you a chance to review

one of the expressions of entropy we've seen up until now.

And after you've had a chance to verify that you remember it, we'll return to

keep looking at the third law. All right, well, the third law was

actually proposed before quantum theory was fully developed.

But statistical thermodynamics à la Boltzmann does provide some molecular

insight potentially into the third law. And so, if you remember, one of the

expressions for entropy, Boltzmann's own expression, is that S equals k log W, where W is a measure of the disorder. And so, at zero kelvin we do

expect that all of the systems in the ensemble will be in the lowest energy

state. That is that W would equal 1.

You'll recall when everything is in the same state, W takes on the value of one.

And in that case, you'd get k log W, k log 1, and the log of 1 is 0.

So, sure enough, the entropy would be equal to 0.
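As a tiny numerical sketch of that argument (the constant value and function name are mine, not the lecture's):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """Boltzmann's expression S = k log W, with W the number of
    microstates (the 'disorder') available to the system."""
    return k_B * math.log(W)

# At 0 kelvin every member of the ensemble sits in the same (ground)
# state, so W = 1 and the entropy is exactly zero.
print(boltzmann_entropy(1))  # 0.0
```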

There is another way of looking at that as well, and that is to say the

probabilistic expression for entropy. Namely, S is equal to minus k times the sum over

all the states, probability of being in that state times the log of the

probability of being in that state. Well, if everything is in one state, the

ground state, then the probability, I'll call it p0 here, is equal to 1, and

pj is equal to 0 for all the other states.

So you'd get, as you run over the states, for p0, 1 times the log of 1, and the log of 1 is 0.

So, that contributes 0 to the entropy. And all these 0s, you get 0 times the log

of 0. And while log of 0 is undefined, 0 times

that thing, L'Hopital's rule, I think I mentioned before, establishes that that too is 0. And so again this is consistent with Planck's hypothesis that the entropy would be 0 at 0 kelvin.
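A small numerical sketch of that bookkeeping (the function name and the explicit skipping of zero-probability terms are mine, not the lecture's):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum_j p_j log p_j over the states of the ensemble.
    Terms with p_j = 0 contribute 0 (the limit L'Hopital's rule
    gives), so they are simply skipped here."""
    return k_B * sum(-p * math.log(p) for p in probs if p > 0)

# Everything in the ground state: p0 = 1, every other pj = 0.
print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0
```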

Now there is a question that comes up, what if the ground state has degeneracy?

Well, let's take a look at that. Let's imagine that indeed the ground

state is degenerate. And so we'll work with the probability

expression for entropy. So, if the ground state is n-fold

degenerate, then there will be an equal probability, and that probability is 1 over n, of being in any of those n-fold degenerate states. And so, I would be summing, now, to a definite limit: I sum from one to n over the n degeneracies.

The probability is one over n, so I get the sum one over n log one over n.

Alright, and so since I'm going to add this together n times, that's like k times n times this expression. And n times 1 over n of course is just 1.

So all the stuff out front goes away except Boltzmann's constant.

And I'll change this negative sign to a positive sign by swapping from log 1 over

n to just log n. And so it says that the entropy for that

n fold degenerate ground state will be k log n.

And I'll ask you to remember that Boltzmann's constant is a very, very

small value, 1.38 times 10 to the minus 23 joules per kelvin.

And so for a single system, even when n is very, very large, that would still be

a very, very small number once multiplied times Boltzmann's constant.
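A quick numerical sketch of just how small k log n stays (the degeneracy value here is a made-up, deliberately enormous illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Hypothetical, absurdly large ground-state degeneracy:
n = 10**23
S = k_B * math.log(n)  # S = k log n for an n-fold degenerate ground state
print(S)  # ~7.3e-22 J/K -- still essentially zero on a macroscopic scale
```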

And so still very, very close to 0. Well, before manipulating things with the third law more and using them to assign third-law entropies, it's worthwhile maybe to touch on a little bit of history that I think is

pretty interesting with respect to the Third Law.

And to do that let me introduce you to William Giauque.

So he was a professor of chemistry at the University of California, Berkeley.

And he was awarded the Nobel prize in 1949 for his contributions in the field

of chemical thermodynamics, particularly concerning the behavior of substances at extremely low temperatures. And how might you access a very low temperature? We've actually already been exposed to a

way to do cooling. And that was to do adiabatic expansions

of ideal gases, and those gases cool.

something that's warmer, they'll suck the heat out of that warmer thing until

they're at equilibrium. And you can cycle them on and off to keep

pulling heat out of something. So let's just look at that process again

and reacquaint ourselves. So if we think of the vapor cycle

for refrigeration, I'll start with some gas and a piston on top of that gas and a

container. And if I compress it adiabatically, so I

isolate the system, it's not touching anything, it can't exchange heat with

anything. So I'll compress it, and that's adiabatic heating. I now have a smaller volume of gas, and

I've drawn it a little pinkish red; it's a warm gas.

And now I do put it in contact with some surroundings and I let it come into

equilibrium with those surroundings at a modest temperature.

So I allow heat to flow out of the system, so now I have a compressed gas at

a lower temperature. Then I expand it adiabatically and so

again I'm isolated so because it's expanding its temperature drops and so

it's become a blue gas, blue for cold. And now I place it in contact with

something that I would like to take the heat out of, that's at a warmer

temperature. Some of that heat will flow into the gas

and I can repeat this cycle. I keep dumping the heat into some outside

reservoir, where the thing I'm interested in cooling is what I keep taking the heat

from. So, I cycle again and again.

And I can cool something. And, how cold can I get something?

Well, that will depend on the gas. Because at some point, I will get to a

temperature of the thing I'm trying to cool, that is equal to the temperature at

which that gas liquefies. And I can't do a vapor cycle if I'm now

so cold that I don't have any vapor anymore, I just have a liquid.

Liquids are not compressible this way, and they don't do this sort of

refrigeration. And so you can ask yourself what is the

coldest you can get with a gas before it liquefies?

And the answer is helium. So helium only liquefies because of dispersion, which we've already discussed as an induced dipole-induced dipole interaction that brings otherwise non-polar molecules together. And helium liquefies around four kelvin. And actually, if you separate the isotopes

of helium you can go even a little colder.

You can get close to three kelvin with helium-3.

However, that's it, and physicists for a while thought maybe that's the limit of what you should ever expect to do. You can't get closer; there was no obvious process to get closer to absolute zero. And it was Giauque who had the idea of a different kind of refrigeration cycle. And it's called magnetic refrigeration.

And so, this works for certain materials, magnetic materials that have magnetic moments associated with molecules or atoms. It doesn't really matter for our purpose,

but let's just imagine that we have a material that is highly disordered.

So, a magnet has a direction, it's got a north and a south pole, if you like.

And at reasonably high temperatures, the interactions between the magnets are,

extremely weak. And so, thermally, they can point in all

sorts of directions and there's a lot of entropy associated with that, actually.

And what Giauque suggested, was, okay, if you place this material in a strong,

magnetic field. So that's what boldface H is here. It's not enthalpy; it's H representing a magnetic field.

Well, the spins, the magnetic moments I guess I'll call them, I won't

call them spins. The magnetic moments will align

themselves with the magnetic field. And you will have reduced the entropy

substantially. And because this is an adiabatic process, so this label still applies, this is adiabatic heating: the temperature will go up. So I did this while the system was

isolated. And if the entropy went down the

temperature had to go up, because there was no heat flow from the outside.

At that stage with all the spins aligned, you can put it in contact with something

into which it will dump its excess heat, lower its temperature, and come to some new temperature. Now turn the external magnetic field off

after isolating the system adiabatically. Well, the spins will, sorry, the magnetic

moments will return to a state of high entropy; they will unalign with each

other. And in the process, because it's

adiabatic, the temperature must drop, so the material will cool.

At that stage, you can put it in contact with whatever it is that you would like

to cool further, maybe it's actually your liquid helium.

You'd like to make that liquid helium even colder, and study its properties as

it goes below 3 or 4 kelvin. And we just keep that cycle going.

So, you'd turn the magnetic field on, turn the magnetic field off.

Constantly isolating to do the adiabatic processes, placing into contact with the

things you either want to cool or dump heat into.

And so with that process Giauque himself reached a temperature of 0.25 kelvin, so

a quarter of a kelvin. And much, much lower temperatures have

since been achieved by this process, which is also known as adiabatic

demagnetization. So, temperatures down to a thousandth of

a kelvin or lower. And to some extent that sounds a bit, you

know to go from a quarter to a thousandth, it's much much less than 1,

right? But remember that temperature appears as

a quantity in expressions like E to the minus delta G over RT.

And the difference between .25 and, say, .0025 is a factor of a hundred.

So it's not just a little bit of a degree; it's a hundred times hotter.
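A rough numerical sketch of that point. The energy gap below is hypothetical, chosen only for illustration; the lecture's expression uses delta G and R, and this is the analogous per-molecule factor using an energy and Boltzmann's constant:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Hypothetical energy gap (an assumption for this sketch), sized so it
# is easily crossed thermally at a quarter of a kelvin.
delta_E = 1.0e-24  # J

def boltzmann_factor(T):
    """Relative thermal population factor exp(-delta_E / (k T))."""
    return math.exp(-delta_E / (k_B * T))

# A hundredfold drop in temperature makes this factor collapse:
print(boltzmann_factor(0.25))    # ~0.75: the gap is easily crossed
print(boltzmann_factor(0.0025))  # ~3e-13: thermally inaccessible
```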

So there are enormous differences between what can happen at, say, 1, 2, or 3 kelvin. And it's been very interesting to study

the properties of materials at these very low temperatures.

But more importantly, from the point of view of what we began this week discussing, this establishes a way to get effectively

arbitrarily close to absolute zero and to have this base entropy, against which to

begin adding heat-capacity contributions in order to establish third-law entropies, absolute entropies. So we will look at that more as the

thermodynamics train continues on down this track.

And so next time we'll take a look at Standard Entropy.
