Lecture 19: Interacting Particles Part 5

Description: This is the fifth of five lectures on Interacting Particles.

Instructor: Mehran Kardar

The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high-quality educational resources for free. To make a donation, or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR: Begin with a new topic, which is the breakdown of classical statistical mechanics. So we developed a formalism to probabilistically describe large collections of particles. And once we have that formalism, we can calculate from it properties of matter that have to do with heat, temperature, et cetera, and things coming to equilibrium. So the question is, is this formalism always successful?

And by the time you come to the end of the 19th century, there were several things hanging around that had to do with thermal properties of matter where this formalism was having difficulties. And the difficulties ultimately pointed to the emergence of quantum mechanics. So essentially, understanding the relationship between thermodynamics, statistical mechanics, and properties of matter was very important to the development of quantum mechanics.

And in particular, I will mention three difficulties. The most important one that really originally set the first stone for quantum mechanics is the spectrum of black body radiation. And it's basically the observation that you heat something. And when it becomes hot, it starts to radiate. And typically, the color of the radiation that you get is a function of temperature, but does not depend on the properties of the material that you are heating. So that has to do with heat. And you should be able to explain that using statistical mechanics.

Another thing that we have already mentioned has to do with the third law of thermodynamics, and let's say the heat capacity of materials such as solids. We mentioned this Nernst theorem that was the third law of thermodynamics, based on observation. A consequence of it was that the heat capacities of most things that you can measure go to 0 as you go to 0 temperature. We should be able to explain that, again, based on the phenomenology of thermodynamics and the rules of statistical mechanics.

Now, a third thing that is less often mentioned but is also important has to do with the heat capacity of diatomic gases such as the air in this room, which is composed of, say, oxygen and nitrogen, which are diatomic gases. So probably, historically, they were answered and discussed and resolved in the order that I have drawn for you. But I will go backwards. So we will first talk about this one, then about number two-- heat capacity of solids-- and finally about black body radiation. OK.

Part of the reason is that throughout the course, we have been using our understanding of the gas as the sort of measure of how well we understand thermal properties of matter. And so let's stick with the gas and ask, what do I know about the heat capacity of the gas in this room? So let's think about the heat capacity of a dilute diatomic gas.

It is a gas that is sufficiently dilute that it practically obeys the ideal gas law. So PV is roughly proportional to temperature. But rather than thinking about its pressure, I want to make sure I understand something about the heat capacity, another quantity that I can measure. So what's going on here?

I have, let's say, a box. And within this box, we have a whole bunch of these diatomic molecules. Let's stick to the canonical ensemble. So I tell you the volume of this gas, the number of diatomic molecules, and the temperature. And in this formalism, I would calculate the partition function. Out of that, I should be able to calculate the energy, heat capacity, et cetera. So what do I have to do?

I have to integrate over all possible coordinates that occur in this system. To all intents and purposes, the different molecules are identical. So I divide by the number of permutations, N factorial, that can be assigned to them.

And I said it is dilute enough that for all intents and purposes, the pressure is proportional to temperature. And that occurs, I know, when I can ignore the interactions between particles.

So if I can ignore the interactions between particles, then the partition function for the entire system would be the product of the partition functions that I would write for the individual molecules, or one of them raised to the N power. So what's the Z1 that I have to calculate?

Z1 is obtained by integrating over the coordinates and momenta of a single diatomic molecule. So I have a factor of d cubed p, I have a factor of d cubed q. But I have two particles, so I have d cubed p1, d cubed q1, d cubed p2, d cubed q2, and I have six pairs of coordinates and momenta. So I divide by h to the sixth.

I have e to the minus beta times the energy of this system, which is p1 squared over 2m plus p2 squared over 2m, and some potential of interaction that is responsible for binding these things together. So there is some V that is a function of q1 and q2 that binds the two particles together and does not allow them to become separated. All right, so what do we do here?

We realize immediately that for one of these particles, there is a center of mass that can go all over the place. So we change coordinates to, let's say, Q, which is q1 plus q2 over 2. And corresponding to the center of mass position, there is also a center of mass momentum, P, which is p1 plus p2. But when I make the change of variables from these coordinates to these coordinates, what I will get is a simple integral over the center of mass coordinates. So I have d cubed Q d cubed big P over h cubed.

And the only thing that I have over there is e to the minus beta p squared divided by 2 big M. Big M being the sum total of the two masses. If the two masses are identical, it would be 2M. Otherwise, it would be M1 plus M2.

And then I have an integration over the relative coordinate-- let's call that q-- and the relative momentum p, again divided by h cubed, of e to the minus beta p squared over 2 times the reduced mass. And then the potential, which is only a function of the relative coordinate.
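
For reference, written out in the notation used on the board (M the total mass, mu the reduced mass, V the interaction potential), the expression just described reads:

\[
Z_1 = \int \frac{d^3Q\, d^3P}{h^3}\, e^{-\beta P^2/2M} \int \frac{d^3q\, d^3p}{h^3}\, e^{-\beta\left[\frac{p^2}{2\mu} + V(q)\right]}
\]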

The point is that what I have done is I have separated out these 6 degrees of freedom that make up-- or actually, the 3 degrees of freedom and their conjugate momenta that make up a single molecule-- into some degrees of freedom that correspond to the center of mass and some degrees of freedom that correspond to the relative motion.

Furthermore, for the relative motion I expect that this potential, as a function of the separation, has a form with a minimum. Basically, the particles at 0 temperature would be sitting where this minimum is. So essentially, the shape of this diatomic molecule would be something like this if I find its minimum energy configuration. But then, I can allow it to move with respect to, say, the minimum energy separation. Let's say it occurs at some distance d. It can oscillate around this minimum value.

If it oscillates around this minimum value, it basically will explore the bottom of this potential. So I have this center of mass contribution to the partition function, and then a contribution that comes from these oscillations around the minimum. Let's call the displacement u. Then, there is the corresponding momentum. I don't know, let's call it pi. I divide by h and I have e to the minus beta pi squared over 2 mu. And then I have minus beta.

Well, to the lowest order, I have V of d, which is a constant. And then I have some curvature at the bottom of this potential that I choose to write as mu omega squared over 2, multiplying u squared. Essentially, what I want to say is that really, there is a vibrational degree of freedom, and there is a harmonic oscillator that describes it. The frequency of that is related to the curvature that I have at the bottom of this potential. So this degree of freedom corresponds to vibrations.
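
That is, expanding the potential around its minimum at separation d, with mu omega squared standing for the curvature there:

\[
V(d+u) \simeq V(d) + \tfrac{1}{2}\,\mu\omega^2 u^2, \qquad \mu\omega^2 \equiv V''(d).
\]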

But that's not the end of the story, because here I had three q's. One of them became the amplitude of this oscillation. So basically, the relative coordinate is a vector. One degree of freedom corresponds to stretching, but there are two other components of it. Those two other components correspond to essentially keeping the length of this fixed but moving in the other directions. What do they correspond to?

They correspond to rotations. So then there is essentially another partition function that I want right here. That corresponds to the rotational degrees of freedom.

Now, the rotational degrees of freedom have a momentum contribution, because this p also has three components. One component went into the vibrations. There are two more components that really combine to tell you about the angular momentum, and the energy that is proportional to the square of the angular momentum. But there is no restoring force for them. There is no corresponding term that is like this. So maybe I will just write that as an integral over the angles by which I can rotate this thing, and an integral over the two components of the angular momentum, divided by h squared. There are actually two angles. And the contribution is e to the minus beta angular momentum squared over 2I. So I wrote the entire thing.

So essentially, all I have done is I have taken the Hamiltonian that corresponds to two particles that are bound together and broken it into three pieces corresponding to the center of mass, to the vibrations, and to the rotations.

Now, the thing is that if I now ask, what is the energy that I would get for this one particle-- I guess I'll call this Z1-- what is the contribution of the one particle to the energy of the entire system?

I have minus the derivative of log Z1 with respect to beta. That's the usual formula to calculate energies. So I go and look at this entire thing. And where do the beta dependencies come from? Well, let's see.

So my Z1 has a part that comes from this center of mass. It gives me a V. We expect that. And then from the integration over the momenta, I will get something like 2 pi m over beta h squared to the 3/2 power.

From the vibrations-- OK, what do I have?

I have e to the minus beta V of d, which is a constant. We really don't care. But then there is the momentum integration that gives me root 2 pi mu divided by beta. There is a corresponding factor that comes from the Gaussian integration over u, which is the square root of 2 pi divided by beta mu omega squared. The entire thing has a factor of 1 over h. So this is the vibrations. And for the rotations, what do I get?

I will get a 4 pi from integrating over all orientations, divided by h squared. I have essentially the two components of the angular momentum, so I get the square of the square root of 2 pi I divided by beta-- that is, just 2 pi I divided by beta. So this is rotations. And this is the center of mass.

We can see that if I take that formula, take its log, and take a derivative with respect to beta-- first of all, I will get this constant that is the energy of the bound state at 0 temperature. But the more interesting things are the things that I get from the derivatives of the various factors of beta.

Essentially, for each factor of beta in the denominator, log Z will have a minus log of beta. I take a derivative, I will get a factor of 1 over beta. So from here, I will get 3/2 times 1 over beta, which is 3/2 kT. So this is the center of mass.

From here, I have two factors of beta to the 1/2. So they combine to give me one factor of kT. This is for vibrations. And similarly, I have two factors of beta to the 1/2, which correspond to 1 kT for rotations.

So then I say that the heat capacity at constant volume per particle is simply related to dE1 by dT. And I see that that amounts to kb times 3/2 plus 1 plus 1, or 7/2 kb per particle. Which says that if you go and calculate the heat capacity of the gas in this room and divide by the number of molecules that we have-- it doesn't matter whether they are oxygen or nitrogen; they would basically give the same contribution, because you can see that the masses and all the other properties of the molecule do not appear in the heat capacity-- then as a function of temperature, I should get a value of 7/2.
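
As a check on this counting of factors of beta, here is a minimal sympy sketch (symbols named to match the board; the grouping into center of mass, vibration, and rotation factors follows the discussion above):

```python
import sympy as sp

beta, m, mu, w, I, h, V_d, Vol = sp.symbols('beta m mu omega I h V_d V', positive=True)

Z1 = (Vol * (2*sp.pi*m/(beta*h**2))**sp.Rational(3, 2)          # center of mass
      * sp.exp(-beta*V_d) * sp.sqrt(2*sp.pi*mu/beta)
        * sp.sqrt(2*sp.pi/(beta*mu*w**2)) / h                    # vibrations
      * (4*sp.pi/h**2) * (2*sp.pi*I/beta))                       # rotations

# E1 = -d(log Z1)/d(beta); all the prefactors drop out of the derivative
E1 = -sp.diff(sp.expand_log(sp.log(Z1)), beta)
print(sp.simplify(E1))   # V_d + 7/(2*beta), i.e. E1 = V(d) + 7/2 kT
```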

So basically, C in units of kb. So I divide by kb. And my prediction is that I should see 7/2. So you go and do a measurement, and what do you get?

What you get is actually 5/2. So something is not quite right. We are not getting the 7/2 that we predicted. Except that I should really mention that you get this value when you do the measurements at room temperature. So when we measure the heat capacity of the gas in this room, we will get 5/2.

But if we heat it up, by the time we get to temperatures of a few thousand degrees Kelvin-- so if you heat the room by a factor of 5 to 10-- you will actually get the value of 7/2.

And if you cool it, by the time you get to the order of 10 degrees Kelvin or so, then you will find that the heat capacity actually goes down even further. It goes all the way to 3/2. And the 3/2 is the thing that you would have predicted for a gas that had monatomic particles, with no internal structure. Because then the only thing that you would have gotten is the center of mass contribution.

So it seems like by going to low temperatures, you somehow freeze the degrees of freedom that correspond to vibrations and rotations of the gas. And by going to really high temperatures, you are able to liberate all of these degrees of freedom and store energy in them. Heat capacity is the measure of the ability to store heat and energy into these molecules. So what is happening?

Well, by 1905, Planck had already proposed that there is some underlying quantization for the heat that you have in the black body case. And in 1905, Einstein said, well, maybe we should think about the vibrational degrees of freedom of the molecule also as being similarly quantized. So quantize vibrations.

It's totally a phenomenological statement. We have to justify it later. But the statement is that for the case where classically we had a harmonic oscillator. And let's say in this case we would have said that its energy depends on its momentum and its position or displacement-- I guess I called it u-- through a formula such as this. Certainly, you can pick lots of values of u and p that are compatible with any value of the energy that you choose.

But to get the black body spectrum to work, Planck had proposed that really what you should do is rather than thinking of this harmonic oscillator as being able to take all possible values, that somehow the values of energy that it can take are quantized. And furthermore, he had proposed that they are proportional to the frequency involved. And how did he guess that?

Ultimately, it was related to what I said about black body radiation-- that as you heat up the body, you will find that there's light that comes out, and the frequency of that light is somehow related to temperature and nothing else. And based on that, he had proposed that the energies at a particular frequency should come in packages that are proportional to that frequency. So there is an integer here, n, that tells you the number of these packets.

And not that it really matters for what we are doing now, but just to be consistent with what we currently know with quantum mechanics, let me add the 0 point energy of the harmonic oscillator here. So then, to calculate the contribution of a system in which energy is in quantized packages, you would say, OK, I will calculate a Z1 for these vibrational levels, assuming this quantization of energy. And so that says that the possible states of my harmonic oscillator have energies that are in these units h bar omega n plus 1/2.

And if I still continue to believe statistical mechanics, I would say that at a temperature t, the probability that I will be in a state that is characterized by integer n is e to the minus beta times the energy that corresponds to that integer n. And then I can go and sum over all possible energies and that would be the normalization of the probability that I'm in one of these states. So this is e to the minus beta h bar omega over 2 from the ground state contribution. The rest of it is simply a geometric series.

Geometric series, we can sum very easily to get 1 over 1 minus e to the minus beta h bar omega. And the interesting thing-- or a few interesting things-- about this expression is what we get if we evaluate it in the limit of low temperatures.

Well, actually, let's go first to the high temperature limit, where beta goes to 0. So T becomes large, beta goes to 0. The numerator goes to 1; in the denominator I can expand the exponential. And to lowest order, I will get 1 over beta h bar omega.
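
A quick numerical check of the closed form and of this high-temperature limit, a minimal sketch in units where h bar omega is 1:

```python
import numpy as np

def Z_sum(beta, nmax=2000):
    # direct sum over the quantized levels E_n = (n + 1/2), in units of hbar*omega
    n = np.arange(nmax)
    return np.sum(np.exp(-beta * (n + 0.5)))

def Z_closed(beta):
    # the geometric-series result quoted above
    return np.exp(-beta/2) / (1 - np.exp(-beta))

for beta in [0.01, 0.1, 1.0, 5.0]:
    print(beta, Z_sum(beta), Z_closed(beta), 1/beta)   # last column: 1/(beta*hbar*omega)
```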

Now, compare this result with the classical result that we have over here for the vibrational contribution of a harmonic oscillator to the partition function.

You can see that the mu's cancel out. I will get 1 over beta. I will get h divided by 2 pi. So if I call h divided by 2 pi to be h bar, then I will get exactly this limit. So somehow this constant that we had introduced, which had dimensions of action and was there to make our calculations of the partition function dimensionless, will be related to this h bar that quantizes the energy levels through the usual formula of h being h bar-- h bar being h over 2 pi. So basically, this quantization of energy clearly does not affect the high temperature limit. This oscillator at high temperature behaves exactly like what we had calculated classically. Yes?

AUDIENCE: Is it h equals h bar over 2 pi? Or is it the other way, based on your definitions above?

PROFESSOR: Thank you. Good. All right?

So this is, I guess, the corresponding formula. Now, when you go to low temperature, what do you get?

You essentially get the first few terms in the series. Because at the lowest temperature you get the term that corresponds to n equals to 0, and then you will get corrections from subsequent terms.

Now, what this does is that it affects the heat capacity profoundly. So let's see how that happens. So the contribution of one degree of freedom to the energy in this quantized fashion is minus d log Z by d beta. So if I just take the log of this expression, the log will have this factor of minus beta h bar omega over 2 from the numerator. The derivative of that will give you this ground state energy, which is always there. And then you have to take the derivative of the log of what is coming out here.

Taking a derivative with respect to beta, we'll always pick out a factor of h bar omega. Indeed, it will pick out a factor of h bar omega e to the minus beta h bar omega. And then in the denominator, because I took the log, I will get this expression back.

So again, in this expression, if I take the limit where beta goes to 0, what do I get?

I will get this h bar omega over 2. It's always there. Expanding these results here, I will have a beta h bar omega. It will cancel this and it will give me a 1 over beta. I will get this kT that I had before.

Indeed, if I work to the right order, I will just simply get 1 over beta. Whereas, if I go to large beta, what I get is this h bar omega over 2, plus a correction from here, which is h bar omega e to the minus beta h bar omega. And that will be reflected in the heat capacity, which is dE by dT.

This h bar omega over 2 does not contribute to the heat capacity, not surprisingly. From here, I have to take derivatives with respect to temperature. Temperatures appear in the combination h bar omega over kT. So what happens is I will get something that is of the order of h bar omega. And then from here, I will get another h bar omega divided by kb T squared. I will write it in this fashion and put the kb out here.

And then the rest of these objects will give me a contribution that is e to the minus h bar omega over kT, divided by 1 minus e to the minus h bar omega over kT, squared. The important thing is the following--

If I plot the heat capacity that I get from one of these oscillators-- and the natural units of all heat capacities are kb, essentially energy divided by temperature, as kb has those units. At high temperatures, what I can see is that the energy is proportional to kT. So the heat capacity of the vibrational degree of freedom will, in these units, be going to 1.

At low temperatures, however, it becomes this exponentially hard problem to create excitations. Because of that, you will get a contribution that as T goes to 0 will exponentially go to 0. So the shape of the heat capacity that you would get will be something like this.

The natural way to draw this figure is actually the way I did it: I made the vertical axis dimensionless, so it goes between 0 and 1. I can make the horizontal axis dimensionless by introducing a theta of vibrations, so that all of the exponential terms are of the form e to the minus theta of vibrations over T, which means that this theta of vibration is h bar omega over kb. That is, you tell me what the frequency of your oscillator is; I can calculate the corresponding temperature, theta. And then the heat capacity of a harmonic oscillator is this universal function: at some value of T over theta that is of the order of 1, it switches from being of the order of 1 to going exponentially to 0. So basically, the dependence down here to leading order is e to the minus theta of vibration over T. OK.
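
A minimal sketch of that universal curve in these dimensionless units (heat capacity per oscillator in units of kb, as a function of T over theta of vibration):

```python
import numpy as np

def c_vib(t):
    """Heat capacity of one quantized oscillator in units of kB; t = T/theta_vib."""
    x = 1.0 / np.asarray(t, dtype=float)          # x = hbar*omega/(kB*T) = theta_vib/T
    return x**2 * np.exp(-x) / (1 - np.exp(-x))**2

for t in [0.1, 0.3, 1.0, 3.0, 10.0]:
    print(t, c_vib(t))    # tends to 1 at high T, vanishes roughly as exp(-theta_vib/T) at low T
```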

So you say, OK, Planck has given us some estimate of what this h bar is, based on looking at the spectrum of black body radiation. We can, more or less, estimate the typical energies of interactions of molecules. And from that, we can estimate what this frequency of vibration is. So we should be able to get an order of magnitude estimate of what this theta of vibration is.

And what you find is that theta of vibration is of the order of 10 to the 3 degrees Kelvin. It depends, of course, on what gas you are looking at, et cetera. But as an order of magnitude, it is something like that. So we can now transport this curve that we have over here and more or less get this first part of the curve that we have over here.
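
For a rough sense of the number, with an illustrative molecular vibration frequency of order 10 to the 14 rad/s (an assumed value, just for the order of magnitude):

```python
hbar = 1.054571817e-34    # J s
kB = 1.380649e-23         # J/K
omega = 1.0e14            # rad/s, illustrative vibrational frequency
print(hbar * omega / kB)  # ~ 760 K, i.e. of the order of 10^3 K
```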

So essentially, in this picture what we have is that there are no vibrations. The vibrations have been frozen out. And here you have vibrations.

Of course, in all of the cases, you have the kinetic energy of the center of mass. And presumably since we are getting the right answer at very high temperatures now, we also have the rotations. And it makes sense that essentially what happened as we go to very low temperatures is that the rotations are also frozen out.

Now, that's part of the story-- actually, you would think that among all of the examples that I gave you, this last one should be the simplest thing, because it's really a two-body problem. Whereas for solids you have many things, and for radiation you have to think about the electromagnetic waves, et cetera. So you would think that, historically, this would be the one that is resolved first.

And indeed, as I said in 1905, Einstein figured out something about this. But this part dealing with the rotational degrees of freedom and quantizing them appropriately had to really wait until you had developed quantum mechanics beyond the statement that harmonic oscillators are quantized in energy. You had to know something more.

So since in retrospect we do know something more, let's finish and give that answer before going on to something else. OK?

So the next part of the story of the diatomic gas is quantizing rotations. So currently what I have is that there is, classically, an energy for rotations that is simply the kinetic energy of the rotational degrees of freedom. So there is an angular momentum L, and then there's L squared over 2I. It looks pretty much like P squared over 2M, except that the degrees of freedom for translational motion are positions-- they can be all over the place-- whereas the degrees of freedom that you have to think of in terms of rotations are angles that go between 0 and 2 pi, or live on the surface of a sphere, et cetera.

So once we figure out how to do quantum mechanics, we find that the allowed values of this are of the form h bar squared over 2I l, l plus 1, where l now is the number that gives you the discrete values that are possible for the square of the angular momentum.

So you say OK, let's calculate a Z for the rotational degrees of freedom assuming this kind of quantization. So what I have to do, like I did for the harmonic oscillator, is I sum over all possible values of l that are allowed, with the Boltzmann weight e to the minus beta h bar squared over 2I, l, l plus 1.

Except that there is one other thing, which is that these different values of l have degeneracy that is 2l plus 1. And so you have to multiply by the corresponding degeneracy. So what am I doing over here?

I have to do a sum over different values of l, contributions that are really the probability that I am in these different values of the index l-- 0, 1, 2, 3, 4, 5, 6. And I have to add all of these contributions.

Now, the first thing that I will do is I ask whether the limit of high temperatures that I had calculated before is correctly reproduced or not. So I have to go to the limit where temperature is high, or beta goes to 0. If beta goes to 0, you can see that going from one l to the next l changes this exponent only by a small amount. So what does that mean?

It means that the value of what I am summing over does not really change that much from one point to the next. And I can think of a continuous curve that goes through all of these points.

So if I do that, then I can essentially replace the sum with an integral. In fact, you can systematically calculate corrections to replacing the sum with an integral mathematically and you have a problem set that shows you how to do that. But now what I can do is I can call this combination l, l plus 1 x. And then dx will simply be 2l plus 1 dl.

So essentially, the degeneracy works out precisely so that when I go to the continuum limit, whatever quantization I had for these angular momenta corresponds to the weight or measure that I would have in stepping around the l-directions.

And then, this is something that I can easily do. It's just an integral dx e to the minus alpha x. The answer is going to be 1 over alpha, so the answer to this is simply 2I divided by beta h bar squared. So this is the classical limit of the expression that we had over here. Let's go and see what we had when we did things classically.

So when we did things classically, I had two factors of h, a 2 pi, and a 4 pi. So I can combine those into an h bar squared, and I have the 2, I have I, and then I have beta. And you can see that this is exactly what we have over there.
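
A minimal numerical check that the quantum sum approaches this classical limit, in units where h bar squared over 2I is 1 (so the classical answer is just 1 over beta):

```python
import numpy as np

def z_rot(beta, lmax=400):
    # sum over angular momentum levels: degeneracy (2l+1), energy l(l+1) in units of hbar^2/2I
    l = np.arange(lmax)
    return np.sum((2*l + 1) * np.exp(-beta * l * (l + 1)))

for beta in [0.01, 0.05, 0.2]:
    print(beta, z_rot(beta), 1/beta)   # the sum approaches the classical 1/beta as beta -> 0
```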

So once more, properly accounting for the phase space measure-- the products of p's and q's being made dimensionless by this quantity h-- is equivalent to the high temperature limit of what you would get in quantum mechanics, where things are discretized. Yes?

AUDIENCE: When you're talking about the quantum interpretations, then h bar is the precise value of Planck's constant, which can be experimentally measured.

PROFESSOR: Right.

AUDIENCE: But when you're talking about the classical derivations, h is just some factor that we mentioned, of the correct dimensions.

PROFESSOR: That's correct.

AUDIENCE: So if you're comparing the limits of large temperatures, how can you be sure that the h bar in the two places means the same thing?

PROFESSOR: So far, I haven't told you anything to justify that. So when we were doing things classically, we said that just to make things dimensionless, let's introduce this quantity that we call h.

Now, I have shown you two examples where if you do things quantum mechanically properly and take the limit of going to high temperatures, you will see that the h that you would get-- because the quantum mechanical partition functions are dimensionless quantities, right?

So these are dimensionless quantities. They have to be made dimensionless by something. They're made dimensionless by Planck's constant, h bar. And we can see that as long as we are consistent with this measure of phase space, the same constant shows up both for the case of the vibrations and for the case of the rotations. And very soon, we will see that it will also arise in the case of the center of mass.

And so there is certainly something in the transcriptions that we ultimately will make between quantum mechanics and classical mechanics that must account for this. And somehow in the limit where quantum mechanics is dealing with large energies, it is indistinguishable from classical mechanics. And quantum partition functions are-- all of the countings that we do in quantum mechanics are kind of unambiguous because we are dealing with discrete levels.

So if you remember, the original part of the difficulty was that we could define things like entropy properly only when we had discrete levels. If we had a continuum probability distribution and we made a change of variable, then the entropy changed. But in quantum mechanics, we don't have that problem. We have discretized values for the different states. Probabilities will be-- once we deal with them appropriately-- discretized. And all of the things here are dimensionless.

And somehow they reproduce the correct classical dynamics. Quantum mechanics goes to classical mechanics in the appropriate high-energy limit. And what we find is that what happens is that this shows up.

If you like, another way of achieving-- why is there this correspondence?

In classical statistical mechanics, I emphasized that this h really has the units of p times q. And it was only when I calculated partition functions in coordinates p and q that were canonically conjugate that I was getting results that were meaningful.

One way of constructing quantum mechanics is that you take the Hamiltonian and you change these into operators. And you have to impose these kinds of commutation relations. So you can see that somehow the same prescription in terms of phase space appears both in statistical mechanics, in calculating measures of partition function, in quantum mechanics. And not surprisingly, you have introduced in quantum mechanics some unit for phase space p, q. It shows up in classical mechanics as the quantity [INAUDIBLE].

But there is, indeed, a little bit more work than I have shown you here that one can do. Once we have developed the appropriate formalism for quantum statistical mechanics-- which is this [INAUDIBLE] performed and appropriate quantities defined for partition functions, et cetera, in quantum statistical mechanics, which we will do in a couple of lectures-- then if you take the limit h bar goes to 0, you should get the classical integration over phase space with this factor of h showing up. But right now, we are just giving you a heuristic argument.

If I go, however, to the other limit, where beta is much larger than 1, what do I get? Basically, then all of the weight is going to be in the lowest energy levels, l equals 0 and 1. And then the rest of them will be exponentially small. I cannot replace the sum with an integral, so basically I will get a contribution that starts with 1 for l equals 0. And then I will get 3 e to the minus beta h bar squared divided by 2I-- l being 1, this l, l plus 1 will give me 1 times 2, so I will have a 2 here. And then, higher-order terms.

So once you have the partition function, you go through the same procedure as we described before. You calculate the energy, which is minus d log Z by d beta. What do you get?

Again, in the high temperature limit you will get the same answer as before. So as beta goes to 0, you will get kT.

If you go to the low temperature limit-- well, let's be more precise. What do I mean by low temperatures? Beta larger than what?

Clearly, the combination that is appearing everywhere is this beta h bar squared over 2I, which has a 1 over temperature in it from the beta. So I can introduce a theta for rotations to make this manifestly dimensionless. So the theta that goes with rotations is h bar squared over 2I kb.

And so what I mean by going to low temperatures is that I go to temperatures that are much less than the theta of these rotations. And then what happens is that essentially this state will occur with exponentially small probability, and will contribute to the energy an amount that is of the order of h bar squared over 2I times 2-- that's the energy of the l equals 1 state. There are three of them, and they occur with probability e to the minus theta of rotation divided by T, times a factor of 2 in the exponent.

All of those factors are not particularly important. Really, the only thing that is important is what happens if I look now at the rotational heat capacity, which again should properly have units of kb, as a function of temperature. Well, temperature I have to make dimensionless by dividing by this rotational theta.

I say that at high temperature, I get the classical result back. So basically, I will get to 1 at high temperatures. At low temperatures, again I have this situation that there is a gap in the allowed energies. So there is the lowest energy, which is 0. The next one, the first type of rotational mode that is allowed, has a finite energy that is larger than that by an amount that is of the order of h bar squared over I.

And if I am at these temperatures that are less than this theta of rotation, I simply don't have enough energy from thermal fluctuations to get to that level. So the occupation of that level will be exponentially small. And so I will have a curve that will, in fact, look something like this.

So again, you basically go over at a temperature of the order of 1 from heat capacity that is order of 1 to heat capacity that is exponentially small when you get to temperatures that are lower than this rotational temperature.

AUDIENCE: Is that over-shooting, or is that--

PROFESSOR: Yes. So you have a problem set where you calculate the next correction. So there is this procedure of replacing the sum with an integral. That gives you this to the first order, and then there's a correction. And you will show that the correction is such that the approach to one for the case of the rotational heat capacity is actually from above. Whereas, for the vibrational heat capacity, it is from below. So there is, indeed, a small bump. OK?
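
The bump can be seen directly from the sum. A short numerical sketch of the rotational heat capacity, in units where h bar squared over 2I and kb are both 1 (so theta of rotation is 1), using C over kb equal to beta squared times the second derivative of log Z:

```python
import numpy as np

def lnZ(beta, lmax=200):
    l = np.arange(lmax)
    return np.log(np.sum((2*l + 1) * np.exp(-beta * l * (l + 1))))

def c_rot(T, db=1e-4):
    # C/kB = beta^2 d^2(ln Z)/d beta^2, evaluated here by finite differences
    b = 1.0 / T
    return b**2 * (lnZ(b + db) - 2*lnZ(b) + lnZ(b - db)) / db**2

for T in [0.3, 0.5, 0.8, 1.0, 2.0, 5.0]:
    print(T, c_rot(T))   # rises above 1 near T ~ theta_rot, then settles back down to 1
```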

So you can ask, well, I know the typical size of one of these oxygen molecules. I know the mass. I can figure out what the moment of inertia I is. I put it over here and I figure out what the theta of rotation is. And you find that, again, as a matter of order of magnitude, the theta of rotations is of the order of 10 degrees K. So this kind of accounts for why, when you go to sufficiently low temperatures for the heat capacity of the gas in this room, we see that essentially the rotational degrees of freedom are also frozen out. OK.
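
With illustrative numbers for a diatomic molecule like O2 (a reduced mass of about 8 amu and a bond length of about 1.2 angstrom, both assumed here just for the estimate):

```python
hbar = 1.054571817e-34    # J s
kB = 1.380649e-23         # J/K
amu = 1.66053907e-27      # kg
mu = 8 * amu              # reduced mass of a diatomic like O2 (illustrative)
d = 1.2e-10               # bond length in meters (illustrative)
I = mu * d**2             # moment of inertia
print(hbar**2 / (2 * I * kB))   # ~ 2 K, i.e. of the order of 1-10 K
```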

So now let's go to the second item that we have, which is the heat capacity of the solid. So what do I mean?

So this is item 2, heat capacity of a solid. And you measure heat capacities for some solid as a function of temperature. And what you find is that the heat capacity has a behavior such as this. So it seems to vanish as you go to lower and lower temperatures. So what's going on here?

Again, Einstein looked at this and said, well, it's another case of the story of vibrations and some things that we have looked at here. And in fact, I really don't have to do any calculation. I'll do the following.

Let's imagine that this is what we have for the solid. It's some regular arrangement of atoms or molecules. And presumably, this is the situation that I have at 0 temperature. Everybody is sitting nicely where they should be to minimize the energy.

If I go to finite temperature, then these atoms and molecules start to vibrate. And he said, well, basically, I can estimate the frequencies of vibrations. And what I will do is I will say that each atom is in a cage formed by its neighbors. That is, this particular atom here, if it wants to move, finds that its distance to the neighbors has been changed.

And if I imagine that there are kinds of springs that connect this atom only to its neighbors, then as it moves around there will be some kind of restoring force. So it's like it is sitting in some kind of a harmonic potential. And if it tries to move, it will experience this restoring force. And so it will have some kind of a frequency. So each atom vibrates at some frequency. Let's call it omega E.

Now, in principle, in this picture, if this cage is not exactly symmetric, you may imagine that oscillations in the three different directions could give you different frequencies. But let's ignore that and imagine that the frequency is the same in all of these directions. So what have we done?

We have reduced the problem of the excitation energy that you can put in the atoms of the solid to that of 3N harmonic oscillators of frequency omega E. Why 3N?

Because each atom essentially sees a restoring force in three directions. And forgetting about boundary effects, it's basically three per particle. So you would have said that the heat capacity that I would calculate per particle, in units of kb, should essentially be exactly what we have over here, except that I multiply by 3 because each particle has 3 possible degrees of freedom. So all I need to do is to take that green curve and multiply it by a factor of 3. And indeed, the limiting value that you get over here is 3.

Except that if I just take that green curve and superpose it on this, what I will get is something like this. So this is 3 times the harmonic oscillator. What I mean by that is I try to do my best to match the temperature at which you go from one to the other. But then what I find is that, as we had established before, the green curve goes to 0 exponentially. So there is going to be some theta associated with this frequency-- let's call it theta Einstein-- and the decay goes as e to the minus theta Einstein divided by T.

And so the prediction of this model is that the heat capacity should vanish very rapidly, with this exponential form. Whereas, what is actually observed in experiment is that it goes to 0 proportionally to T cubed, which is a much slower type of decay. OK?

AUDIENCE: That's negative [INAUDIBLE]?

PROFESSOR: As T goes to 0, the heat capacity goes to 0. T to the third power. So it's the limit-- did I make a mistake somewhere else? All right. So what's happening here?

OK, so what's happening is the following. In some average sense, it is correct that if you try to oscillate some atom in the crystal, it's going to have some characteristic restoring force. The characteristic restoring force will give you some corresponding typical scale for the frequencies of the vibrations. Yes?

AUDIENCE: Is this the historical progression?

PROFESSOR: Yes.

AUDIENCE: I mean, it seems interesting that they would know that-- like this cage hypothesis is very good, considering where a quantum [INAUDIBLE] exists. I don't understand how that's the logic based-- if what we know is the top board over there, the logical progression is that you would have-- I don't know.

PROFESSOR: No. At that time, the proposal was that essentially if you have an oscillator of frequency omega, its energy is quantized in multiples of h bar omega. So that's really the only aspect of quantum mechanics. So I actually jumped ahead of the historical development when I gave you the rotational degrees of freedom.

So as I said, historically this part was resolved last, because they didn't know what to do with rotations. But now I'm saying that you know about rotations, you know that the heat capacity goes to 0. You say, well, think of what the solid is composed of. The way that you put heat into the system, and hence its heat capacity, is that there is kinetic energy that you put in the atoms of the solid. And as you try to put in kinetic energy, there is this cage model and there's a restoring force.

The thing that is wrong about this model is that, basically, if you ask how easy it is to give energy to the system, if rather than having one frequency you have multiple frequencies, then at low temperatures you would put energy in the lower frequencies. Because, as we saw, the typical scales connecting temperature and frequency are kind of proportional to each other. So if you go to low temperature, you are bound to excite things that have low frequency.

So the thing is that it is true that there is a typical frequency. But the typical frequency becomes less and less important as you go to low temperature. The issue is, what are the lowest frequencies of excitation?

And basically, the correct picture of the excitations of the solid is that you bang on something and you generate these sound waves. So what you have is that oscillations or vibrations of the solid are characterized by a wavelength, or a wave number k equal to 2 pi over lambda.

So if I really take a better model of the solid in which I have springs that connect all of these things together and ask, what are the normal modes of vibration?

I find that the normal modes can be characterized by some wave number k. As I said, it's the inverse of the wavelength. And the frequency depends on the wave number, in a manner such that when you go to k equals 0, the frequency goes to 0. And why is that?

Essentially, what I'm saying is that if you look at particles that are along a line and may be connected by springs. So a kind of one-dimensional version of a solid. Then, the normal modes are characterized by distortions that have some particular wavelength. And in the limit where the wavelength goes to 0, essentially--

Sorry, in the limit where the wavelength goes to infinity or k goes to 0, it looks like I am taking all of the particles and translating them together. And if I take the entire solid here and translate it, there is no restoring force. So omega has to go to 0 as your k goes to 0, or wavelength goes to infinity. And there is a symmetry between k and minus k, in fact, that forces the restoring force to be proportional to k squared. And when you take the square root of that, you get the frequency. You always get a linear behavior as k goes to 0.

So essentially, that's the observation that whatever you do with your solid, no matter how complicated, you have sound modes. And sound modes are things that happen in the limit where you have long wavelengths and there is a relationship between omega and k through some kind of velocity of sound.

Now, to be precise, there are really three types of sound waves. If I choose the direction k along which I want to create an oscillation, the distortions can be either along that direction or perpendicular to it. They can be either longitudinal or transverse. So there could be one or two other branches. So there could, in principle, be different straight lines as k goes to 0.

And the other thing is that there is a shortest wavelength that you can think about. So if these particles are a distance a apart, there is no sense in going to wave numbers that are larger than pi over a. So you have some limit to these curves. And indeed, when you approach the boundary, this linear dependence can shift and change in all kinds of possible ways.

And calculating the frequencies inside one of these units, which is called a Brillouin zone, is a nice thing to do using the methods of solid state physics. And you've probably seen that. And there is a whole spectrum of frequencies as a function of wave number that correctly characterizes a solid.

So it may be that somewhere in the middle of this spectrum is a typical frequency omega E. But the point is that, because of these factors of e to the minus beta h bar omega, as you go to lower and lower temperature, the only things that get excited are omegas that are also going to 0, proportionately to kT.

So I can draw a line here that corresponds to frequencies that are of the order of kT over h bar. All of the harmonic oscillators that have these larger frequencies that occur at short wavelengths are unimportant. They're kind of frozen, just like the vibrations of the oxygen molecules in this room are frozen. You cannot put energy in them. They don't contribute to heat capacity.

But all of these long wavelength modes down here have frequencies that go to 0. Their excitation probability is large. And it is, indeed, these long wavelength modes that are easy to excite and contribute to the heat capacity. I'll do maybe the precise calculation next time, but even within this picture we can figure out why the answer should be proportional to T cubed.

So what I need to do, rather than counting all harmonic oscillators-- the factor of 3N-- is to count how many oscillators have frequencies that are less than this kT over h bar. So I claim that the number of modes with frequency less than kT over h bar goes like kT over h bar v, cubed, times the volume V.

Essentially, what I have to do is to do a summation over all k that is less than some k max. This k max is set by the condition that v k max is of the order of kT over h bar, where v is the sound velocity. So this k max is of the order of kT over h bar v.

So actually, to be more precise I have to put a V here. So I have to count all of the modes.

Now, the separation between these modes-- if you have a box of size L-- is 2 pi over L. So maybe we will discuss that later on. But the summations over k you will always replace with integrations over k times the density of states, which is V divided by 2 pi cubed. So this has to go between 0 and k max. And so this is proportional to V k max cubed, which is what I wrote over there.
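
A minimal sketch of this counting, with an assumed sound velocity and volume chosen only to show the T cubed scaling:

```python
import numpy as np

kB = 1.380649e-23        # J/K
hbar = 1.054571817e-34   # J s
v = 5000.0               # assumed sound velocity, m/s (illustrative)
Vol = 1e-6               # assumed volume, 1 cm^3

def n_modes(T):
    # number of modes with omega = v*k below kB*T/hbar: (V/(2 pi)^3) * (4 pi/3) * k_max^3
    k_max = kB * T / (hbar * v)
    return Vol / (2*np.pi)**3 * (4*np.pi/3) * k_max**3

for T in [1.0, 2.0, 4.0]:
    print(T, n_modes(T))   # successive values grow by a factor of 8, i.e. as T**3
```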

So as I go to lower and lower temperature, there are fewer and fewer oscillators. The number of those oscillators grows like T cubed. Each one of those oscillators is fully excited, has energy kT, and contributes 1 unit to the heat capacity. Since the number of oscillators goes to 0 as T cubed, the heat capacity that they contribute also goes to 0 as T cubed.

So you don't really need to know-- this is actually an interesting thing to ponder. So rather than doing the calculations, maybe just think about this. That somehow the solid could be arbitrarily complicated. So it could be composed of molecules that have some particular shape. They are forming some strange lattice of some form, et cetera.

And given the complicated nature of the molecules, the spectrum that you have for the potential frequencies that a solid can take, because of all of the different vibrations, et cetera, could be arbitrarily complicated. You can have all kinds of oscillations, such as the ones that I have indicated.

However, if you go to low temperature, you are only interested in vibrations that are very low in frequency. Vibrations that are very low in frequency must correspond to deformations that are very long wavelength.

And when you are looking at things that are long wavelength, this is, again, another thing that is statistical in character. That is, you are here looking at things that span thousands of atoms or molecules. And as you go to lower and lower temperature, more and more atoms and molecules are involved. And so again, some kind of averaging is taking place. All of the details, et cetera, wash out. You really see some global characteristic.

The global characteristic that you see is set by this symmetry. Just the fact that when I go to exactly k equals to 0, I am translating. I have 0 frequency. So when I'm doing something that is long wavelength, the frequency should somehow be proportional to that wavelength. So that's just a statement of continuity if you like.

Once I have made that statement, then it's just a calculation of how many modes are possible. The number of modes will be proportional to T cubed. And I will get this T cubed law irrespective of how complicated the solid is. All of the solids will have the same T cubed behavior.

The place where they cross over from the classical behavior to this quantum behavior will depend on the details of the solid, et cetera. But the low temperature law, this T cubed law, is something that is universal.

OK, so next time around, we will do this calculation in more detail, and then also see its connection to black body radiation.