Lecture 36: Time Dependence of Two-Level Systems: Density Matrix, Rotating Wave Approximation



Description: In this final lecture, Prof. Field explains time dependence of two-level systems, with attention to density matrix and rotating wave approximation.

Instructor: Prof. Robert Field

The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or to view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

ROBERT FIELD: Today's lecture is one where-- it's a lecture I've never given before. And it's very much related to the experiments we are doing right now in my research group. And so basically, we have a chirped pulse of microwave radiation, which is propagating through a sample. It causes all of the molecules in the sample to be prepared in some way. We call it polarized.

And this polarization relaxes by what we call free induction decay. And they produce a signal which-- so we have a pulse of radiation that propagates through the sample. The two-level systems in the sample all get polarized, which we'll talk about today. And they radiate that polarization. And we collect it in a detector here.

And so the two important things are that this is a time-dependent experiment, and that we have a whole bunch of molecules. And they're interacting with the radiation in a way which is complicated. Because this is not-- each one of them has quantum states, but all of the particles in this sample are somehow interacting with the radiation field in a way which is uncorrelated.

So we could say all of these particles are either bosons or fermions. But we're not going to symmetrize or antisymmetrize. Each of these particles is independent. And we need a way of describing the quantum mechanics for an ensemble of independent particles. So it's a big step towards useful quantum mechanics.

And I'm not going to be able to finish the lecture as I planned it. So you should know where I'm going. And I'm going to be introducing a lot of interesting concepts.

The first two-thirds of the lecture notes are typed. And you could have seen them. And the rest of them will be typed later today. This is based on material in Mike Fayer's book, which is referenced in your notes. This book is really accessible. It's not nearly as elegant as some of the other treatments of interaction of radiation with two-level systems.

Now I talked about the interaction of radiation with two-level systems in lecture number 19. And this is a completely different topic from that, because in that, we were interested in many transitions. Let me just say the radiation field that interacts with the molecule is weak. It interacts with all the molecules, and the theory is for a weak pulse-- and the important point in lecture 19 was resonance.

And so we made the dipole approximation. And each two-level system is separately resonant and is weakly interacted with, and does something to the radiation field. Now here, we're going to be talking about a two-level system, only two levels. And the radiation field is really strong, or is as strong as you want. And it does something to the two-level system which results in a signal.

And because the radiation field is strong, it's not just a matter of taking two levels and mixing them. The mixing coefficients are not small. It's not linear response. The mixing is sinusoidal. As the radiation field gets stronger, the mixing changes, and all sorts of interesting things happen.

So this is a much harder problem than what was discussed in lecture number 19. And in order to discuss it, I'm going to use some important tricks and refer to something called the density matrix. The first trick is we have this equation which can easily be derived. And most of this lecture, I'm going to be skipping derivations. Some of the derivations are going to be in the notes.

So we have some operator. And we want to know the time dependence of the expectation value of that operator. And it's possible to show that the time derivative of the expectation value of the operator A is given by i over h-bar times the expectation value of the commutator of the Hamiltonian with A, plus the expectation value of the partial time derivative of the operator A. So this is a general and useful equation for the time dependence of anything. And it's derived simply by applying the chain rule to this sort of thing.

So we have three terms. And when you do that, you end up getting this equation. So this is just this ordinary equation. And anyway, so this is what happens.
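[The equation being described here is presumably the standard expression for the time dependence of an expectation value,

\frac{d}{dt}\langle \hat{A} \rangle \;=\; \frac{i}{\hbar}\,\big\langle\, [\hat{H},\hat{A}] \,\big\rangle \;+\; \Big\langle \frac{\partial \hat{A}}{\partial t} \Big\rangle ,

where the second term vanishes for an operator with no explicit time dependence.]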

So we're going to have some notation here. We have a wave function. And this is a capital psi, so this is a wave function, a time-dependent wave function that satisfies the time-dependent Schrödinger equation. And we're going to replace that by just something called ket little t. And we can write this thing psi of x and t as the sum over n of Cn of t times psi n of x.

And this, in bracket notation, becomes the sum over n of Cn times ket n. So we have a complete orthonormal set of functions. And this thing is normalized to one. And now I'm going to introduce this thing called the density matrix. This is a very useful quantum mechanical quantity which replaces the wave function.
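[In this notation, the expansion being written is presumably

|t\rangle \;\equiv\; \Psi(x,t) \;=\; \sum_n c_n(t)\,|n\rangle , \qquad \langle t\,|\,t\rangle \;=\; \sum_n |c_n(t)|^2 \;=\; 1 ,

with the \{|n\rangle\} a complete orthonormal set.]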

It repackages everything we know from the time-dependent Schrödinger equation and the Schrödinger picture of a wave function. It's equivalent. It's just arranging it in a different way. And this different way is extremely powerful, because what it does is it gets rid of a lot of complexity. I mean, when you have the time-dependent wave functions, you have this e to the minus i E t over h-bar always kicking around.

And we get rid of that for most everything. And it also enables us to do really, really beautiful, simple calculations of the time dependence of expectation values. It's also a quantity where, if you have a whole bunch of different molecules in the system, each one of them has a density matrix. And those density matrices add. And you have the density matrix for an ensemble.

And so if the populations of different levels are different, the weights for each of the levels, or each of the systems, are taken care of. But we don't worry about coherences between particles unless we create coherence between particles. So this is a really powerful thing. And unlike the wave function, it's observable, because the diagonal elements of this matrix are populations. And the off-diagonal elements, which we call coherences, are also observable.

And if you look at the Fourier transform of the emission from this system, it will consist of several frequencies. And those frequencies correspond to the off-diagonal elements, with amplitudes giving the relative weights of those frequencies. And so one can determine everything in the density matrix experimentally. Now it's really-- it's still indirect because you're making experimental measurements. But we think about this thing in a way we don't think about the wave function.

It's really important. And this is the gateway to almost all of modern quantum mechanics and statistical mechanics-- quantum statistical mechanics. And so this is a really important concept. And we've protected you from it until now. And since this is the last lecture, both in this course and in my teaching of this course forever, I want to talk about this gateway phenomenon.

So what is this? Well, we denote it by this, this strange notation. I mean, you're used to this kind of notation where we have the overlap of a bra with a ket, or a bra with itself. But this is different. You know, this is a number and this is a matrix.

And if we have a two-level system, then we can say that ket t is equal to C1 of t times ket 1 plus C2 of t times ket 2. So state 1, state 2, and we have time dependence. Now those could be-- there's lots of stuff that could be in here. And this is going to be a solution of the time-dependent Schrödinger equation.

So since it's an unfamiliar topic, I'm going to spend more time talking about the mechanics than how you use that to solve this problem. But let's just look at it. So for a two-level system, we have-- it's a matrix with a 1, 1; a 1, 2; a 2, 1; and a 2, 2 element. And so we want the rho 1, 1 matrix element there. And so it's going to be a bra 1 here, and then we're going to have a ket 1 here.

And then we have the C1 ket 1 plus C2 ket 2. And then we have C1 star bra 1 plus C2 star bra 2. And so have I got-- am I doing it right now? So the first thing we do is we look at this inside part.

And we have C1, C1 star. And we have 1, 1. And we have C2, C2 star. Now I'm getting in trouble, because I want this to come out to be only C1, C1 star. So what am I doing wrong?

AUDIENCE: [INAUDIBLE] both have C2 halves the left hand and the right hand half are both [INAUDIBLE].

See, on the left side, you have 1 on 1 [INAUDIBLE].

ROBERT FIELD: So here--

And that's C2 C1 star, but it multiplies the overlap of 1 with 2, and that's 0. And so anyway, I'm not going to say more. But this combination is 1, this combination is 0, this combination is 0, this combination is 1. And we end up getting-- and then we end up just getting this.

Rho 1, 2 is equal to C1 C2 star. Rho 2, 1 is equal to C2 C1 star. And rho 2, 2 is equal to C2, C2 star. So we have the elements of this matrix. And they are expressed in terms of these mixing coefficients for the states 1 and 2.

Now, if we look at this, we can see that rho 1, 1 plus rho 2, 2 is equal to C1 C1 star plus C2 C2 star. And that's the normalization integral. That's 1. And we have rho 1, 2 is equal to rho 2, 1 star. And so rho is Hermitian.

So the density matrix is normalized to 1. And it's a Hermitian matrix. And we can use all sorts of tricks for Hermitian matrices. Now we're interested in the time dependence of rho. And so we're going to use this wonderful equation up here in order to get the time dependence of rho, because rho, like A, is a Hermitian operator.
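[Collecting what has just been derived, for the two-level system

\boldsymbol{\rho}(t) \;=\; |t\rangle\langle t| \;=\; \begin{pmatrix} c_1 c_1^{*} & c_1 c_2^{*} \\ c_2 c_1^{*} & c_2 c_2^{*} \end{pmatrix}, \qquad \mathrm{Tr}\,\boldsymbol{\rho} \;=\; \rho_{11}+\rho_{22} \;=\; 1 , \qquad \rho_{12} \;=\; \rho_{21}^{*} \;\;(\boldsymbol{\rho}^{\dagger} = \boldsymbol{\rho}).]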

And so we could do that. And so the time dependence of rho is going to be equal to the time derivative of ket t, times bra t, plus ket t times the time derivative of bra t-- where we're differentiating first here, and then here.

And when we do this, what we end up getting-- well, so we have the time dependence of a wave function. So we use the time-dependent Schrödinger equation. And we insert that. And using the time-dependent Schrödinger equation we have things like-- so every time we take the time derivative of the wave function, we get a Hamiltonian over i h-bar acting on it, and so on.

And so we can express this time dependence of the density matrix just by inserting the time-dependent Schrödinger equation repeatedly. This is why I say this is repackaging the Schrödinger picture, repackaging the wave function and writing everything in terms of these matrices. So that's the first-- that's what happens here. Sorry. That's what happens here.

And then we write this one, and we get plus 1 over minus i h-bar times ket t bra t times H. And we recognize that the two terms together are just 1 over i h-bar times the commutator of H with rho. So the time dependence of the density matrix is given by this commutator.
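[Written out, this is the Liouville-von Neumann equation,

\frac{d\boldsymbol{\rho}}{dt} \;=\; \frac{1}{i\hbar}\,[\mathbf{H},\boldsymbol{\rho}] \;=\; -\frac{i}{\hbar}\,[\mathbf{H},\boldsymbol{\rho}] .]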

And commutators are kind of neat, because usually what happens is these two things have very different structures, and you get rid of something you don't want to deal with anymore. And so now we actually evaluate these things. And we do a lot of algebra. And we get these equations of motion for the elements of the density matrix. And so we find the time dependence of the diagonal element for state 1 is opposite that for state 2.

In other words, population from state 1 is being transferred into state 2. And that is equal to minus i over h-bar times H 1, 2 rho 2, 1 minus H 2, 1 rho 1, 2. And we have the rho 1, 2 time dependence is equal to the rho 2, 1 time dependence star. And that comes out to be minus i over h-bar times H 1, 1 minus H 2, 2 times rho 1, 2, plus rho 2, 2 minus rho 1, 1 times H 1, 2.
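[In matrix-element form, the equations of motion just described are

\dot{\rho}_{11} \;=\; -\dot{\rho}_{22} \;=\; -\frac{i}{\hbar}\,\big( H_{12}\,\rho_{21} - H_{21}\,\rho_{12} \big), \qquad \dot{\rho}_{12} \;=\; \dot{\rho}_{21}^{\,*} \;=\; -\frac{i}{\hbar}\,\big[ (H_{11}-H_{22})\,\rho_{12} + (\rho_{22}-\rho_{11})\,H_{12} \big] .]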

This is very interesting, but now we have a couple of differential equations and we can solve them. But we want to do a trick where we write the Hamiltonian as a sum of two terms. This is the time-independent part, and this is the time-dependent part. This is the part that gives us trouble. This is the part that takes us into territory that I haven't talked about in time-independent quantum mechanics.

But it's still-- it's perturbation theory. This is supposed to be something that is different from and usually smaller than H 0. And so we do this. So H 0, operating on any basis function n, gives En times n. And so we could call these En zeroes, but we don't need to do that anymore.

And now we do a lot of algebra. We discover that the time dependence of the density matrix is given by minus i over h-bar times the commutator of H 1 of t with the density matrix. So this is very much like what we did before, but now we have that the time dependence is entirely due to the time-dependent Hamiltonian, H 1 of t.
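[That is, once the phase factors associated with H 0 are absorbed into the basis states (the primed states introduced below), the equation of motion presumably reduces to

\dot{\boldsymbol{\rho}} \;=\; -\frac{i}{\hbar}\,\big[\mathbf{H}^{1}(t),\,\boldsymbol{\rho}\big] .]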

So everything associated with H 0 is gone from this equation of motion. So now let's just be specific. So here is a two-level system. This is state 1. This is state 2. This difference is delta E.

And we're going to call that h-bar omega 0. So this is the frequency difference between levels 1 and 2. H 0 is equal to minus h-bar omega 0 over 2 and plus h-bar omega 0 over 2 on the diagonal, and 0, 0 off the diagonal. We like that, right? It's diagonal.

H 1 is where all the trouble comes. And we're going to call the off-diagonal element e x 1, 2 times E0 cosine omega t. This is not an energy. This is an electric field. So this is the strength of the perturbation.

And e x 1, 2 is the dipole matrix element between levels 1 and 2. So we have a dipole moment times an electric field, divided by h-bar. So this quantity here has units of angular frequency. And we call it omega 1, which is the Rabi frequency. It gets a special name because Rabi was special.
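[Before transforming to the primed basis, the two pieces of the Hamiltonian being described are presumably

\mathbf{H}^{0} \;=\; \frac{\hbar\omega_0}{2}\begin{pmatrix} -1 & 0 \\ 0 & +1 \end{pmatrix}, \qquad \mathbf{H}^{1}(t) \;=\; \hbar\,\omega_1 \cos\omega t \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad \omega_1 \;\equiv\; \frac{e\,x_{12}\,E_0}{\hbar}\ \ \text{(the Rabi frequency)}.]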

And so we're going to be-- and this is-- expresses the strength of the interaction. So we have a molecular antenna mu 1, 2. And we have the external field.

And they're interacting with each other. And so this is the strength of the badness, except its goodness. Because we want to see transitions.

So now we do a little bit of playing with notation, because there's just a lot of stuff that's going on, and we have to understand it. So we're going to call the state-- we're going to separate the time-independent part of the wave functions from the time-dependent part. And so state 1-- this is the full time-dependent wave function. And it's going to be e to the minus i omega 0 t over 2-- in other words, we should have had zeroes here-- times 1, right.

So this is the time-independent part, and this is the time-dependent part. And 2 is e to the minus i omega 0 t over 2 times 2 prime. Notice these two guys have the same sign. This bothers me a lot. But it's true, because we have opposite signs here, and we have a bra and a ket.

And they end up having the same signs. So that means that H 1 looks like this: 0, 0 on the diagonal, omega 1 cosine omega t, e to the minus i omega 0 t here. And here we have omega 1 cosine omega t, e to the plus i omega 0 t.

So this is a 2 by 2 matrix. Diagonal elements are 0. Off-diagonal elements are this omega 1, the strength of the interaction, times the cosine oscillating at the frequency of the applied radiation, times the oscillating factor at omega 0.
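[Writing the h-bar explicitly, the perturbation in the primed basis is

\mathbf{H}^{1\prime}(t) \;=\; \hbar\,\omega_1 \cos\omega t \begin{pmatrix} 0 & e^{-i\omega_0 t} \\ e^{+i\omega_0 t} & 0 \end{pmatrix}.]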

So now we go back and we calculate the equation of motion, bringing in this H 1 term. And so we have minus i over h-bar times the commutator of H 1 with rho. And we get some complicated equations of motion. And I don't really want to write them out, because it takes a while, and they're in your notes. And I'm going to make the crucial approximation, the rotating wave approximation.

Notice we have a cosine omega t. We can write that as e to the i omega t plus e to the minus i omega t, over 2. And so basically what we're doing is we're going to do a trick. We have the Hamiltonian, and we're going to go to a rotating coordinate system. And if we choose the rotation frequency of the rotating coordinate system right, we can almost exactly cancel the omega 0 terms.

And so we have two terms, one rotating like this, which is canceling or trying to cancel omega 0, and one rotating like this, which is adding to omega 0. And so what we end up getting is a slowly oscillating term, which we like, and a rapidly oscillating term, which we can throw away. That's the approximation. And this is commonly used. And I can write this in terms of transformations.

And although we think about going to a rotating coordinate system, for each two-level system, we can rotate at a different frequency to cancel, or make nearly canceling, the off-diagonal elements. So although the molecule doesn't rotate at different frequencies, our transformation attacks the coupling between states individually. And you can apply as many rotating wave transformations as you want. But we have a two-level system. So we only have one.

And so we do this. And we skip a lot of steps, because it's complicated and because we don't have a lot of time. We now have the time dependence of the 1, 1 element. And it's expressed in terms of omega 1.

I've skipped a lot of steps. But you can do those steps. The important thing is what we're going to see here. We have e to the i omega 0 minus omega t rho 1, 2. And we have a minus e to the minus i omega 0 minus omega t times rho 2, 1.

And we have rho 2, 2 dot is equal to minus rho 1, 1 dot. And we have rho 1, 2 dot-- this is the important guy-- is equal to i omega 1 over 2, e to the minus i omega 0 minus omega t, times rho 1, 1 minus rho 2, 2. So we have a whole bunch of coupled differential equations, but each of them has these factors here where you have omega 0 minus omega. I've thrown away the omega 0 plus omega terms.
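[After the rotating wave approximation, the coupled equations being described are

\dot{\rho}_{11} \;=\; -\dot{\rho}_{22} \;=\; \frac{i\omega_1}{2}\Big[ e^{\,i(\omega_0-\omega)t}\,\rho_{12} \;-\; e^{-i(\omega_0-\omega)t}\,\rho_{21} \Big], \qquad \dot{\rho}_{12} \;=\; \dot{\rho}_{21}^{\,*} \;=\; \frac{i\omega_1}{2}\, e^{-i(\omega_0-\omega)t}\,\big(\rho_{11}-\rho_{22}\big).]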

And now it really starts to look good, because we can make these-- so when we make omega equal to omega 0, well, this factor is just 1. Everything is simple. We're on resonance. And so what we do is we create another symbol, delta omega, which is omega 0 minus omega. So this is the applied oscillation frequency.

This is the intrinsic level spacing in the molecule. And so we can now write the solution to this differential equation for each of the elements of the density matrix. And we're going to actually define another symbol. We're going to have the symbol omega sub e. This is not the vibrational frequency.

This is just a symbol that is used a lot in the literature, and it comes out to be the square root of delta omega squared plus omega 1 squared. So in solving the density matrix equation, it turns out we care about this extra frequency. If delta omega is 0, well, then there's nothing surprising-- omega e is just omega 1. But this allows for there to be an effect of the detuning.
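[That is,

\Delta\omega \;\equiv\; \omega_0 - \omega , \qquad \omega_e \;\equiv\; \sqrt{(\Delta\omega)^2 + \omega_1^{\,2}} ,

so that on resonance (\Delta\omega = 0) \omega_e reduces to the Rabi frequency \omega_1.]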

So basically what you're doing is, when you go to the rotating coordinate system, you have an intrinsic frequency separation. And so in the rotating coordinate system, you have two levels that are different. And there's a Stark effect between them. And you diagonalize this Stark effect using second-order perturbation theory or just by diagonalizing the matrix. And so that gives rise to this extra term here, because you have the oscillation frequency and the Rabi frequency.

And anyway, when you do the transformation, you get these terms. And so here is now the solution in the rotating wave approximation. Rho 1, 1 is equal to 1 minus omega 1 squared over omega e squared, times sine squared omega e t over 2. We have rho 2, 2 is equal to just omega 1 squared over omega e squared, times sine squared omega e t over 2.

We have rho 1, 2, which is equal to something more complicated looking: omega 1 over omega e squared, times i omega e over 2 sine omega e t, minus delta omega sine squared omega e t over 2, all times e to the minus i delta omega t. It looks complicated. And we get a similar term for rho 2, 1. It's just equal to rho 1, 2 complex conjugate.
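[Collected, the rotating-wave-approximation solutions, with all of the population in state 1 at t = 0, are

\rho_{11}(t) \;=\; 1 - \frac{\omega_1^{\,2}}{\omega_e^{\,2}}\sin^2\!\frac{\omega_e t}{2}, \qquad \rho_{22}(t) \;=\; \frac{\omega_1^{\,2}}{\omega_e^{\,2}}\sin^2\!\frac{\omega_e t}{2}, \qquad \rho_{12}(t) \;=\; \rho_{21}^{*}(t) \;=\; \frac{\omega_1}{\omega_e^{\,2}}\left[ \frac{i\,\omega_e}{2}\sin\omega_e t \;-\; \Delta\omega\,\sin^2\!\frac{\omega_e t}{2} \right] e^{-i\,\Delta\omega\, t}.]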

And so now what we see is these populations are oscillating at omega e, not at omega 0. They're oscillating at a slightly shifted frequency. But they're oscillating sinusoidally. And we have an amplitude term, which is omega 1 over omega e, quantity squared.

Omega e is a little bigger than omega 1. So this is less than 1. So it's just like the on-resonance situation, except that you get a slightly shifted oscillation frequency and a slightly reduced prefactor.
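[As a numerical check on these formulas-- this sketch is not from the lecture, and the parameter values are purely illustrative-- one can integrate the exact two-level equation of motion, i h-bar d rho / dt = [H 0 + H 1(t), rho], and compare rho 2, 2 at some final time with the rotating-wave-approximation expression above:

    import numpy as np

    # Illustrative parameters (angular frequencies, hbar = 1); not taken from the notes.
    omega_0 = 2 * np.pi * 10.0            # level spacing omega_0
    omega_1 = 2 * np.pi * 0.2             # Rabi frequency mu_12 E_0 / hbar
    delta   = 2 * np.pi * 0.1             # detuning Delta omega = omega_0 - omega
    omega   = omega_0 - delta             # applied radiation frequency
    omega_e = np.sqrt(delta**2 + omega_1**2)

    def rho_dot(t, rho):
        # d(rho)/dt = -(i/hbar) [H(t), rho], with H = H0 + H1(t) in the 1, 2 basis
        H = np.array([[-omega_0 / 2, omega_1 * np.cos(omega * t)],
                      [omega_1 * np.cos(omega * t), omega_0 / 2]])
        return -1j * (H @ rho - rho @ H)

    rho = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)   # all population in state 1

    t, dt, T = 0.0, 5e-4, 10.0            # fixed-step RK4 propagation out to t = T
    while t < T:
        k1 = rho_dot(t, rho)
        k2 = rho_dot(t + dt / 2, rho + dt / 2 * k1)
        k3 = rho_dot(t + dt / 2, rho + dt / 2 * k2)
        k4 = rho_dot(t + dt, rho + dt * k3)
        rho = rho + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += dt

    rwa = (omega_1 / omega_e) ** 2 * np.sin(omega_e * T / 2) ** 2
    print("rho_22 (numerical):", rho[1, 1].real, "  rho_22 (RWA):", rwa)

The two printed values should agree to within a few percent; the residual difference comes from the counter-rotating term that the rotating wave approximation throws away.]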

The coherence terms-- so these are populations, populations going back and forth between 1 and 2 at a slightly shifted frequency. And then we have this, which looks horrible. And now, for some more insights.

If we make omega 1 much larger than delta omega-- in other words, the Rabi frequency much larger than the detuning-- it might as well not be detuned. We get back the simple picture. We get rho 1, 1 is equal to cosine squared omega 1 t over 2, et cetera. So we have what we call free precession.

Each of the elements of the density matrix is telling you that the system is going back and forth sinusoidally or co-sinusoidally-- sine squared or cosine squared. And what happens to level 1 is the opposite of what happens at level 2. And everything is simple, and the system just oscillates.
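[In this on-resonance limit the solutions reduce to

\rho_{11}(t) \;=\; \cos^2\!\frac{\omega_1 t}{2}, \qquad \rho_{22}(t) \;=\; \sin^2\!\frac{\omega_1 t}{2}, \qquad \rho_{12}(t) \;=\; \frac{i}{2}\sin\omega_1 t .]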

Suppose we apply radiation for delta t, a short time. And so what we're interested in-- here is t equals 0. This is time. And this is t equals 0. And before t equals 0, we do something.

We apply the radiation. And we apply the radiation for a time which gives rise to a certain flipping. And so what we choose-- we have delta t is equal to theta over omega 1, or theta is equal to delta t times omega 1, the Rabi frequency. And so if we choose a flip angle which we call, say, a pi pulse, theta is pi.

And what ends up happening is that we transfer population entirely from level 1 to level 2. When we do that, we get no off-diagonal elements of the density matrix. They are zero. So if at t equals 0 we have everything in level 1, and we have applied this theta equals pi pulse, we have no coherence. If we have a pi over 2 pulse, well then, we've equalized the two level populations, and we've created a maximum coherence.
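[In terms of the flip angle \theta = \omega_1\,\Delta t, starting from \rho_{11} = 1:

\theta = \pi : \quad \rho_{22} = 1,\ \rho_{11} = 0,\ \rho_{12} = 0 \quad \text{(complete population transfer, no coherence)};
\theta = \tfrac{\pi}{2} : \quad \rho_{11} = \rho_{22} = \tfrac{1}{2},\ |\rho_{12}| = \tfrac{1}{2} \quad \text{(maximum coherence)}.]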

And this guy radiates. So now we have an oscillating dipole. And it's broadcasting radiation. And so all of the two-level systems, if you use a flip angle of pi over 2, you get a maximum polarization, they're radiating to my detector, which is up there. And I'm happy. I detect their resonance frequency.

And so the experiments work. So we're pretty much done. So I mean, what we are doing is we're creating a time dependent dipole. And that dipole radiates something which we call-- if we have a sample like this, that sample-- all of the molecules in the sample are contributing to the radiation of this dipole.

But they all have slightly different frequencies, because the field that polarized them wasn't uniform. In a perfect experiment it would be. And so they have different frequencies and they get out of phase. Or, by conservation of energy, as the two-level system radiates, going from the situation where you have equal populations to everybody in the lowest state, there is a decay. So there are decays that cause the signal, which we call free induction decay, to dephase or decay.

But the important thing is, you observe the signal and it tells you what you want to know about the level system-- the two-level system, or the n-level system. And it's a very powerful way of understanding the interaction of radiation with matter, because it focuses on near resonance. And near resonance for one two-level system is not near resonance for another. And so you're picking out one, and you get really good signals. And you can actually do-- by chirping the pulse-- you can have one two-level system, and a little bit later, another two-level system.

They all radiate. They all get polarized. They all radiate at their own frequency. And you can detect the signal in the time domain and get everything you want in a simple experiment. This experiment has enabled us to do spectroscopy a million times faster than was possible before.

A million is a big number. And so I think it's important. And I think that this sort of theory is germane not just for high resolution frequency domain experiments-- in fact, it's basically a time domain experiment. You're detecting something in the time domain, and Fourier transforming back to the frequency domain. So there are ultrafast experiments where you create polarizations and they-- this is what modern experimental physical chemistry is.

And the notes that I will produce will be far clearer than these lectures, this lecture. But it really is a gateway. And I hope that some of you will walk through that gateway. And it's been a pleasure for me lecturing to you for the last time in 5.61. I really enjoyed doing this. Thanks.

[APPLAUSE]

Thank them.

[APPLAUSE]

Well, I got to take the hydrogen atom.

[LAUGHTER]

Thank you.