Lec 24: Auditory systems, part 1


Description: This lecture is the first of two on auditory systems, but also includes a summary discussion of the visual systems.

Instructor: Gerald E. Schneider

The following content is provided under a Creative Commons license. Your support will help MIT OpenCourseWare continue to offer high quality educational resources for free. To make a donation or view additional materials from hundreds of MIT courses, visit MIT OpenCourseWare at ocw.mit.edu.

PROFESSOR: And we didn't quite finish, last time, the last part of the visual system. So let me say a little bit about the transcortical connections.

So what does it mean, this first statement? As neocortical area and neuron number increase in evolution, the amount of white matter increases also, at a slightly greater rate. So what does that indicate? That would be one issue to talk about.

In the visual system, there are specific long transcortical connections that have been emphasized in recent neuropsychology, because they seem to be very important in understanding some of the functions of the neocortex. So I talk about three transcortical pathways in the chapter.

Two of them are frequently talked about by cognitive neuroscientists. One of them is not so commonly talked about. But I have come to believe it's just as important. So I want to just quickly go through all three.

Can any of you summarize for me what the main functions of these three pathways are? Yes?

AUDIENCE: [INAUDIBLE].

PROFESSOR: Object location, that's right.

AUDIENCE: [INAUDIBLE].

PROFESSOR: Yeah, affective associations is a good way to say it. But I think in terms of cognitive psychology, they would say identification of objects. Because to make an association, you have to know what it is. So if you encounter it again and you form that kind of affective association, you retain that knowledge. So that's right. What about the third one?

AUDIENCE: [INAUDIBLE].

PROFESSOR: Yeah, I summarize it as, where am I. Now let me just--

AUDIENCE: [INAUDIBLE].

PROFESSOR: --let me explain. Without even looking at these slides, if I just-- when we talk about spatial location, we can mean where something is around our head. If I'm looking here, the clock is over there, in my left visual field. The windows are in my right visual field. The computer's in my central field, right now slightly in the lower field.

This is all object location. We call it egocentric localization. That is, not with respect to the ego, but with respect to our head-- head and eyes. OK, that's egocentric location.

So what is allocentric? "Allo" means other. So my allocentric orientation is where I am in this room, where I am in the building, where I am at MIT. That's all allocentric. We'll talk about that when we look at the pathway to see what's involved in that.

And it will come up a number of times again. It's become so important for understanding the medial pallium, the hippocampal formation and associated structures, in the last 15 or 20 years.

All right, this is often called the dorsal stream, leading from striate cortex, the first visual area. I put in a couple of arrows here. But I could have put in one big one. But they basically-- from all over area 17, visual area one, you get fibers projecting to juxtastriate areas, V2 primarily.

And from there, there are pathways that go to a number of places. And one of the places they go to is the posterior parietal area. And I've drawn here origins of several major bundles of axons that lead from the posterior parietal area into the prefrontal cortex. Now that posterior parietal area is concerned with spatial localization of things that we see around us.

All right, so this is often called the ventral stream pathway, also originating in area 17, also projecting to the prestriate, the V2 areas. And from those areas in V2 and from the posterior parietal areas that are getting input from the prestriate areas, you have pathways leading into the temporal lobe, that part of the visual cortex that's expanded so much in the temporalization of the hemispheres. And so I've just sketched it here, indicating that they come from the areas bordering area 17, from a number of different visual areas there. And they project to the inferior temporal cortex, the inferior gyrus of the temporal lobe.

And then for each of these pathways you can see there are pathways leading to the prefrontal cortex. Here I show the dorsal pathways. I'll show you many other transcortical pathways in the monkey later. And they've been confirmed: just about every one of them is known to have a corresponding pathway in humans. We know that now from diffusion tensor imaging. Here's the ventral stream. And we know there are pathways going from the inferior temporal lobe into the ventral part of the prefrontal cortex.

Now, if I look at pathways from those areas, either from striate and prestriate areas, or from these areas in the frontal lobe, most of them in the frontal eye fields, an area very important for our working memory of things we're looking at-- we retain, briefly, positions of things around us. It affects the way we plan movements. But look at where they project.

Basically, you find projections from the frontal eye fields that are matched by projections from prestriate and striate. For one thing, they go to the superior colliculus. That would be the major structure there, the largest structure. And we know what that's concerned with, head and eye orienting.

But they also go to the corpus striatum, either directly from the occipital area or through the frontal eye fields. They also go to the subthalamus and the ventral lateral geniculate body.

If we look at the other areas, the ventral stream from the inferior temporal cortex, we get pathways to the amygdala, first described by Nauta. We get pathways from the amygdala or directly from the inferior temporal cortex going into the ventral prefrontal areas. That ventral prefrontal association cortex is the only area of neocortex that projects directly to the hypothalamus.

Now, there's other endbrain structures, like hippocampus or amygdala, that project directly to hypothalamus. But this is the only neocortical area that has a reasonably strong projection there. I don't show it being really strong, because in comparative terms it isn't.

A heavier projection goes to the ventral parts of the striatum, the ventral striatum. Very important in habit learning, as we know. And that has heavy projections to the hypothalamus.

The third pathway is concerned with allocentric orientation. And this was seen by Nauta in his studies of prefrontal cortex. But he included, in his reviews of this, a number of pathways he had discovered. And I found in two different papers, he had traced pathways from that posterior parietal area, the same area that we know projects to the frontal eye field.

There's a medial stream that projects not just to the cingulate cortex there, but to the retrosplenial area and the parahippocampal gyrus. These are all areas that project to the hippocampus. That means they're concerned with where the animal is in space-- that kind of memory.

So I consider that to be on a par with the other two pathways. So I'm going to call it-- you could call it the medial stream. But sometimes it's represented this way, which is why I guess I didn't use those terms in my book. It's sometimes shown in the lateral view.

But knowing that anatomy, I think this is the better way to show it, because you could show the one going here. The parahippocampal gyrus, you could easily show it going around the edge like this. But this way it shows the whole pathway much better.

I'm not going to spend a lot of time talking. I just want to mention one thing about transcortical pathways from a theoretical viewpoint. Just from a theoretical standpoint, you could have nearby cortical areas just project to the nearby areas.

But suppose they always project to just, say, three other areas. That would be called absolute connectivity. They all project to three nearby areas.

Or you could have them all project to all of the others. That would be proportional connectivity. That would involve enormous increases in white matter as neocortical neuron number increased. We know, if you plot the volume of cortex versus the volume of the white matter, that the correlation is fairly linear. So we know it's got to be something closer to the absolute scheme.

But we also know that there are some of these long connections, like we just talked about. That type of connectivity is called small world architecture. And it's good to actually take a look at that, because it's so general in its relevance. Small world network connectivity is basic to allow social communication. It's basic to the spread of disease, and so on and so forth-- not just to the brain.

But what it means is you have regular connectivity. Each area or each cell-- it can apply to whole areas or to cells-- can access a number of nearby areas, plus some random connections; in theoretical terms, we just say random.

This would be completely random here, with no bias towards connecting to local areas, completely random. In the middle here, in small world architecture, you have the connection to nearby areas plus a few long connections. And it appears that those long connections that have evolved did evolve for very specific adaptive reasons. So we will not say they're random.

But the result is still something like small world architecture. And they've plotted the amount of clustering, the amount of separation, how quickly can you get from one spot to the other. It's a very efficient kind of wiring. And they've studied pathways in the visual system in those terms and confirmed that it's a kind of small world architecture.
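Just to make that concrete, the regular-plus-a-few-shortcuts idea can be sketched in a few lines of code. This is a minimal Watts-Strogatz-style sketch, not anything from the lecture or the book; the network size, neighborhood size, and rewiring probability are arbitrary illustrative choices. It builds a ring lattice, randomly rewires a small fraction of edges into long-range shortcuts, and measures how much the average path length drops.

```python
import random
from collections import deque

def ring_lattice(n, k):
    """Regular connectivity: each of n nodes links to its k nearest neighbors on each side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    return adj

def rewire(adj, p, rng):
    """Watts-Strogatz-style step: with probability p, replace an edge with a random long-range one."""
    n = len(adj)
    edges = {(i, j) for i in adj for j in adj[i] if i < j}
    for (i, j) in edges:
        if rng.random() < p:
            choices = [m for m in range(n) if m != i and m not in adj[i]]
            if choices:
                m = rng.choice(choices)
                adj[i].discard(j); adj[j].discard(i)
                adj[i].add(m);     adj[m].add(i)
    return adj

def avg_path_length(adj):
    """Mean shortest-path length over all reachable pairs, by BFS from every node."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

rng = random.Random(0)
regular = ring_lattice(200, 4)                      # purely local wiring
small_world = rewire(ring_lattice(200, 4), 0.05, rng)  # local wiring + a few shortcuts
print(avg_path_length(regular), avg_path_length(small_world))
```

With only about 5% of edges rewired into long connections, the average number of steps between any two nodes falls well below that of the purely regular lattice, while most connections stay local. That efficiency, short paths at low wiring cost, is what the small-world analyses of cortical pathways are measuring.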

So now I want to spend a couple of classes on the auditory system. And if you have the book, there are a few mistakes in this chapter, because it was the last chapter. I didn't like one of the figures the artist had helped me with. So I threw it out and put another one in.

And I did change the figure legend. But I didn't change the text. And there were a couple of other areas that-- obviously, it was done last minute. Not a very good way to finish things up, but that was the auditory system chapter.

So on page-- I can tell you exactly. If you want to note this down, you can just make the corrections in your book, because I, of course, didn't have all your books. Page 424 and 425, let me just point those out.

You'll remember them if I just tell you. You don't have to write it down now. But if you find that you've got your book-- on page 424, you'll see a figure where they label the stapes, which is one of the bones in the middle ear here, the one that connects to the eardrum, the tympanic membrane. They call it "staples." The artist had a spell-checker that changed it, and I didn't catch it.

So it should be-- staples is a pretty good term, I guess. But it's actually the stapes. And on the other page there's a little parenthetical statement, right at the bottom in the text, where the tectorial membrane is called the tympanic membrane. I have no idea how I did that. But I did.

I see that MIT Press now has the glossary online on the book site. And there will be an error page, too. If there are any other errors, please tell me about it, so we can post those. And then the next printing, they will get those in.

The second one is in that parenthesis there, where I say "also the tympanic." It's not the tympanic. The tympanic membrane is the eardrum. It's the tectorial membrane that follows the coils of the cochlea, as does the basilar membrane.

Let's look at that. We'll look at the pictures. I just wanted you to correct that in the book.

And I think I actually-- you see, my original figure was an unrolled cochlea. And I think I still call it an unrolled cochlea, because that's what didn't get changed to correspond to the new figure. It doesn't matter. I'll show you the unrolled, so you'll see what the original figure looked like.

In the first class, I'm going to talk a little bit about early development and why audition evolved the way it did, especially defensive and anti-predator behaviors, but also special abilities needed especially by predators. And then a little bit about the cochlear nuclei and the structures they're connected with. And then the next class, we'll talk about the separation of localization and pattern detection. It's quite different in audition than vision. But in the cortex, it ends up having some very great similarities to what we just talked about.

And then we'll have time to talk a little bit about specializations in the auditory system: echolocation, bird song, speech. So in the embryology of the brain stem, there are-- you've heard about the mechanosensory lateral line and the electrosensory lateral line. But we know that that doesn't occur in a lot of species.

But there are auditory placodes in the head area that lead to the formation of the primary sensory neurons in both the auditory and vestibular systems. These are present in all the vertebrate groups. So they're not like the lateral line systems.

There's just two cranial nerves involved that, in fact, combine auditory and vestibular. We group them together in the eighth cranial nerve. They're actually separate. But they follow a route into the hindbrain. They follow the same route. They're just two branches of the eighth nerve, one going to the vestibular canals, one going to the cochlea.

I want to just summarize auditory pathways. The first picture, I didn't even put right at the beginning there in the book, because it looks so complex. So it's this one. It's shown on the schematic of a mammalian brain.

You see the cochlea, the cartoon here. And then I show the enlargement here of the eighth nerve coming into the cochlear nuclei. DCN, AVCN, dorsal cochlear nucleus, and anteroventral cochlear nucleus. You can just think of it as the ventral cochlear nucleus. There is an anterior and a posterior part.

And then we can follow various routes. Now what were those-- let's first of all just simplify this whole thing. But you see there's a reflex pathway here going locally, getting to a motor neuron. There's pathways like that controlling the startle reflex, for example.

And then you see a pathway to the cerebellum. So there's cerebellar pathways. That's one of the lemniscal pathways. And then other lemniscal pathways, which in the auditory system, at first blush, seem really complicated. Some of them are straightforward, going to the inferior colliculus, then to the medial geniculate body, then to the auditory cortex.

Others seem complicated. One goes to the ventral part of the hindbrain. And then that projects by means of other nuclei into the inferior colliculus. And from that region, you also have pathways that totally bypass the inferior colliculus. In fact, they don't even go to the medial geniculate body. They go to structures around it.

So it seems complicated. So I've just drawn this kind of simplified diagram of the ascending pathways. So the peripheral ganglia, these would be the primary sensory neurons. Then you get to the secondary sensory nuclei, the cochlear nuclei. But also in hindbrain, you have tertiary structures in the ventral part of the hindbrain.

And then, both of these kinds of structures in the hindbrain project to the midbrain, with just a few axons-- at least in some mammals; it may not happen in all-- going directly to the medial geniculate body.

And that would be the only thing like, say, the retino-geniculate pathway, which goes directly from the retina to the thalamus. Here we have the dorsal cochlear nucleus sending some axons directly to the medial geniculate. Most of them go into the midbrain.

And I give the name there, inferior colliculus, for the mammals. But it's got other names in non-mammals, the torus semicircularis. If you were looking at a bird or reptile, it would be called that.

Then you have the diencephalon or tween brain, not just the medial geniculate, though that's the main one, but the posterior nuclear group, or group of nuclei that are around that structure. And you also have the old thalamus, the intralaminar nuclei. They also get input from the midbrain, carrying auditory information. Though it tends to overlap with the visual and somatosensory.

And then finally, the endbrain. And I point out that not only auditory cortex, but also part of the amygdala, gets direct connections from the auditory thalamus. And then of course the intralaminar nuclei also project to the striatum, not only to the cortex.

So let's talk a little bit about the evolution of this system before we go back to some of these details and try to pick out the main things. We know that escape behavior is always given precedence in evolution, because it's so critical that the animal survive. Otherwise it can't reproduce.

So I'm asking you a behavioral question here. If you've had my 9.20 class, you might remember this. I said: describe an example of a fixed action pattern, that is, an instinctive, pre-wired behavior, that's triggered in small mammals by the sounds of a predator. In fact, it's usually triggered by any really novel stimuli.

AUDIENCE: Freezing.

PROFESSOR: Freezing, very simple. Freezing is a very common first response. What good is that?

Most predators detect motion. And so if the animal freezes, it pretty much disappears from the attention of the predator. So very important.

If you're a hamster, for example, the first thing they will do is freeze if there's a novel stimulus. If the stimulus increases in spite of their freezing, then you will get-- just like that visual response we talked about-- you'll get rapid running. They don't run in any particular direction with respect to the predator. They run towards a safe haven. They run to their tunnel. They run to a hiding place.

So that's when they use their tectum. But what is triggered is then secondary to the rapid running. And there's some other beautiful fixed action patterns.

I probably described in the book the kangaroo rat that responds to a rattlesnake. He hears the noise of a rattlesnake, the rattle. And the kangaroo rat does freeze. And then he does something really odd. He just waits for the attack.

The rattlesnake attacks. You can imagine-- here's that open mouth, rushing towards the kangaroo rat. As the head rushes through the air, the animal's auditory system-- the kangaroo rat's auditory system is tuned to respond to the noise of the onrushing rattlesnake head. And that triggers a rapid leap, in which he does a backward somersault, avoids the head of the rattlesnake. He lands just outside the range of the rattlesnake and escapes. It almost always works.

Yes?

AUDIENCE: [INAUDIBLE].

PROFESSOR: Sorry?

AUDIENCE: [INAUDIBLE].

PROFESSOR: I don't know. It's been observed in nature many times. And they have done special studies of the auditory system. These animals have huge air spaces around the cochlea.

I think I mentioned the escape behavior of the moth. Moths that hear the cry of a bat, they dive. They dive for the ground. And that's how they escape the bat. They just fold their wings and go into a nose dive. It's similarly a rapid escape behavior.

And then I ask a couple of questions about two things closely related to this escape from a predator. One is the aversion to loud noises that rodents have; it's been studied best in rats. The other has also been studied mostly in rats, but now it's been studied in other animals as well. And that is learned fear.

That is, an animal can learn to be afraid of certain kinds of sounds by training. You give him a sound, and then you shock his feet. He becomes afraid and shows a fear response to that sound. Whereas before, it was just a neutral sound.

That's fear learning. It's been studied many times. And it involves pathways, now, that have been studied very well.

So first of all, in the study of avoidance of very loud noises, they've done these interesting studies where they make huge central nervous system lesions. They'll take the inferior colliculus out. They'll take the entire auditory cortex out.

It doesn't change the way the animal responds to loud noises. In fact, it doesn't change his ability to discriminate different amplitudes of sounds at all. Because you can teach him to press a lever to turn off the aversive noise. And he will keep doing that.

Unless, in the midbrain, you make-- there's a picture of the left side of the dorsal midbrain of a hamster. The studies were done in rats. But I'm showing here the superficial layers. The colliculus goes down to about here. This is called the central gray area. And there's the aqueduct of Sylvius.

You can see how the central gray stands out. The ocular motor nuclei would be right here. They look like just part of the central gray there.

And when they make these large lesions, they can carve out areas like this. [INAUDIBLE] on both sides. And the animal still will press the lever to turn the loud noise off.

But if you get into this ventral area-- the lesion goes down here to get all of that area, just a lesion right there-- that will abolish that avoidance, that learning to get rid of the loud noise. So this ventral part of central gray-- and we know from other studies of the central gray that it's activation of that structure that causes this. It's always activated in pain, somatosensory pain.

So apparently, auditory stimuli that they hate, that they avoid, activate these pain mechanisms as well. So that's what I'm explaining here. And this is just to show you that the central gray is part of the limbic regions, the limbic midbrain regions, meaning that they get heavy projections from the hypothalamus and from limbic endbrain areas: central gray and ventral tegmental area.

But this dorsal part-- if you stimulate that, animals are uncomfortable. They will work to turn an electrical stimulus to that area off. Whereas if you're down in the ventral tegmental area, it's the opposite. They seem to be rewarded by stimulating there. It's a pleasurable result.

Now, you can also-- as I mentioned, you can expose the animal to sounds, say a tone of a certain frequency, and shock the feet of the animal. And he becomes afraid of the sound. Now, that learning is not specific to the central gray regions. That kind of learning-- as is often true of learning-- tends to depend on the forebrain.

In this case, we know the pathway goes to the amygdala. And this is a study of that pathway, here in the opossum, the hedgehog, and the tree shrew. It looks like my labels moved a little bit wrongly there.

But what they're doing is they're putting a label or a lesion in much of the medial geniculate body, the thalamic structure that's sending auditory pathways. They've done it in all three animals.

And then they trace all the axons, from medial geniculate body forward into the endbrain. So here you see them coming out of the internal capsule and going very heavily to the amygdala in the opossum. They also go, of course, to the auditory neocortex.

You see the same thing here in the hedgehog, and the same thing here in the tree shrew. But notice, the tree shrew, in relative terms, has a bigger neocortex. Of its auditory projections, the one to neocortex is the larger.

In the opossum, the two projections are both very large. In the hedgehog, they're both large. But again, the one from the neocortex is a little bit larger. The hamster would be similar to the hedgehog, here. So would the mouse.

So that's something we didn't talk about in the visual system. It's not been as well studied. But now we know there are visual pathways also going through the thalamus-- not the lateral geniculate body, but parts of the lateral nucleus-- that reach the amygdala without going to the visual cortex. But it's the pathway in the auditory system that's been the most studied, which is why we talk about it here.

Now, predators-- they still, especially when they're young, have to have those escape responses, too. But when they're older, they're preying on animals. They need to localize their prey. They need to identify the prey.

And those kinds of abilities evolved, of course, in the auditory system, too. So we need to talk about those pathways: identifying and localizing.

So first of all, you need to discriminate differences in sound frequency. You need to combine those sound frequencies, temporal patterns of frequencies in different ways. You've got to have neurons that can respond specifically to those things.

And there was an evolution-- very different-- using auditory cues to localize. You know that a raptor, like an owl, is very good at localizing prey by auditory cues. They have to. They're night hunters.

So their main way to detect prey on the forest floor below them is by auditory cues. So they can tell that there's a little rustling in the undergrowth or in the leaves on the forest floor. They not only hear it, but they know, better than we can, actually. They know where that little animal is.

So there was an evolution of apparatus for using those cues. Now, to understand those things, I want to say a little bit about the initial stages of the auditory system: the peripheral auditory structures and the primary sensory neurons, that is, transduction; then the initial coding of auditory stimuli; and then the channels of conduction involved in those various functions in the brain.

So first of all, what do you remember about this? A transformation in the middle ear apparatus occurred in the early evolution of mammals that was different from reptiles. Remember, mammals evolved from the mammal-like reptiles. And at some point in those mammal-like reptiles, certainly by the time you get to real mammals, you had this difference in the middle ear.

If you look in modern reptiles, you don't find it. But you find it in all mammals. And it was-- this is the picture that I put in the book. It shows-- Allman discusses it. And I think I listed some papers with people that have studied this and the paleontology.

This is an early mammal. And this is a Dimetrodon, one of the mammal-like reptiles.

And in a Dimetrodon, you see there's the stapes bone of the middle ear. But it goes directly from the eardrum to the oval window at the entrance to the cochlea. In mammals, you have three bones. Two of those bones were jaw bones in reptiles. They evolved into middle ear bones.

We don't know all the steps. But we have enough skulls to know about when that occurred. So what's going on in the middle ear?

What do you have to do in the middle ear? Sound is a vibration, movement in air. And it causes vibrations of the eardrum, tympanic membrane. We've got to get to vibrations of the fluid in the cochlea. That requires some impedance matching. And that can be inefficient or efficient.

In the reptiles it happens, of course, through the stapes directly, from eardrum to inner ear. But then you've got the vibrations of the air being transmitted directly to the fluid of the inner ear.

It turns out there's a more efficient way to do it. These little bones are connected, so there's a hinging, and that changes things: there's a greater movement of the outer bones than of the stapes. Yes?

AUDIENCE: [INAUDIBLE].

PROFESSOR: They usually don't have any. They do have some external ear, but some of them don't have any at all. But they do have-- I mean, this is a reptile.

Of course, from the paleontology we don't know every detail. But we know from the dentition, and from the ear bones and a number of other features that differ between reptiles and mammals. That's how they identify them. We also know because the eye region is different: there was actually some reduction in the bones around the eye in the earliest mammals.

All right, so it was an impedance matching problem. And the result of getting better impedance matching was response to higher frequencies.
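To put rough numbers on that impedance-matching gain: using commonly cited textbook values for the human middle ear (the exact figures vary by source and are only illustrative here), the area ratio of the eardrum to the stapes footplate, multiplied by the ossicular lever ratio, gives the pressure amplification.

```python
import math

# Illustrative textbook approximations for the human middle ear:
eardrum_area_mm2 = 55.0    # effective area of the tympanic membrane
footplate_area_mm2 = 3.2   # area of the stapes footplate in the oval window
lever_ratio = 1.3          # mechanical advantage of the ossicular hinging

# Pressure = force / area. Funneling roughly the same force from the large
# eardrum onto the small oval window multiplies the pressure by the area
# ratio; the ossicular lever adds a further gain.
pressure_gain = (eardrum_area_mm2 / footplate_area_mm2) * lever_ratio
gain_db = 20 * math.log10(pressure_gain)

print(f"pressure gain ~{pressure_gain:.1f}x (~{gain_db:.0f} dB)")
```

With these numbers the gain comes out a bit over twentyfold, on the order of 27 dB, which is the kind of boost needed to drive the cochlear fluid efficiently instead of losing most of the sound energy at the air-fluid boundary.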

So here's a turtle. And look at this: he doesn't respond well to tones above about 10 kilohertz. Here's a bird; this is a median curve. But the birds' range-- in this range, they respond higher.

But look at some of these mammals. They can respond to very high sounds. Many mammals here, well above the human range of hearing. And young humans can hear up pretty high, too. You can probably hear considerably higher than I can, because a lot of that's lost with age.

So that was a big change, because now the young, when they're having difficulties, emit cries. And for mammals, those cries are mostly very high frequency. When we have these pups born in the lab, we hear a little bit of squeaking from the young. But in fact, a lot of it we don't even hear, because it's way above our hearing range. But the mothers can hear it.

The reptiles don't hear it. That gave a big advantage to the earliest mammals. They evolved at a time when reptiles were the dominant tetrapod.

So here you see where that ear apparatus is located. This is the middle ear chamber, an air-filled chamber connected to the throat by means of the Eustachian tube here. So there's the eardrum. There's the round window-- or the-- I always mix them up. It's the oval window.

So this is the end of the cochlea. And note that it looks like a snail. That's because the cochlea is actually an elongated tube. And so that it will fit in the skull, it's coiled up. So it's actually like this.

This is an unrolled cochlea. So the apparatus that's sensitive to vibration of the fluid runs down the middle. There's the oval window with the stapes connected to it, and the round window at the other end.

So the larger vibrations here are transformed into smaller vibrations here. That affects the fluid of the cochlea. And just note the vestibular canals-- I didn't like this picture; it needed to be improved a little bit. Here you see a picture of the vestibular canals arranged in three planes, off to the side there where the cochlea begins.

AUDIENCE: [INAUDIBLE].

PROFESSOR: Sorry?

AUDIENCE: [INAUDIBLE].

PROFESSOR: Yeah, we're going to-- let's just look at that right now. This is the-- we call it the organ of Corti. And if you make a cross section through the cochlea, you see you're cutting that coil at various points. This is from [INAUDIBLE], who did a lot of these studies at Harvard.

This is called the basilar membrane. The smaller area here is the tectorial membrane. And little hair cells run between those two membranes.

So here, you see the organ of Corti enlarged. So there's the basilar membrane. There's the tectorial membrane.

And these are the two groups of hair cells. There's inner hair cells that receive most of the innervation that's responding to sound. And the outer hair cells have other functions that have evolved in mammals, to respond to those vibrations of the fluid.

It happens because, when you get movement of the basilar membrane, there's movement with respect to the tectorial membrane. And that causes a shearing force on those cells. The little hairs that protrude here connect the cell to the tectorial membrane.

And it's those shearing forces that these little transduction cells are responding to. They are primary sensory neurons. I'm sorry, they are not-- they are receptor cells.

The primary sensory neurons are in the cochlear nerve. They're bipolar cells. Their endings respond to the depolarization created in the receptor cells, which are these hair cells.

So the hair cells are not primary sensory neurons. We call them receptor cells. But it's the depolarization of the receptor cell membrane that acts at the endings there. They're really the dendritic endings of the eighth nerve cells.

And those dendritic endings are-- their polarization changes. And you can get action potentials generated in those axons that travel towards the hindbrain.

So this question is, how is a place code used for encoding of sound frequency? And it's basically because the position of maximum vibration along the basilar membrane here is different for sounds of different frequencies.

The high frequencies at one end, and then the low frequencies stretched out toward the other. This picture shows that. Here they show the unrolling of the cochlea.

I probably should've used it. I don't think I used this one in the book. But it's a nice one. It shows the relative amplitude of movement and how it changes for different frequencies.

So the very high frequencies vibrate the membrane best here. The low frequencies vibrate the membrane best here. You could think of them as standing waves, if you want. But all of these, of course, are very brief sounds.

And this is from data from [INAUDIBLE], where you have frequency plotted here, from very low frequencies up to almost 5,000 Hertz. And he always said kilocycles. But we changed it to Hertz. And you see the different positions of maximum vibration.

And that's how frequency is initially encoded. And intensity coding has got a similar kind of thing. But they are different. That's because different axons of the eighth nerve have different thresholds, depending on intensity. So that gives you, also, a place code for intensity.
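That position-to-frequency mapping is often summarized with Greenwood's empirical function for the cochlea. This is a standard textbook fit, not something from the lecture, and the constants used here are the commonly quoted values for the human cochlea.

```python
import math

def greenwood_frequency(x, A=165.4, a=2.1, k=0.88):
    """Best frequency (Hz) at fractional distance x along the basilar
    membrane (0 = apex, 1 = base), per Greenwood's empirical fit for humans."""
    return A * (10 ** (a * x) - k)

# Low frequencies map near the apex, high frequencies near the base,
# spanning roughly the human hearing range (~20 Hz to ~20 kHz):
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:.2f}  ->  {greenwood_frequency(x):8.0f} Hz")
```

The exponential form is the point: each step along the membrane multiplies the best frequency by roughly a constant factor, so equal distances correspond to roughly equal musical intervals, which is why the electrode penetrations through the cochlear nuclei described next show such an orderly progression of best frequencies.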

Now, if you go to the cochlear nuclei, as I've diagrammed here-- dorsal and ventral cochlear nuclei-- and you put an electrode up here in a cat, and you just penetrate, going straight down through the cochlear nuclei, the best frequency-- that is the frequency at which you can use the lowest amplitude of sound and get responses in a cell-- changes very systematically. It's a very precise representation of frequency. And that is interpreted this way.

Here's the eighth nerve coming in to the ventral end of the ventral cochlear nucleus. There's the dorsal cochlear nucleus. And axons from different parts of the basilar membrane-- that is, axons contacting receptor cells at different positions along the basilar membrane-- terminate in different places, depending on which part of the basilar membrane they come from. So it's really a topography that's formed, not unlike the retinotectal topography, except this is one-dimensional instead of two-dimensional.

And note that an axon representing mainly one frequency will terminate by forming branches that stay in one plane, going rostral to caudal. And then it does the same thing again in the dorsal cochlear nucleus. So it does a very similar thing in the two nuclei. And that's how you get that topographic representation of sound frequencies.

So this is just what I said. And what I want to do, then, is follow these channels of conduction through-- we're out of time. But I want to make a little more sense of this kind of diagram, of these different lemniscal channels going forward from the cochlear nuclei and trapezoid body, what's happening in the trapezoid body, and then what's happening to the axons going rostrally from these nuclei and from the trapezoid body.

So we'll do that next time, and go as far as we can with the auditory system.