
The primary auditory cortex (AI) is made up of highly interconnected populations of neurons that are responsible for integrating bottom-up auditory information from the lemniscal auditory pathway and top-down inputs from higher-order cortical areas. Despite this, most studies of information processing in AI focus on either single-unit spectro-temporal receptive field (STRF) estimation or paired neuronal correlation analyses, and assume that AI neurons filter auditory information either as individual entities or as pairs. Meanwhile, some recent studies have shown how populations of AI neurons can also encode auditory behavior. Determining how AI encodes information will hence require an integrated approach that combines receptive field and multi-neuronal ensemble analyses. In this dissertation, I show that I can accurately detect coordinated neuronal ensembles (cNEs), which we define as groups of neurons that have reliable synchronous activity, in AI. These cNEs are meaningful constructs that are active in both spontaneous and evoked activity, and their synchronous evoked activity cannot be trivially explained by receptive field overlap. cNEs come in two flavors: one enhances stimulus representation over single neurons or over simultaneously recorded random groups of neurons of the same size, while the other does not represent spectro-temporal features at all and might instead reflect internally generated neuronal activity. Since single neurons can participate in multiple cNEs over the course of a recording, I also show that neurons can multiplex information, encoding slightly different spectro-temporal information (if they encode spectro-temporal information at all) when associated with different cNEs. Taken together, the enhancement of information processing and the reliability of stimulus representation by cNEs suggest that cNEs should be considered the principal unit of information processing in AI.

Figure: A coronal section of the left hemisphere, showing the primary auditory cortex (red) as well as surrounding auditory regions (blue and purple).

The auditory cortex is found in the temporal lobe. Most of it is hidden from view, buried deep within a fissure called the lateral sulcus. Some auditory cortex is visible on the external surface of the brain, however, as it extends to a gyrus called the superior temporal gyrus. The auditory cortex can be subdivided into multiple regions, although there is still some question about the most appropriate way to create those subdivisions in the human brain. There is general agreement, however, that the auditory cortex consists of a primary area, often referred to as the core region, as well as multiple non-primary areas. The primary auditory cortex in humans is hidden within the lateral sulcus on a collection of gyri known as Heschl’s gyri (aka the transverse temporal gyri). The precise location of the primary region in humans is variable, however, as is the arrangement of Heschl’s gyri (some people have one of these gyri, while others have two or three). For example, in some individuals the primary auditory cortex seems to occupy one Heschl’s gyrus, while in others it may extend past that gyrus into a neighboring sulcus (or beyond). The region adjacent to the core is often referred to as the belt region, and surrounding that is an area often called the parabelt region. These neighboring areas are mostly buried within the lateral sulcus as well, but may extend out to the superior temporal gyrus. The demarcations of the auditory cortex in general, however, are imprecise.

What is the auditory cortex and what does it do?

The auditory cortex plays a critical role in our ability to perceive sound. It is thought to be integral to our perception of the fundamental aspects of an auditory stimulus, like the pitch of the sound. But it is also important in various other aspects of sound processing, like determining where in space a sound originates from as well as identifying what might be producing the sound. The auditory cortex primarily receives auditory information from a nucleus in the thalamus called the medial geniculate nucleus, which is where all incoming information about hearing is sent before it is processed by the cerebral cortex. Damage to the auditory cortex can disrupt various facets of auditory perception. For example, damage (e.g., from a stroke) might cause deficits in the ability to detect changes in pitch, localize sounds in space, or understand speech. The auditory cortex is also thought to be involved in higher-level auditory processing, such as recognizing aspects of sound that are specific to speech.
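For readers unfamiliar with the STRF estimation mentioned in this section, the simplest estimator is the spike-triggered average: the mean of the stimulus spectrogram in a window preceding each spike. The sketch below is a minimal illustration of that idea, not the estimator used in this dissertation; the function name and array layout are my own assumptions.

```python
import numpy as np

def strf_sta(spectrogram, spike_counts, n_lags):
    """Estimate an STRF as a spike-triggered average.

    spectrogram  : (n_freqs, n_bins) stimulus power, binned to match spikes
    spike_counts : (n_bins,) spike count in each time bin
    n_lags       : number of stimulus bins preceding a spike to include
    Returns an (n_freqs, n_lags) array: the mean stimulus window that
    preceded a spike, with columns ordered oldest to most recent.
    """
    n_freqs, n_bins = spectrogram.shape
    sta = np.zeros((n_freqs, n_lags))
    n_spikes = 0
    for t in range(n_lags, n_bins):
        if spike_counts[t] > 0:
            # weight each preceding stimulus window by the spike count
            sta += spike_counts[t] * spectrogram[:, t - n_lags:t]
            n_spikes += spike_counts[t]
    return sta / max(n_spikes, 1)
```

In practice the STA is usually computed on long recordings and often refined with regularized regression, but the averaging step above is the core of the technique.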

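The text does not specify how cNEs are detected. One common approach in the ensemble-detection literature, shown here only as an illustrative sketch and not as the author's method, z-scores each neuron's binned spike train and compares the eigenvalues of the resulting neuron-by-neuron correlation matrix against the Marchenko-Pastur upper bound for random matrices; eigenvalues above the bound mark candidate ensembles, and their eigenvectors weight the member neurons. All names below are hypothetical.

```python
import numpy as np

def detect_cnes(spike_matrix):
    """Return (n_candidate_ensembles, neuron_weight_vectors).

    spike_matrix : (n_neurons, n_bins) array of binned spike counts.
    Eigenvalues of the correlation matrix that exceed the Marchenko-Pastur
    bound reflect more shared variance than expected by chance alone.
    """
    n_neurons, n_bins = spike_matrix.shape
    # z-score each neuron so firing-rate differences do not dominate
    mu = spike_matrix.mean(axis=1, keepdims=True)
    sd = spike_matrix.std(axis=1, keepdims=True)
    sd[sd == 0] = 1.0  # guard against silent neurons
    z = (spike_matrix - mu) / sd
    corr = z @ z.T / n_bins  # neuron-by-neuron correlation matrix
    eigvals, eigvecs = np.linalg.eigh(corr)
    # Marchenko-Pastur upper bound for eigenvalues of a random correlation
    # matrix built from n_bins samples of n_neurons independent variables
    lambda_max = (1 + np.sqrt(n_neurons / n_bins)) ** 2
    keep = eigvals > lambda_max
    return int(keep.sum()), eigvecs[:, keep]
```

With two groups of perfectly synchronized neurons, the function flags two significant eigenvalues, one per group; real pipelines typically follow this significance test with ICA or a similar rotation to recover the individual ensemble weights.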