Persona Digital Music

The Musical Brain

Sound Processing in the Brain


Sound waves traverse the ear canal and deflect the tympanic membrane, which causes the small bones of the middle ear to produce fluid waves in the cochlea. The organ of Corti in the cochlea converts the fluid waves into nerve impulses. In a healthy ear, the movement of the hair cells that detect fluid waves is nonlinear: outer hair cells in each region of the long organ of Corti amplify sound of a particular frequency, so that each region is exquisitely tuned to one frequency and not to other frequencies.
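The frequency-to-place tuning described above can be illustrated numerically. The sketch below is my illustration, not from the book: it uses the Greenwood function, a standard approximation that maps position along the human basilar membrane to the frequency that region responds to best. The constants are the commonly cited human values and should be treated as assumptions of this example.

```python
def greenwood_place_to_freq(x, A=165.4, a=2.1, k=0.88):
    """Best frequency (Hz) at relative cochlear position x.

    x runs from 0.0 at the apex (low frequencies) to 1.0 at the
    base (high frequencies). Constants are Greenwood's human fit.
    """
    return A * (10 ** (a * x) - k)

# Sample the map at a few positions along the organ of Corti.
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:.2f} -> best frequency {greenwood_place_to_freq(x):8.0f} Hz")
```

Running the loop shows the tuning sweeping from roughly 20 Hz at the apex to roughly 20 kHz at the base, matching the range of human hearing.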

Temporal Lobes

Musical sounds are processed in the temporal lobes in humans, especially in the left planum temporale. Pitch recognition is a built-in talent that is not uniformly inherited; some humans are described as “tone deaf” when they cannot identify or reproduce pitches they hear. Pitch, like color, is a subjective experience that correlates more or less with the wave frequency of the source. Sounds have a fundamental frequency with timbral harmonics as multiples of the fundamental. The same pitched note played on a trumpet and a piano can be readily identified even though the waveforms on an oscilloscope are quite different. Bendor and Wang discovered neurons near the anterolateral border of the primary auditory cortex in marmoset monkeys that respond to both pure tones and missing-fundamental complex tones of the same pitch, providing a neural correlate for pitch constancy. They stated: “Pitch perception is critical for identifying auditory objects, in music and speech. Pitch is the subjective attribute of a sound's fundamental frequency (f0) determined by the temporal regularity and average repetition rate of its waveform. Spectrally dissimilar sounds can have the same pitch if they share a common f0.”
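The quoted claim, that spectrally dissimilar sounds share a pitch when they share an f0, can be demonstrated numerically. The sketch below is my illustration, not from the book; the harmonic weights and the simple autocorrelation pitch estimator are assumptions of the example. It synthesizes two tones with very different spectra, one of which lacks its fundamental component entirely, and recovers roughly the same 220 Hz pitch from both.

```python
import numpy as np

SR = 16000   # sample rate (Hz), an arbitrary choice for this demo
F0 = 220.0   # shared fundamental frequency (the A below middle C)

def harmonic_tone(f0, weights, dur=0.25, sr=SR):
    """Sum of sinusoids at integer multiples of f0; weights[k] scales harmonic k+1."""
    t = np.arange(int(dur * sr)) / sr
    return sum(w * np.sin(2 * np.pi * f0 * (k + 1) * t)
               for k, w in enumerate(weights))

def estimate_f0(x, sr=SR, fmin=80.0, fmax=1000.0):
    """Crude pitch estimate: sr / (lag of the autocorrelation peak).

    The search is restricted to lags corresponding to plausible
    pitches between fmin and fmax.
    """
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

# Two spectrally dissimilar tones: one with harmonics 1-4, one with
# the fundamental absent (harmonics 3-6 only). Both repeat at 220 Hz.
bright = harmonic_tone(F0, [1.0, 0.6, 0.4, 0.3])
missing_fundamental = harmonic_tone(F0, [0.0, 0.0, 1.0, 0.8, 0.6, 0.5])

print(round(estimate_f0(bright)), round(estimate_f0(missing_fundamental)))
```

Both estimates come out within a few hertz of 220, even though the second tone contains no energy at 220 Hz at all, which is the “missing fundamental” effect underlying pitch constancy.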

The auditory system is organized into spatial and nonspatial processing streams. In the monkey, the posterolateral auditory cortex is more responsive to the spatial features of stimuli than the anterolateral region, which is more selective for vocalizations. Single neurons in these cortical areas respond differentially to features of the auditory input. Neurons selectively responsive to vocalizations were found in the ventral prefrontal cortex; neurons responsive to spatial features were found in the dorsal prefrontal cortex. The responsiveness of auditory neurons in both the prefrontal and parietal cortices depends on the significance of the stimulus. The superior temporal sulcus in humans exhibits selective activation for voices.

Rama et al used functional magnetic resonance imaging to study working memory for the location and identity of human voices. They found a preferential response to the voice identity task in the ventral prefrontal cortex and suggested that during tasks using auditory working memory, maintenance of spatial and nonspatial information preferentially modulates activity in a dorsal and a ventral auditory pathway, respectively.

Platel et al used PET scanning to study the cerebral activation of volunteers performing four tasks: selective attention to pitch, timbre, and rhythm, and semantic familiarity with tunes. They observed that the left hemisphere was more active for the familiarity, pitch, and rhythm tasks, while the right hemisphere was more active for the timbre task. The familiarity task activated the left inferior frontal gyrus and superior temporal gyrus. Activations for the pitch task were observed in the left cuneus/precuneus. The rhythm task activated the left inferior Broca's area with extension into the neighboring insula, suggesting the processing of sequential sounds.

There is an overlap in the pathways that process speech and musical sounds. There are convergent brain processors that can accept many kinds of input and use all the output options available to communicate. Koelsch et al, for example, suggest that “both music and language can prime the meaning of a word, and that music can, as language, determine semantic processing.”

The general plan of communication using sounds and written symbols involves a supramodal, movement-modeling capacity that can create and retain schemas of action in the world. Some of these schemas are expressions we refer to as emotions, some as language, and some as music. It is not surprising that these three modes often merge in the most dramatic and moving forms of human communication.

The posterior parietal cortex (PPC) in the macaque monkey integrates polymodal sensory information for object recognition and manipulation. According to Grefkes et al: “An area in the human anterior parietal cortex is activated when healthy subjects perform a crossmodal visuo-tactile delayed matching-to-sample task with objects…activity in this area was further enhanced when subjects transferred object information between modalities (crossmodal matching).”

The Musical Brain and other topics presented at Persona Digital Studio
are from the book, The Sound of Music by Stephen Gislason.


Persona Digital Studio is located on the Sunshine Coast, Sechelt, British Columbia, Canada.

All arrangements, performances and recordings are completed in house by Stephen Gislason.