Title: 

Developing an Animal Model to Study Auditory-Motor Interactions During Rhythm Perception

Funding detail: 
NIH R21
Institution: 
Tufts University
Principal Investigator: 
Mimi Hsiao Feng Kao, PhD
Project summary: 

One of the most familiar aspects of music is the beat: a steady perceived pulse to which we readily clap our hands or tap our feet. Beat perception is a core component of music cognition, involving detection of periodicity in auditory rhythms and precise temporal prediction of future events. This ability to detect and predict the beat is central to music's beneficial effects in a variety of neurological disorders: improving phonological processing in dyslexia, enhancing language recovery after stroke, and normalizing gait in Parkinson's disease. Yet the neural mechanisms underlying beat perception are not well understood, and progress is impeded by the lack of an animal model that allows precise measurement and manipulation of neural circuits during beat perception. Human neuroimaging studies indicate that beat perception strongly engages the motor planning system, even when the listener is not moving or preparing to move. One current hypothesis is that the motor planning system actively predicts the timing of beats and communicates these predictions to the auditory system via tight two-way signaling between these brain regions.

Here, we propose to take advantage of the well-described auditory-motor circuits in vocal-learning songbirds, leveraging the mechanistic studies possible in an animal model to investigate the neural mechanisms of beat perception. Like humans (and unlike non-human primates), vocal-learning birds have strong connections between motor planning regions and auditory regions due to their reliance on complex, learned vocal sequences for communication, and auditory-motor circuits in songbirds and humans share many structural and functional parallels.
In addition, like humans, songbirds can learn to anticipate the timing of rhythmic vocalizations of a partner and adjust the timing of their own vocalizations to avoid overlapping signals, suggesting that they can detect and exploit temporal regularity to predict upcoming sensory events. The proposed experiments will first test the ability of songbirds to recognize temporal regularity in beat-based auditory rhythms, and then test the hypothesis that motor signals are necessary for beat-based perception. These experiments lay the groundwork for future electrophysiological studies of auditory-motor interactions in beat perception. Establishing an animal model for beat perception will be transformative for music neuroscience, allowing mechanistic investigation of the neural code underlying temporal prediction and informing rhythm-based musical interventions to enhance function in normal and disease states.

For more information on this project, see its entry in the NIH Research Portfolio.