Accurate Decoding of Imagined and Heard Melodies.

Title: Accurate Decoding of Imagined and Heard Melodies.
Publication Type: Journal Article
Year of Publication: 2021
Authors: Di Liberto GM, Marion G, Shamma SA
Journal: Front Neurosci
Volume: 15
Pagination: 673401
Date Published: 2021
ISSN: 1662-4548
Abstract

Music perception requires the human brain to process a variety of acoustic and music-related properties. Recent research used encoding models to tease apart and study the various cortical contributors to music perception. To do so, such approaches study temporal response functions that summarise the neural activity over several minutes of data. Here we tested the possibility of assessing the neural processing of individual musical units (bars) with electroencephalography (EEG). We devised a decoding methodology based on a maximum correlation metric across EEG segments and used it to decode melodies from EEG based on an experiment where professional musicians listened to and imagined four Bach melodies multiple times. We demonstrate here that accurate decoding of melodies in single subjects and at the level of individual musical units is possible, both from EEG signals recorded during listening and imagination. Furthermore, we find that greater decoding accuracies are measured for the maximum-correlation method than for an envelope reconstruction approach based on backward temporal response functions. These results indicate that low-frequency neural signals encode information beyond note timing, especially cortical signals below 1 Hz, which are shown to encode pitch-related information. Along with the theoretical implications of these results, we discuss the potential applications of this decoding methodology in the context of novel brain-computer interface solutions.
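To give a concrete sense of the decoding idea, the following is a minimal Python sketch of a maximum-correlation decoder of the kind the abstract describes: a held-out EEG segment is assigned to the candidate musical bar whose template response correlates with it most strongly. The function and variable names, the synthetic data, and the specific metric (Pearson correlation over the flattened time-by-channel segment) are illustrative assumptions for this sketch, not the authors' implementation or their cross-validation scheme.

import numpy as np

def max_corr_decode(test_segment, templates):
    # Assign a test EEG segment (time x channels) to the candidate whose
    # template response has the highest Pearson correlation with it.
    # templates: dict mapping candidate label -> array of shape (time, channels),
    # e.g. the average EEG response to each bar, estimated from other trials.
    x = test_segment.ravel()
    x = x - x.mean()
    best_label, best_r = None, -np.inf
    for label, tpl in templates.items():
        y = tpl.ravel()
        y = y - y.mean()
        r = float(x @ y / (np.linalg.norm(x) * np.linalg.norm(y) + 1e-12))
        if r > best_r:
            best_label, best_r = label, r
    return best_label, best_r

# Illustrative use with synthetic data: 4 candidate bars, 2 s of 64-channel EEG at 64 Hz.
rng = np.random.default_rng(0)
templates = {f"bar_{i}": rng.standard_normal((128, 64)) for i in range(4)}
test = templates["bar_2"] + 0.5 * rng.standard_normal((128, 64))  # noisy repeat of bar_2
print(max_corr_decode(test, templates))

In this toy setup the decoder recovers "bar_2"; in the study, decoding accuracy of such a correlation-based classifier was compared against envelope reconstruction with backward temporal response functions.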

DOI: 10.3389/fnins.2021.673401
Alternate Journal: Front Neurosci
PubMed ID: 34421512
PubMed Central ID: PMC8375770