Title: A Whole Brain EEG Analysis of Musicianship
Issue Date: 5-Sep-2019
Abstract: The neural activation patterns provoked in response to music listening can reveal whether a subject did or did not receive music training. In the current exploratory study, we have approached this two-group (musicians and nonmusicians) classification problem through a computational framework composed of the following steps: acoustic feature extraction; acoustic feature selection; trigger selection; EEG signal processing; and multivariate statistical analysis. We are particularly interested in analyzing the brain data on a global level, considering its activity registered in electroencephalogram (EEG) signals at a given time instant. Our experiment's results—with 26 volunteers (13 musicians and 13 nonmusicians) who listened to Hungarian Dance No. 5 by Johannes Brahms—have shown that it is possible to linearly differentiate musicians and nonmusicians with classification accuracies that range from 69.2% (test set) to 93.8% (training set), despite the limited sample sizes available. Additionally, given the whole brain vector navigation method described and implemented here, our results suggest that it is possible to highlight the most expressive and discriminant changes in the participants' brain activity patterns depending on the acoustic feature extracted from the audio.
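The abstract describes a linear two-group classification of whole-brain EEG feature vectors. As an illustration only—the paper's exact pipeline and data are not reproduced here—the sketch below applies Fisher's linear discriminant to synthetic stand-in feature vectors; the group sizes (13 per group) match the study, but the feature dimension, data distributions, and regularization constant are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic "EEG feature vectors": 13 musicians and 13
# nonmusicians, each a small feature vector (stand-in for whole-brain samples).
n_per_group, n_features = 13, 4
musicians = rng.normal(loc=0.5, scale=1.0, size=(n_per_group, n_features))
nonmusicians = rng.normal(loc=-0.5, scale=1.0, size=(n_per_group, n_features))

def fisher_lda_direction(a, b):
    """Fisher's linear discriminant direction: w = Sw^-1 (mu_a - mu_b)."""
    mu_a, mu_b = a.mean(axis=0), b.mean(axis=0)
    # Pooled within-class scatter, regularized for the small sample size.
    sw = np.cov(a, rowvar=False) + np.cov(b, rowvar=False)
    sw += 1e-6 * np.eye(sw.shape[0])
    return np.linalg.solve(sw, mu_a - mu_b)

w = fisher_lda_direction(musicians, nonmusicians)
# Decision threshold: midpoint of the two projected class means.
threshold = 0.5 * ((musicians @ w).mean() + (nonmusicians @ w).mean())

# Training-set accuracy of the linear separation.
correct = ((musicians @ w) > threshold).sum() + ((nonmusicians @ w) <= threshold).sum()
accuracy = correct / (2 * n_per_group)
print(f"training accuracy: {accuracy:.3f}")
```

The projection vector `w` here also hints at the interpretability the abstract mentions: the magnitude of each component of `w` indicates how much the corresponding feature contributes to separating the two groups.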
ISSN: 0730-7829
Citation: RIBEIRO, E.; THOMAZ, C. E. A whole brain EEG analysis of musicianship. Music Perception, v. 37, n. 1, p. 42-56, 2019.
Access Type: Restricted Access
DOI: 10.1525/mp.2019.37.1.42
Appears in Collections:Artigos

