Musical Emotion Categorization with Vocoders of Varying Temporal and Spectral Content

Eleanor E Harding*, Etienne Gaudrain, Imke J Hrycyk, Robert L Harris, Barbara Tillmann, Bert Maat, Rolien H Free, Deniz Başkent

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review

3 Citations (Scopus)
72 Downloads (Pure)

Abstract

While previous research investigating music emotion perception of cochlear implant (CI) users observed that temporal cues informing tempo largely convey emotional arousal (relaxing/stimulating), it remains unclear how other properties of the temporal content may contribute to the transmission of arousal features. Moreover, while detailed spectral information related to pitch and harmony in music, which is often not well perceived by CI users, reportedly conveys emotional valence (positive, negative), it remains unclear how the quality of spectral content contributes to valence perception. Therefore, the current study used vocoders to vary the temporal and spectral content of music and tested music emotion categorization (joy, fear, serenity, sadness) in 23 normal-hearing participants. Vocoders were varied with two carriers (sinewave or noise; primarily modulating temporal information) and two filter orders (low or high; primarily modulating spectral information). Results indicated that emotion categorization was above chance in vocoded excerpts but poorer than in a non-vocoded control condition. Among vocoded conditions, better temporal content (sinewave carriers) improved emotion categorization with a large effect, while better spectral content (high filter order) improved it with a small effect. Arousal features were comparably transmitted in non-vocoded and vocoded conditions, indicating that lower temporal content successfully conveyed emotional arousal. Valence feature transmission steeply declined in vocoded conditions, revealing that valence perception was difficult for both lower and higher spectral content. The reliance on arousal information for emotion categorization of vocoded music suggests that efforts to refine temporal cues in the CI user signal may immediately benefit their music emotion perception.

Original language: English
Pages (from-to): 1-19
Number of pages: 19
Journal: Trends in Hearing
Volume: 27
DOIs
Publication status: Published - Jan 2023

Keywords

  • Humans
  • Auditory Perception
  • Music
  • Cochlear Implants
  • Cochlear Implantation
  • Emotions
