
Gender differences in emotion Stroop tasks (Lin et al., 2021)

Dataset posted on 2021-09-22, 17:41, authored by Yi Lin, Hongwei Ding, and Yang Zhang.
Purpose: This study aimed to examine the Stroop effects of verbal and nonverbal cues and their relative impacts on gender differences in unisensory and multisensory emotion perception.
Method: Experiment 1 investigated how well 88 normal Chinese adults (43 women and 45 men) could identify emotions conveyed through face, prosody, and semantics as three independent channels. Experiments 2 and 3 further explored gender differences during multisensory integration of emotion through a cross-channel (prosody-semantics) and a cross-modal (face-prosody-semantics) Stroop task, respectively, in which 78 participants (41 women and 37 men) were asked to selectively attend to one of the two or three communication channels.
Results: Accuracy and reaction time data together indicated that paralinguistic cues of emotion (i.e., face and prosody) were consistently more salient than linguistic ones (i.e., semantics) throughout the study. Women showed advantages in processing all three types of emotional signals in the unisensory task but, during multisensory perception, retained their advantage only in paralinguistic processing and showed greater Stroop effects of nonverbal cues on verbal ones.
Conclusions: These findings demonstrate clear gender differences in verbal and nonverbal emotion perception that are modulated by sensory channels, which have important theoretical and practical implications.

Supplemental Material S1. The full models with intercepts, coefficients, and error terms for accuracy and reaction time analyses in Experiments 1, 2, and 3.

Supplemental Table S1. Words for semantic stimuli in Experiment 1.

Supplemental Table S2. Words for prosodic stimuli in Experiment 1.

Supplemental Table S3. Generalized linear mixed-effects model with gender and task as the fixed effects, and accuracy as the dependent variable for target emotions (happiness and sadness) in Experiment 1 (pairwise contrasts are indented).

Supplemental Table S4. Generalized linear mixed-effects model with gender and task as the fixed effects, and reaction time as the dependent variable for target emotions (happiness and sadness) in Experiment 1 (pairwise contrasts are indented).

Supplemental Table S5. Summary of generalized linear mixed-effects models with both target (happiness and sadness) and filler (anger and neutrality) trials in Experiment 1 (pairwise contrasts are indented).

Supplemental Table S6. Spoken stimuli adopted in Experiments 2 and 3.

Supplemental Table S7. Generalized linear mixed-effects model with gender, task, and congruence as the fixed effects, and accuracy as the dependent variable in Experiment 2 (pairwise contrasts are indented).

Supplemental Table S8. Generalized linear mixed-effects model with gender, task, and congruence as the fixed effects, and reaction time as the dependent variable in Experiment 2 (pairwise contrasts are indented).

Supplemental Table S9. Generalized linear mixed-effects model with gender, task, and congruence as the fixed effects, and accuracy as the dependent variable in Experiment 3 (pairwise contrasts are indented).

Supplemental Table S10. Generalized linear mixed-effects model with gender, task, and congruence as the fixed effects, and reaction time as the dependent variable in Experiment 3 (pairwise contrasts are indented).

Supplemental Table S11. Summary of within-participant analyses in Experiments 2 and 3.
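For readers who wish to reproduce analyses of the kind summarized in Supplemental Tables S3 through S10, the short Python sketch below illustrates how mixed-effects models with gender, task, and congruence as fixed effects and a by-participant random intercept might be specified with the statsmodels package. The column names, file name, and random-effects structure are assumptions made for illustration only; they are not taken from the released files, and the published analyses may have used different software or model specifications.

# A minimal, hypothetical sketch of the model structure described above.
# Column names (participant, gender, task, congruence, accuracy, rt) and the
# file name are illustrative assumptions, not the authors' actual variables.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

data = pd.read_csv("experiment2_trials.csv")  # hypothetical long-format trial data

# Reaction time: linear mixed-effects model with gender, task, and congruence
# as fixed effects and a by-participant random intercept.
rt_model = smf.mixedlm("rt ~ gender * task * congruence",
                       data=data, groups=data["participant"])
print(rt_model.fit().summary())

# Accuracy (0/1): binomial mixed-effects model with the same fixed effects
# and a by-participant random intercept, fit by variational Bayes.
acc_model = BinomialBayesMixedGLM.from_formula(
    "accuracy ~ gender * task * congruence",
    {"participant": "0 + C(participant)"},
    data)
print(acc_model.fit_vb().summary())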

Lin, Y., Ding, H., & Zhang, Y. (2021). Unisensory and multisensory Stroop effects modulate gender differences in verbal and nonverbal emotion perception. Journal of Speech, Language, and Hearing Research. Advance online publication. https://doi.org/10.1044/2021_JSLHR-20-00338

Funding

H. Ding and Y. Zhang were supported by the Major Program of the National Social Science Foundation of China (No. 18ZDA293). Y. Zhang received additional support from the University of Minnesota’s Grand Challenges Exploratory Research Grant.
