ASHA journals

Audiovisual metrics of affect production in autism (Kothare et al., 2024)

posted on 2024-12-19, 21:04 authored by Hardik Kothare, Vikram Ramanarayanan, Michael Neumann, Jackson Liscombe, Vanessa Richter, Linnea Lampinen, Alison Bai, Cristian Preciado, Katherine Brogan, Carly Demopoulos

Purpose: We investigate the extent to which automated audiovisual metrics extracted during an affect production task show statistically significant differences between a cohort of children diagnosed with autism spectrum disorder (ASD) and typically developing controls.

Method: Forty children with ASD and 21 neurotypical controls interacted with a multimodal conversational platform with a virtual agent, Tina, who guided them through tasks prompting facial and vocal communication of four emotions—happy, angry, sad, and afraid—under conditions of high and low verbal and social cognitive task demands.

Results: Children with ASD exhibited a greater standard deviation of the fundamental frequency of the voice, with the minima and maxima of the pitch contour occurring earlier than in controls. The intensity and voice quality of emotional speech also differed between the two cohorts in certain conditions. Additionally, facial metrics capturing the acceleration of the lower lip, lip width, eye opening, and vertical displacement of the eyebrows distinguished children with ASD from neurotypical controls. Both facial and speech metrics yielded group classification accuracy well above chance.
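As an illustration of the kind of pitch-contour summary statistics reported above (standard deviation of F0 and the timing of the contour's minimum and maximum), the sketch below shows one plausible way to compute them from a sequence of voiced-frame F0 values. This is a hypothetical helper for illustration only, not the actual feature extraction used by the Modality platform; the function name, the frame rate, and the assumption that unvoiced frames are already removed are all assumptions.

```python
import numpy as np

def pitch_contour_metrics(f0, frame_rate=100.0):
    """Summarize a fundamental-frequency (F0) contour.

    f0         : 1-D sequence of F0 values in Hz, one per voiced frame
                 (unvoiced frames assumed already removed).
    frame_rate : analysis frames per second (assumed 100 Hz here).

    Returns (sd_f0, t_min, t_max): the standard deviation of F0 and the
    times in seconds at which the contour's minimum and maximum occur.
    """
    f0 = np.asarray(f0, dtype=float)
    sd_f0 = float(np.std(f0))                 # population SD of the contour
    t_min = float(np.argmin(f0)) / frame_rate # time of the pitch minimum
    t_max = float(np.argmax(f0)) / frame_rate # time of the pitch maximum
    return sd_f0, t_min, t_max

# Example with a short synthetic contour (four 10-ms frames):
sd, t_min, t_max = pitch_contour_metrics([200.0, 220.0, 180.0, 210.0])
```

Group differences such as "extrema occurring earlier" could then be tested by comparing `t_min` and `t_max` (typically normalized by utterance duration) across cohorts.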

Conclusion: Speech acoustic and facial metrics associated with affect production were effective in distinguishing between children with ASD and neurotypical controls.

Supplemental Material S1. Glossary of metrics extracted by the Modality platform.

Kothare, H., Ramanarayanan, V., Neumann, M., Liscombe, J., Richter, V., Lampinen, L., Bai, A., Preciado, C., Brogan, K., & Demopoulos, C. (2024). Vocal and facial behavior during affect production in autism spectrum disorder. Journal of Speech, Language, and Hearing Research. Advance online publication. https://doi.org/10.1044/2024_JSLHR-23-00080

Funding

This study was supported by National Institutes of Health Grants K23 DC016637 and R01DC019167, Autism Speaks Grant 11637, and UCSF Weill Institute for Neuroscience Weill Award for Clinical Neuroscience Research awarded to Carly Demopoulos.
