Research by Zhang and colleagues confirms that humans communicate over multiple “channels” simultaneously, some of which are difficult to replicate in virtual settings. The investigators report that they “presented video-clips of an actress producing naturalistic passages to participants while recording their electroencephalogram. We quantified multimodal cues (prosody, gestures, mouth movements) and measured their effect on a well-established electroencephalographic marker of processing load in comprehension (N400). We found that brain responses to words were affected by informativeness of co-occurring multimodal cues, indicating that comprehension relies on linguistic and non-linguistic cues. Moreover, they were affected by interactions between the multimodal cues, indicating that the impact of each cue dynamically changes based on the informativeness of other cues. Thus, results show that multimodal cues are integral to comprehension.”
Ye Zhang, Diego Frassinelli, Jyrki Tuomainen, Jeremy Skipper, and Gabriella Vigliocco. 2021. “More Than Words: Word Predictability, Prosody, Gesture and Mouth Movements in Natural Language Comprehension.” Proceedings of the Royal Society B, vol. 288, no. 1955. https://doi.org/10.1098/rspb.2021.0500.