Published in: Brain Topography 3/2015

01-05-2015 | Original Paper

Brain Prediction of Auditory Emphasis by Facial Expressions During Audiovisual Continuous Speech

Authors: Kuzma Strelnikov, Jessica Foxton, Mathieu Marx, Pascal Barone


Abstract

The visual cues involved in auditory speech processing are not restricted to information from lip movements; they also include head and chin gestures and facial expressions such as eyebrow movements. Because these visual gestures precede the auditory signal, visual information may influence auditory activity. For audiovisual syllables, the visual stimuli are so close in time to the auditory information that the cortical response to them usually overlaps with the response to the auditory stimulation; the neural dynamics underlying visual facilitation of continuous speech therefore remain unclear. In this study, we used a three-word phrase to study continuous speech processing. We presented video clips of phrases spoken evenly (without emphasis) as the frequent stimuli, and phrases in which the speaker visually emphasized one word as the infrequent stimuli. A negativity in the resulting ERPs was detected after the onset of the emphasizing articulatory movements but before the auditory stimulus, a finding confirmed by statistical comparisons of the audiovisual and visual stimulation. No such negativity was present in the control visual-only condition. This negativity was observed to propagate from the visual to the fronto-temporal electrodes. Thus, in continuous speech, the visual modality evokes predictive coding for the auditory speech, which the cerebral cortex analyses in the context of the phrase even before the arrival of the corresponding auditory signal.
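The core of the oddball analysis described above is a difference wave: epochs for the infrequent (emphasized) stimuli are averaged and compared against the average for the frequent (even) stimuli, and a sustained negative deflection in that difference is the "negativity" of interest. As a minimal illustration only (not the authors' analysis pipeline; array shapes, trial counts, and the simulated deflection window are all assumptions for the toy data), the computation can be sketched in Python with NumPy:

```python
import numpy as np

def erp_difference_wave(standard_epochs, deviant_epochs):
    """Deviant-minus-standard difference wave.

    Both inputs have shape (n_trials, n_channels, n_samples).
    Averaging across trials yields one ERP per condition; their
    difference, shape (n_channels, n_samples), is where an oddball
    design looks for a negativity.
    """
    standard_erp = standard_epochs.mean(axis=0)
    deviant_erp = deviant_epochs.mean(axis=0)
    return deviant_erp - standard_erp

# Toy demonstration: deviant trials carry an extra negative
# deflection in samples 40-60, standing in for a pre-auditory
# negativity evoked by the emphasizing visual gesture.
rng = np.random.default_rng(0)
shape = (200, 4, 100)  # hypothetical: 200 trials, 4 channels, 100 samples
standard = rng.normal(0.0, 1.0, shape)
deviant = rng.normal(0.0, 1.0, shape)
deviant[:, :, 40:60] -= 2.0  # simulated negativity

diff = erp_difference_wave(standard, deviant)
print(diff.shape)                    # (4, 100)
print(diff[:, 40:60].mean() < -1.0)  # the deflection survives averaging
```

Averaging over many trials suppresses the trial-to-trial noise (its standard error shrinks roughly with the square root of the trial count), which is why the simulated deflection is cleanly visible in the difference wave even though single trials are noise-dominated.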
Metadata
Publisher: Springer US
Print ISSN: 0896-0267
Electronic ISSN: 1573-6792
DOI: https://doi.org/10.1007/s10548-013-0338-2
