Audiovisual speech recognition : correspondence between brain and behavior / topic editor Nicholas Altieri.

Perceptual processes mediating recognition, including the recognition of objects and spoken words, are inherently multisensory. This is true despite the fact that sensory inputs are segregated in the early stages of neuro-sensory encoding. In face-to-face communication, for example, auditory informat...


Bibliographic Details
Contributor: Altieri, Nicholas
Place / Publishing House: Lausanne, Switzerland : Frontiers Media SA, 2014.
©2014
Year of Publication:2014
Language:English
Series:Frontiers Research Topics
Physical Description: 1 online resource (101 pages) : illustrations, charts; digital, PDF file(s).
Notes:
  • Bibliographic Level Mode of Issuance: Monograph
  • Published in Frontiers in Psychology.
id 993547287204498
ctrlnum (CKB)3710000000504555
(SSID)ssj0001664972
(PQKBManifestationID)16454261
(PQKBTitleCode)TC0001664972
(PQKBWorkID)14999642
(PQKB)11704633
(WaSeSS)IndRDA00055884
(oapen)https://directory.doabooks.org/handle/20.500.12854/41553
(EXLCZ)993710000000504555
collection bib_alma
record_format marc
spelling Nicholas Altieri auth
Audiovisual speech recognition [electronic resource] : correspondence between brain and behavior / topic editor Nicholas Altieri.
Frontiers Media SA 2014
Lausanne, Switzerland : Frontiers Media SA, 2014.
©2014
1 online resource (101 pages) : illustrations, charts; digital, PDF file(s).
text txt rdacontent
computer c rdamedia
online resource cr rdacarrier
text file rda
Frontiers Research Topics
Bibliographic Level Mode of Issuance: Monograph
Published in Frontiers in Psychology.
Includes bibliographical references.
Perceptual processes mediating recognition, including the recognition of objects and spoken words, are inherently multisensory. This is true despite the fact that sensory inputs are segregated in the early stages of neuro-sensory encoding. In face-to-face communication, for example, auditory information is processed in the cochlea, encoded in the auditory sensory nerve, and processed in lower cortical areas. Eventually, these “sounds” are processed in higher cortical pathways, such as the auditory cortex, where they are perceived as speech. Likewise, visual information obtained from observing a talker’s articulators is encoded in lower visual pathways. Subsequently, this information undergoes processing in the visual cortex prior to the extraction of articulatory gestures in higher cortical areas associated with speech and language. As language perception unfolds, information garnered from visual articulators interacts with language processing in multiple brain regions. This occurs via visual projections to auditory, language, and multisensory brain regions. The association of auditory and visual speech signals makes the speech signal a highly “configural” percept. An important direction for the field is thus to provide ways to measure the extent to which visual speech information influences auditory processing and, likewise, to assess how the unisensory components of the signal combine to form a configural/integrated percept. Numerous behavioral measures, such as accuracy (e.g., percent correct, susceptibility to the “McGurk effect”) and reaction time (RT), have been employed to assess multisensory integration ability in speech perception. Neural-based measures such as fMRI, EEG, and MEG, on the other hand, have been employed to examine the locus and/or time-course of integration. The purpose of this Research Topic is to find converging behavioral and neural-based assessments of audiovisual integration in speech perception.
A further aim is to investigate speech recognition ability in normal-hearing, hearing-impaired, and aging populations. As such, the purpose is to obtain neural measures from EEG as well as fMRI that shed light on the neural bases of multisensory processes, while connecting them to model-based measures of reaction time and accuracy in the behavioral domain. In doing so, we endeavor to gain a more thorough description of the neural bases and mechanisms underlying integration in higher-order processes such as speech and language recognition.
English
Creative Commons NonCommercial-NoDerivs CC by-nc-nd https://creativecommons.org/licenses/ http://journal.frontiersin.org/researchtopic/1120/audiovisual-speech-recognition-correspondence-between-brain-and-behavior
Description based on online resource; title from PDF title page (viewed on 07/08/2020)
Audiovisual Integration: An Introduction to Behavioral and Neuro-Cognitive Methods / Nicholas Altieri -- Speech Through Ears and Eyes: Interfacing the Senses With the Supramodal Brain / Virginie van Wassenhove -- Neural Dynamics of Audiovisual Speech Integration Under Variable Listening Conditions: An Individual Participant Analysis / Nicholas Altieri and Michael J. Wenger -- Gated Audiovisual Speech Identification in Silence vs. Noise: Effects on Time and Accuracy / Shahram Moradi, Björn Lidestam and Jerker Rönnberg -- Susceptibility to a Multisensory Speech Illusion in Older Persons is Driven by Perceptual Processes / Annalisa Setti, Kate E. Burke, Rose Anne Kenny and Fiona N. Newell -- How Can Audiovisual Pathways Enhance the Temporal Resolution of Time-Compressed Speech in Blind Subjects? / Ingo Hertrich, Susanne Dietrich and Hermann Ackermann -- Audio-Visual Onset Differences are used to Determine Syllable Identity for Ambiguous Audio-Visual Stimulus Pairs / Sanne ten Oever, Alexander T. Sack, Katherine L. Wheat, Nina Bien and Nienke van Atteveldt -- Brain Responses and Looking Behavior During Audiovisual Speech Integration in Infants Predict Auditory Speech Comprehension in the Second Year of Life / Elena V. Kushnerenko, Przemyslaw Tomalski, Haiko Ballieux, Anita Potton, Deidre Birtles, Caroline Frostick and Derek G. Moore -- Multisensory Integration, Learning, and the Predictive Coding Hypothesis / Nicholas Altieri -- The Interaction Between Stimulus Factors and Cognitive Factors During Multisensory Integration of Audiovisual Speech / Ryan A. Stevenson, Mark T. Wallace and Nicholas Altieri -- Caregiver Influence on Looking Behavior and Brain Responses in Prelinguistic Development / Heather L. Ramsdell-Hudock.
Open access Unrestricted online access star
Cognitive science.
Psychology.
Social Sciences HILCC
Models of Integration
Audiovisual speech and aging
Integration Efficiency
Multisensory language development
Visual prediction
Audiovisual integration
imaging
Altieri, Nicholas, editor.
2-88919-251-2
language English
format Electronic
eBook
author Nicholas Altieri
spellingShingle Nicholas Altieri
Audiovisual speech recognition correspondence between brain and behavior /
Frontiers Research Topics
Audiovisual Integration: An Introduction to Behavioral and Neuro-Cognitive Methods / Nicholas Altieri -- Speech Through Ears and Eyes: Interfacing the Senses With the Supramodal Brain / Virginie van Wassenhove -- Neural Dynamics of Audiovisual Speech Integration Under Variable Listening Conditions: An Individual Participant Analysis / Nicholas Altieri and Michael J. Wenger -- Gated Audiovisual Speech Identification in Silence vs. Noise: Effects on Time and Accuracy / Shahram Moradi, Björn Lidestam and Jerker Rönnberg -- Susceptibility to a Multisensory Speech Illusion in Older Persons is Driven by Perceptual Processes / Annalisa Setti, Kate E. Burke, Rose Anne Kenny and Fiona N. Newell -- How Can Audiovisual Pathways Enhance the Temporal Resolution of Time-Compressed Speech in Blind Subjects? / Ingo Hertrich, Susanne Dietrich and Hermann Ackermann -- Audio-Visual Onset Differences are used to Determine Syllable Identity for Ambiguous Audio-Visual Stimulus Pairs / Sanne ten Oever, Alexander T. Sack, Katherine L. Wheat, Nina Bien and Nienke van Atteveldt -- Brain Responses and Looking Behavior During Audiovisual Speech Integration in Infants Predict Auditory Speech Comprehension in the Second Year of Life / Elena V. Kushnerenko, Przemyslaw Tomalski, Haiko Ballieux, Anita Potton, Deidre Birtles, Caroline Frostick and Derek G. Moore -- Multisensory Integration, Learning, and the Predictive Coding Hypothesis / Nicholas Altieri -- The Interaction Between Stimulus Factors and Cognitive Factors During Multisensory Integration of Audiovisual Speech / Ryan A. Stevenson, Mark T. Wallace and Nicholas Altieri -- Caregiver Influence on Looking Behavior and Brain Responses in Prelinguistic Development / Heather L. Ramsdell-Hudock.
author_facet Nicholas Altieri
Altieri, Nicholas,
author_variant n a na
author2 Altieri, Nicholas,
author2_variant n a na
author2_role TeilnehmendeR
author_sort Nicholas Altieri
title Audiovisual speech recognition correspondence between brain and behavior /
title_sub correspondence between brain and behavior /
title_full Audiovisual speech recognition [electronic resource] : correspondence between brain and behavior / topic editor Nicholas Altieri.
title_fullStr Audiovisual speech recognition [electronic resource] : correspondence between brain and behavior / topic editor Nicholas Altieri.
title_full_unstemmed Audiovisual speech recognition [electronic resource] : correspondence between brain and behavior / topic editor Nicholas Altieri.
title_auth Audiovisual speech recognition correspondence between brain and behavior /
title_new Audiovisual speech recognition
title_sort audiovisual speech recognition correspondence between brain and behavior /
series Frontiers Research Topics
series2 Frontiers Research Topics
publisher Frontiers Media SA
Frontiers Media SA,
publishDate 2014
physical 1 online resource (101 pages) : illustrations, charts; digital, PDF file(s).
contents Audiovisual Integration: An Introduction to Behavioral and Neuro-Cognitive Methods / Nicholas Altieri -- Speech Through Ears and Eyes: Interfacing the Senses With the Supramodal Brain / Virginie van Wassenhove -- Neural Dynamics of Audiovisual Speech Integration Under Variable Listening Conditions: An Individual Participant Analysis / Nicholas Altieri and Michael J. Wenger -- Gated Audiovisual Speech Identification in Silence vs. Noise: Effects on Time and Accuracy / Shahram Moradi, Björn Lidestam and Jerker Rönnberg -- Susceptibility to a Multisensory Speech Illusion in Older Persons is Driven by Perceptual Processes / Annalisa Setti, Kate E. Burke, Rose Anne Kenny and Fiona N. Newell -- How Can Audiovisual Pathways Enhance the Temporal Resolution of Time-Compressed Speech in Blind Subjects? / Ingo Hertrich, Susanne Dietrich and Hermann Ackermann -- Audio-Visual Onset Differences are used to Determine Syllable Identity for Ambiguous Audio-Visual Stimulus Pairs / Sanne ten Oever, Alexander T. Sack, Katherine L. Wheat, Nina Bien and Nienke van Atteveldt -- Brain Responses and Looking Behavior During Audiovisual Speech Integration in Infants Predict Auditory Speech Comprehension in the Second Year of Life / Elena V. Kushnerenko, Przemyslaw Tomalski, Haiko Ballieux, Anita Potton, Deidre Birtles, Caroline Frostick and Derek G. Moore -- Multisensory Integration, Learning, and the Predictive Coding Hypothesis / Nicholas Altieri -- The Interaction Between Stimulus Factors and Cognitive Factors During Multisensory Integration of Audiovisual Speech / Ryan A. Stevenson, Mark T. Wallace and Nicholas Altieri -- Caregiver Influence on Looking Behavior and Brain Responses in Prelinguistic Development / Heather L. Ramsdell-Hudock.
isbn 2-88919-251-2
callnumber-first B - Philosophy, Psychology, Religion
callnumber-subject BF - Psychology
callnumber-label BF463
callnumber-sort BF 3463 S64
illustrated Illustrated
dewey-hundreds 100 - Philosophy & psychology
dewey-tens 150 - Psychology
dewey-ones 153 - Mental processes & intelligence
dewey-full 153.6
dewey-sort 3153.6
dewey-raw 153.6
dewey-search 153.6
work_keys_str_mv AT nicholasaltieri audiovisualspeechrecognitioncorrespondencebetweenbrainandbehavior
AT altierinicholas audiovisualspeechrecognitioncorrespondencebetweenbrainandbehavior
status_str c
ids_txt_mv (CKB)3710000000504555
(SSID)ssj0001664972
(PQKBManifestationID)16454261
(PQKBTitleCode)TC0001664972
(PQKBWorkID)14999642
(PQKB)11704633
(WaSeSS)IndRDA00055884
(oapen)https://directory.doabooks.org/handle/20.500.12854/41553
(EXLCZ)993710000000504555
carrierType_str_mv cr
is_hierarchy_title Audiovisual speech recognition correspondence between brain and behavior /
author2_original_writing_str_mv noLinkedField
_version_ 1796648795513028608
fullrecord <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>06653cam a2200625 i 4500</leader><controlfield tag="001">993547287204498</controlfield><controlfield tag="005">20240328153451.0</controlfield><controlfield tag="006">m fs d </controlfield><controlfield tag="007">cr#mn#---|||||</controlfield><controlfield tag="008">160829s2014 sz ad fob 000 0 eng d</controlfield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(CKB)3710000000504555</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(SSID)ssj0001664972</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(PQKBManifestationID)16454261</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(PQKBTitleCode)TC0001664972</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(PQKBWorkID)14999642</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(PQKB)11704633</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(WaSeSS)IndRDA00055884</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(oapen)https://directory.doabooks.org/handle/20.500.12854/41553</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(EXLCZ)993710000000504555</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">PQKB</subfield><subfield code="d">UkMaJRU</subfield><subfield code="e">rda</subfield></datafield><datafield tag="041" ind1=" " ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="050" ind1=" " ind2="4"><subfield code="a">BF463.S64</subfield></datafield><datafield tag="082" ind1="0" ind2="0"><subfield code="a">153.6</subfield><subfield code="2">23</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Nicholas Altieri</subfield><subfield code="4">auth</subfield></datafield><datafield tag="245" 
ind1="0" ind2="0"><subfield code="a">Audiovisual speech recognition</subfield><subfield code="h">[electronic resource] :</subfield><subfield code="b">correspondence between brain and behavior /</subfield><subfield code="c">topic editor Nicholas Altieri.</subfield></datafield><datafield tag="260" ind1=" " ind2=" "><subfield code="b">Frontiers Media SA</subfield><subfield code="c">2014</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Lausanne, Switzerland :</subfield><subfield code="b">Frontiers Media SA,</subfield><subfield code="c">2014.</subfield></datafield><datafield tag="264" ind1=" " ind2="4"><subfield code="c">©2014</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">1 online resource (101 pages) :</subfield><subfield code="b">illustrations, charts; digital, PDF file(s).</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">computer</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">online resource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="347" ind1=" " ind2=" "><subfield code="a">text file</subfield><subfield code="2">rda</subfield></datafield><datafield tag="490" ind1="0" ind2=" "><subfield code="a">Frontiers Research Topics</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Bibliographic Level Mode of Issuance: Monograph</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Published in Frontiers in Psychology.</subfield></datafield><datafield tag="504" ind1=" " ind2=" "><subfield code="a">Includes bibliographical references.</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield 
code="a">Perceptual processes mediating recognition, including the recognition of objects and spoken words, is inherently multisensory. This is true in spite of the fact that sensory inputs are segregated in early stages of neuro-sensory encoding. In face-to-face communication, for example, auditory information is processed in the cochlea, encoded in auditory sensory nerve, and processed in lower cortical areas. Eventually, these “sounds” are processed in higher cortical pathways such as the auditory cortex where it is perceived as speech. Likewise, visual information obtained from observing a talker’s articulators is encoded in lower visual pathways. Subsequently, this information undergoes processing in the visual cortex prior to the extraction of articulatory gestures in higher cortical areas associated with speech and language. As language perception unfolds, information garnered from visual articulators interacts with language processing in multiple brain regions. This occurs via visual projections to auditory, language, and multisensory brain regions. The association of auditory and visual speech signals makes the speech signal a highly “configural” percept. An important direction for the field is thus to provide ways to measure the extent to which visual speech information influences auditory processing, and likewise, assess how the unisensory components of the signal combine to form a configural/integrated percept. Numerous behavioral measures such as accuracy (e.g., percent correct, susceptibility to the “McGurk Effect”) and reaction time (RT) have been employed to assess multisensory integration ability in speech perception. On the other hand, neural based measures such as fMRI, EEG and MEG have been employed to examine the locus and or time-course of integration. The purpose of this Research Topic is to find converging behavioral and neural based assessments of audiovisual integration in speech perception. 
A further aim is to investigate speech recognition ability in normal hearing, hearing-impaired, and aging populations. As such, the purpose is to obtain neural measures from EEG as well as fMRI that shed light on the neural bases of multisensory processes, while connecting them to model based measures of reaction time and accuracy in the behavioral domain. In doing so, we endeavor to gain a more thorough description of the neural bases and mechanisms underlying integration in higher order processes such as speech and language recognition.</subfield></datafield><datafield tag="546" ind1=" " ind2=" "><subfield code="a">English</subfield></datafield><datafield tag="540" ind1=" " ind2=" "><subfield code="a">Creative Commons NonCommercial-NoDerivs</subfield><subfield code="f">CC by-nc-nd</subfield><subfield code="u">https://creativecommons.org/licenses/http://journal.frontiersin.org/researchtopic/1120/audiovisual-speech-recognition-correspondence-between-brain-and-behavior</subfield></datafield><datafield tag="588" ind1=" " ind2=" "><subfield code="a">Description based on online resource; title from PDF title page (viewed on 07/08/2020)</subfield></datafield><datafield tag="505" ind1="0" ind2=" "><subfield code="a">Audiovisual Integration: An Introduction to Behavioral and Neuro-Cognitive Methods / Nicholas Altieri -- Speech Through Ears and Eyes: Interfacing the Senses With the Supramodal Brain / Virginie van Wassenhove -- Neural Dynamics of Audiovisual Speech Integration Under Variable Listening Conditions: An Individual Participant Analysis / Nicholas Altieri and Michael J. Wenger -- Gated Audiovisual Speech Identification in Silence vs. Noise: Effects on Time and Accuracy / Shahram Moradi, Björn Lidestam and Jerker Rönnberg -- Susceptibility to a Multisensory Speech Illusion in Older Persons is Driven by Perceptual Processes / Annalisa Setti, Kate E. Burke, Rose Anne Kenny and Fiona N. 
Newell -- How Can Audiovisual Pathways Enhance the Temporal Resolution of Time-Compressed Speech in Blind Subjects? / Ingo Hertrich, Susanne Dietrich and Hermann Ackermann -- Audio-Visual Onset Differences are used to Determine Syllable Identity for Ambiguous Audio-Visual Stimulus Pairs / Sanne ten Oever, Alexander T. Sack, Katherine L. Wheat, Nina Bien and Nienke van Atteveldt -- Brain Responses and Looking Behavior During Audiovisual Speech Integration in Infants Predict Auditory Speech Comprehension in the Second Year of Life / Elena V. Kushnerenko, Przemyslaw Tomalski, Haiko Ballieux, Anita Potton, Deidre Birtles, Caroline Frostick and Derek G. Moore -- Multisensory Integration, Learning, and the Predictive Coding Hypothesis / Nicholas Altieri -- The Interaction Between Stimulus Factors and Cognitive Factors During Multisensory Integration of Audiovisual Speech / Ryan A. Stevenson, Mark T. Wallace and Nicholas Altieri -- Caregiver Influence on Looking Behavior and Brain Responses in Prelinguistic Development / Heather L. 
Ramsdell-Hudock.</subfield></datafield><datafield tag="506" ind1=" " ind2=" "><subfield code="a">Open access</subfield><subfield code="f">Unrestricted online access</subfield><subfield code="2">star</subfield></datafield><datafield tag="650" ind1=" " ind2="0"><subfield code="a">Cognitive science.</subfield></datafield><datafield tag="650" ind1=" " ind2="0"><subfield code="a">Psychology.</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">Social Sciences</subfield><subfield code="2">HILCC</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Models of Integration</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Audiovisual speech and aging</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Integration Efficiency</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Multisensory language development</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Visual prediction</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Audiovisual integration</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">imaging</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Altieri, Nicholas,</subfield><subfield code="e">editor.</subfield></datafield><datafield tag="776" ind1=" " ind2=" "><subfield code="z">2-88919-251-2</subfield></datafield><datafield tag="906" ind1=" " ind2=" "><subfield code="a">BOOK</subfield></datafield><datafield tag="ADM" ind1=" " ind2=" "><subfield code="b">2024-03-29 00:57:02 Europe/Vienna</subfield><subfield code="d">00</subfield><subfield code="f">system</subfield><subfield code="c">marc21</subfield><subfield code="a">2015-11-22 13:16:40 Europe/Vienna</subfield><subfield code="g">false</subfield></datafield><datafield tag="AVE" ind1=" " ind2=" "><subfield code="i">DOAB Directory of Open Access 
Books</subfield><subfield code="P">DOAB Directory of Open Access Books</subfield><subfield code="x">https://eu02.alma.exlibrisgroup.com/view/uresolver/43ACC_OEAW/openurl?u.ignore_date_coverage=true&amp;portfolio_pid=5338458750004498&amp;Force_direct=true</subfield><subfield code="Z">5338458750004498</subfield><subfield code="b">Available</subfield><subfield code="8">5338458750004498</subfield></datafield></record></collection>