Multisensory and sensorimotor interactions in speech perception


Book Description

Speech is multisensory, since it is perceived through several senses. Audition is the most important, as speech is mostly heard. The role of vision has long been acknowledged, since many articulatory gestures can be seen on the talker's face. Sometimes speech can even be felt by touching the talker's face. The best-known multisensory illusion is the McGurk effect, in which incongruent visual articulation changes the auditory percept. Interest in the McGurk effect arises from a major general question in multisensory research: how is information from different senses combined? Despite decades of research, a conclusive explanation for the illusion remains elusive, which illustrates the challenges of studying multisensory integration. Speech is special in many ways. It is the main means of human communication and a manifestation of a unique language system. It is a signal with which all humans have extensive experience: we are exposed to it from birth and learn it through development in face-to-face contact with others. It is also a signal that we can both perceive and produce. The role of the motor system in speech perception has long been debated, and despite very active current research, it remains unclear to what extent, and in what role, the motor system is involved in speech perception. Recent evidence shows that brain areas involved in speech production are activated when listening to speech and when watching a talker's articulatory gestures. Speaking involves coordinating articulatory movements and monitoring their auditory and somatosensory consequences. How do auditory, visual, somatosensory, and motor brain areas interact during speech perception? How do these sensorimotor interactions contribute to speech perception? It is striking that, despite a vast amount of research, the secrets of speech perception have not yet been solved; multisensory and sensorimotor approaches provide new opportunities for solving them. Contributions to this research topic are encouraged across a wide spectrum of research on speech perception in multisensory and sensorimotor contexts, including novel experimental findings ranging from psychophysics to brain imaging, as well as theories and models, reviews, and opinions.




The Neural Bases of Multisensory Processes


Book Description

It has become accepted in the neuroscience community that perception and performance are quintessentially multisensory by nature. Using the full palette of modern brain imaging and neuroscience methods, The Neural Bases of Multisensory Processes details current understanding of the neural bases of these phenomena as studied across species, stages of development, and clinical statuses. Organized thematically into nine sub-sections, the book is a collection of contributions by leading scientists in the field. Chapters build generally from basic to applied, allowing readers to ascertain how fundamental science informs the clinical and applied sciences. Topics discussed include:

- Anatomy, essential for understanding the neural substrates of multisensory processing
- Neurophysiological bases and how multisensory stimuli can dramatically change the encoding processes for sensory information
- Combinatorial principles and modeling, focusing on efforts to gain a better mechanistic handle on multisensory operations and their network dynamics
- Development and plasticity
- Clinical manifestations and how perception and action are affected by altered sensory experience
- Attention and spatial representations

The last sections of the book focus on naturalistic multisensory processes in three separate contexts: motion signals, multisensory contributions to the perception and generation of communication signals, and how the perception of flavor is generated. The text provides a solid introduction for newcomers and a strong overview of the current state of the field for experts.




Multisensory Development


Book Description

We perceive and understand our environment using many sensory systems: vision, touch, hearing, taste, smell, and proprioception. These multiple sensory modalities give us complementary sources of information about the environment. This book explores how we develop the ability to integrate our senses.




Multisensory Processes


Book Description

Auditory behavior, perception, and cognition are all shaped by information from other sensory systems. This volume examines this multisensory view of auditory function at levels of analysis ranging from the single neuron to neuroimaging in human clinical populations.

Contents:
- Visual Influence on Auditory Perception (Adrian K.C. Lee and Mark T. Wallace)
- Cue Combination within a Bayesian Framework (David Alais and David Burr)
- Toward a Model of Auditory-Visual Speech Intelligibility (Ken W. Grant and Joshua G. W. Bernstein)
- An Object-based Interpretation of Audiovisual Processing (Adrian K.C. Lee, Ross K. Maddox, and Jennifer K. Bizley)
- Hearing in a "Moving" Visual World: Coordinate Transformations Along the Auditory Pathway (Shawn M. Willett, Jennifer M. Groh, and Ross K. Maddox)
- Multisensory Processing in the Auditory Cortex (Andrew J. King, Amy Hammond-Kenny, and Fernando R. Nodal)
- Audiovisual Integration in the Primate Prefrontal Cortex (Bethany Plakke and Lizabeth M. Romanski)
- Using Multisensory Integration to Understand Human Auditory Cortex (Michael S. Beauchamp)
- Combining Voice and Face Content in the Primate Temporal Lobe (Catherine Perrodin and Christopher I. Petkov)
- Neural Network Dynamics and Audiovisual Integration (Julian Keil and Daniel Senkowski)
- Cross-Modal Learning in the Auditory System (Patrick Bruns and Brigitte Röder)
- Multisensory Processing Differences in Individuals with Autism Spectrum Disorder (Sarah H. Baum Miller and Mark T. Wallace)

Adrian K.C. Lee is Associate Professor in the Department of Speech & Hearing Sciences and the Institute for Learning and Brain Sciences at the University of Washington, Seattle. Mark T. Wallace is the Louise B. McGavock Endowed Chair and Professor in the Departments of Hearing and Speech Sciences, Psychiatry, and Psychology, and Director of the Vanderbilt Brain Institute at Vanderbilt University, Nashville. Allison B. Coffin is Associate Professor in the Department of Integrative Physiology and Neuroscience at Washington State University, Vancouver, WA. Arthur N. Popper is Professor Emeritus and Research Professor in the Department of Biology at the University of Maryland, College Park. Richard R. Fay is Distinguished Research Professor of Psychology at Loyola University, Chicago.




Audiovisual Speech Processing


Book Description

This book presents a complete overview of all aspects of audiovisual speech, including perception, production, brain processing, and technology.




Neural Organization


Book Description

In Neural Organization, Arbib, Erdi, and Szentagothai integrate structural, functional, and dynamical approaches to the interaction of brain models and neurobiological experiments. Both structure-based "bottom-up" and function-based "top-down" models offer coherent concepts by which to evaluate the experimental data. The goal of this book is to point out the advantages of a multidisciplinary, multistrategied approach to the brain.

Part I of Neural Organization provides a detailed introduction to each of the three areas of structure, function, and dynamics. Structure refers to the anatomical aspects of the brain and the relations between different brain regions. Function refers to skills and behaviors, which are explained by means of functional schemas and biologically based neural networks. Dynamics refers to the use of a mathematical framework to analyze the temporal change of neural activities and synaptic connectivities that underlie brain development and plasticity, in terms of both detailed single-cell models and large-scale network models.

In Part II, the authors show how their systematic approach can be used to analyze specific parts of the nervous system (the olfactory system, hippocampus, thalamus, cerebral cortex, cerebellum, and basal ganglia), as well as to integrate data from the study of brain regions, functional models, and the dynamics of neural networks. In conclusion, they offer a plan for the use of their methods in the development of cognitive neuroscience.




Multisensory Imagery


Book Description

Is a pear sweeter than a peach? Which of Mona Lisa’s hands is crossed over the other? What would the Moonlight Sonata sound like played by a brass band? Although these are questions that appeal to mental imagery in a variety of sensory modalities, mental imagery research has been dominated by visual imagery. With the emergence of a well-established multisensory research community, however, it is time to look at mental imagery in a wider sensory context. Part I of this book provides overviews of unisensory imagery in each sensory modality, including motor imagery, together with discussions of multisensory and cross-modal interactions, synesthesia, imagery in the blind and following brain damage, and methodological considerations. Part II reviews the application of mental imagery research in a range of settings, including individual differences, skilled performance such as sports and surgical training, psychopathology and therapy, and stroke rehabilitation. This combination of comprehensive coverage of the senses with reviews from both theoretical and applied perspectives not only complements the growing multisensory literature but also responds to recent calls for translational research in the multisensory field.




The Handbook of Multisensory Processes


Book Description

Research increasingly suggests that, rather than our senses operating independently, perception is fundamentally a multisensory experience. This handbook reviews the evidence and explores the theory that broad underlying principles govern sensory interactions, regardless of the specific senses involved.




Intersensory Facilitation


Book Description

Simple reaction time to stimuli from different sensory modalities presented simultaneously is typically shorter than reaction time to a single stimulus. In this study, auditory, visual, and tactile stimuli were presented in different combinations and at varying stimulus onset asynchronies. Two different types of models for the observed reaction time facilitation effects are developed and tested. Separate-activation (race) models assume that stimulus information in different sensory channels is processed in parallel and independently, whereas coactivation models allow interactions across channels. Using Boole's inequality as a test of separate-activation models, it was shown that these models cannot predict as much facilitation as was observed. A superposition model and a diffusion model of coactivation provided promising quantitative approximations to the data.
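To make the Boole's-inequality test concrete, the short sketch below compares an empirical redundant-signal reaction time distribution against the bound implied by separate-activation models, F_AV(t) <= F_A(t) + F_V(t). It is only an illustration of the logic summarized above, not the authors' analysis: the data are simulated and the function names (ecdf, race_model_violation) are hypothetical.

import numpy as np

def ecdf(rts, t_grid):
    """Empirical cumulative distribution of reaction times, evaluated on t_grid."""
    rts = np.sort(np.asarray(rts))
    return np.searchsorted(rts, t_grid, side="right") / rts.size

def race_model_violation(rt_a, rt_v, rt_av, t_grid):
    """Boole's-inequality test: compare F_AV(t) with F_A(t) + F_V(t).

    Under separate-activation (race) models the redundant-signal CDF can
    never exceed the sum of the single-signal CDFs, so positive values
    returned here indicate more facilitation than any race model allows,
    which is the pattern taken as evidence for coactivation.
    """
    f_a = ecdf(rt_a, t_grid)
    f_v = ecdf(rt_v, t_grid)
    f_av = ecdf(rt_av, t_grid)
    return f_av - np.minimum(f_a + f_v, 1.0)

# Simulated reaction times in milliseconds (hypothetical data, not from the study)
rng = np.random.default_rng(0)
rt_a = rng.normal(320, 40, size=500)    # auditory-only trials
rt_v = rng.normal(350, 45, size=500)    # visual-only trials
rt_av = rng.normal(260, 30, size=500)   # redundant audiovisual trials
t_grid = np.linspace(150, 500, 50)

violations = race_model_violation(rt_a, rt_v, rt_av, t_grid)
print("largest violation of Boole's inequality:", violations.max().round(3))
# A clearly positive maximum suggests a race model cannot account for the data.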




Sensory Linguistics


Book Description

One of the most fundamental capacities of language is the ability to express what speakers see, hear, feel, taste, and smell. Sensory Linguistics is the interdisciplinary study of how language relates to the senses. This book deals with such foundational questions as: Which semiotic strategies do speakers use to express sensory perceptions? Which perceptions are easier to encode and which are “ineffable”? And what are appropriate methods for studying the sensory aspects of linguistics? After a broad overview of the field, a detailed quantitative corpus-based study of English sensory adjectives and their metaphorical uses is presented. This analysis calls age-old ideas into question, such as the idea that the use of perceptual metaphors is governed by a cognitively motivated “hierarchy of the senses”. Besides making theoretical contributions to cognitive linguistics, this research monograph showcases new empirical approaches to studying lexical semantics with contemporary statistical methods.