Auditory Modulation of Multisensory Representations

Publication: Book/Report/Collected volume/Conference proceedings › Monograph › Research › Peer-reviewed


Details

Original language: English
Number of pages: 28
Publication status: Published - 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11265 LNCS
ISSN (print): 0302-9743
ISSN (electronic): 1611-3349

Abstract

Motor control, motor learning, and interpersonal coordination are based on motor perception and emergent perceptuomotor representations. At least in the early stages, motor learning and interpersonal coordination rely heavily on visual information: observing others and transforming that information into internal representations to guide one's own behavior. As learning progresses and a new motor pattern becomes established through repeated physical practice, other perceptual modalities are integrated as well. In contrast to the vast majority of publications on motor learning and interpersonal coordination, which refer to a single perceptual modality, here we regard the perceptual system as a unitary system that coordinates and unifies the information of all involved perceptual modalities. The main focus of this contribution is the relation between perceptual streams of different modalities, and the intermodal processing and multisensory integration of information as a basis for motor control and learning. Multi-/intermodal processing of perceptual streams results in multimodal representations and opens up new approaches to supporting motor learning and interpersonal coordination: by creating an additional perceptual stream, adequate auditory movement information can be generated that is suitable for integration with information from other modalities, thereby modulating the resulting perceptuomotor representations without requiring attention or higher cognition. Here, the concept of movement-defined real-time acoustics is used to serve the auditory system with an additional movement-auditory stream. Before the computational approach of kinematic real-time sonification is finally described, a special focus is directed to the adaptation modules of the internal models, and the concept is compared with other approaches to providing additional acoustic movement information.
Finally, a perspective on this approach is given across a broad spectrum of new applications: supporting motor control and learning in sports and motor rehabilitation, as well as joint action and interpersonal coordination between humans and in human-robot interaction.
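The abstract's central technique, kinematic real-time sonification, continuously maps movement parameters onto sound parameters. The chapter's concrete mapping is not reproduced in this record; purely as an illustrative sketch (the function name, frequency range, and the speed-to-pitch mapping are assumptions, not the authors' method), a minimal parameter-mapping sonification might look like:

```python
def sonify_kinematics(speeds, f_min=220.0, f_max=880.0, v_max=2.0):
    """Map a stream of movement speeds (m/s) to tone frequencies (Hz).

    Linear parameter mapping: zero speed -> f_min; speeds at or
    above v_max are clamped to f_max.
    """
    freqs = []
    for v in speeds:
        ratio = min(abs(v) / v_max, 1.0)  # normalise and clamp to [0, 1]
        freqs.append(f_min + ratio * (f_max - f_min))
    return freqs

# Example: speed profile of a reaching movement (accelerate, then decelerate)
print(sonify_kinematics([0.0, 0.5, 1.0, 2.0, 1.0, 0.0]))
# -> [220.0, 385.0, 550.0, 880.0, 550.0, 220.0]
```

In a real-time system the resulting frequencies would drive a synthesizer sample by sample, so the listener hears the movement's velocity profile as a pitch contour.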


Cite

Effenberg, A. O., Hwang, T.-H., Ghai, S., & Schmitz, G. (2018). Auditory Modulation of Multisensory Representations. 28 pp. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11265 LNCS). https://doi.org/10.1007/978-3-030-01692-0_20
@book{597ef27a59eb468a800187a73c72ce39,
title = "Auditory Modulation of Multisensory Representations",
abstract = "Motor control, motor learning, and interpersonal coordination are based on motor perception and emergent perceptuomotor representations. At least in the early stages, motor learning and interpersonal coordination rely heavily on visual information: observing others and transforming that information into internal representations to guide one's own behavior. As learning progresses and a new motor pattern becomes established through repeated physical practice, other perceptual modalities are integrated as well. In contrast to the vast majority of publications on motor learning and interpersonal coordination, which refer to a single perceptual modality, here we regard the perceptual system as a unitary system that coordinates and unifies the information of all involved perceptual modalities. The main focus of this contribution is the relation between perceptual streams of different modalities, and the intermodal processing and multisensory integration of information as a basis for motor control and learning. Multi-/intermodal processing of perceptual streams results in multimodal representations and opens up new approaches to supporting motor learning and interpersonal coordination: by creating an additional perceptual stream, adequate auditory movement information can be generated that is suitable for integration with information from other modalities, thereby modulating the resulting perceptuomotor representations without requiring attention or higher cognition. Here, the concept of movement-defined real-time acoustics is used to serve the auditory system with an additional movement-auditory stream. Before the computational approach of kinematic real-time sonification is finally described, a special focus is directed to the adaptation modules of the internal models, and the concept is compared with other approaches to providing additional acoustic movement information. Finally, a perspective on this approach is given across a broad spectrum of new applications: supporting motor control and learning in sports and motor rehabilitation, as well as joint action and interpersonal coordination between humans and in human-robot interaction.",
keywords = "Interpersonal coordination, Motor control, Motor learning, Movement sonification, Multimodal integration, Multimodal perception, Perceptuomotor representation",
author = "A.O. Effenberg and T.-H. Hwang and S. Ghai and G. Schmitz",
note = "Publisher Copyright: {\textcopyright} 2018, Springer Nature Switzerland AG.",
year = "2018",
doi = "10.1007/978-3-030-01692-0_20",
language = "English",
isbn = "9783030016913",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",

}

