Details
Original language | English |
---|---|
Number of pages | 28 |
Publication status | Published - 2018 |
Publication series
Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|
Volume | 11265 LNCS |
ISSN (Print) | 0302-9743 |
ISSN (electronic) | 1611-3349 |
Abstract
Motor control, motor learning, and interpersonal coordination are based on motor perception and emergent perceptuomotor representations. At least in early stages, motor learning and interpersonal coordination rely heavily on visual information: observing others and transforming that information into internal representations that guide one's own behavior. As learning progresses and a new motor pattern is established through repeated physical practice, other perceptual modalities are added. In contrast to the vast majority of publications on motor learning and interpersonal coordination, which refer to a single perceptual modality, here we regard the perceptual system as a unitary system that coordinates and unifies the information of all involved perceptual modalities. The relation between perceptual streams of different modalities, their intermodal processing, and the multisensory integration of information as a basis for motor control and learning are the main focus of this contribution. Multi-/intermodal processing of perceptual streams results in multimodal representations and opens up new approaches to supporting motor learning and interpersonal coordination: by creating an additional perceptual stream, auditory movement information can be generated that is suitable for integration with information from other modalities, thereby modulating the resulting perceptuomotor representations without requiring attention or higher cognition. Here, the concept of movement-defined real-time acoustics is used to serve the auditory system with an additional movement-auditory stream. Before the computational approach of kinematic real-time sonification is described, special attention is given to the adaptation modules of the internal models. Furthermore, this concept is compared with other approaches to providing additional acoustic movement information. Finally, a perspective is given on a broad spectrum of new applications supporting motor control and learning in sports and motor rehabilitation, as well as joint action and interpersonal coordination between humans and human-robot interaction.
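The abstract names kinematic real-time sonification as the computational approach but does not specify the mapping. The following Python fragment is a minimal, hypothetical sketch of such a mapping; the choice of mapping joint speed to pitch and loudness of a pure tone, the parameter names, and the sampling rates are assumptions for illustration, not the method described in the publication.

```python
# Illustrative sketch only: maps a 1-D kinematic trajectory (e.g. hand speed)
# to pitch and loudness of a synthesized tone. The velocity-to-pitch mapping
# and all parameters are assumptions, not the authors' published model.
import numpy as np
import wave

FS = 44100          # audio sample rate (Hz)
MOCAP_RATE = 100    # assumed motion-capture sampling rate (Hz)

def sonify_velocity(velocity, f_min=220.0, f_max=880.0):
    """Map a velocity profile (m/s) to an audio signal.

    Faster movement -> higher pitch and louder tone (assumed mapping).
    """
    v = np.asarray(velocity, dtype=float)
    v_norm = (v - v.min()) / (np.ptp(v) + 1e-12)           # normalize to 0..1
    # Upsample the control signals from mocap rate to audio rate.
    t_ctrl = np.arange(len(v)) / MOCAP_RATE
    t_audio = np.arange(int(t_ctrl[-1] * FS)) / FS
    freq = np.interp(t_audio, t_ctrl, f_min + v_norm * (f_max - f_min))
    amp = np.interp(t_audio, t_ctrl, 0.2 + 0.8 * v_norm)
    phase = 2 * np.pi * np.cumsum(freq) / FS                # integrate instantaneous frequency
    return amp * np.sin(phase)

if __name__ == "__main__":
    # Synthetic example: a smooth reach-and-return movement lasting 2 s.
    t = np.linspace(0, 2, 2 * MOCAP_RATE)
    velocity = np.abs(np.sin(np.pi * t))                    # bell-shaped speed profile
    signal = sonify_velocity(velocity)
    pcm = (signal * 32767).astype(np.int16)
    with wave.open("sonification_demo.wav", "wb") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(FS)
        f.writeframes(pcm.tobytes())
```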
ASJC Scopus subject areas
- Mathematics (all)
- Theoretical Computer Science
- Computer Science (all)
Cite
Effenberg, A. O., Hwang, T.-H., Ghai, S., & Schmitz, G. (2018). Auditory Modulation of Multisensory Representations. 28 p. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11265 LNCS).
Publication: Book/Report/Anthology/Conference proceedings › Monograph › Research › Peer-reviewed
TY - BOOK
T1 - Auditory Modulation of Multisensory Representations
AU - Effenberg, A.O.
AU - Hwang, T.-H.
AU - Ghai, S.
AU - Schmitz, G.
N1 - Publisher Copyright: © 2018, Springer Nature Switzerland AG.
PY - 2018
Y1 - 2018
AB - Motor control, motor learning, and interpersonal coordination are based on motor perception and emergent perceptuomotor representations. At least in early stages, motor learning and interpersonal coordination rely heavily on visual information: observing others and transforming that information into internal representations that guide one's own behavior. As learning progresses and a new motor pattern is established through repeated physical practice, other perceptual modalities are added. In contrast to the vast majority of publications on motor learning and interpersonal coordination, which refer to a single perceptual modality, here we regard the perceptual system as a unitary system that coordinates and unifies the information of all involved perceptual modalities. The relation between perceptual streams of different modalities, their intermodal processing, and the multisensory integration of information as a basis for motor control and learning are the main focus of this contribution. Multi-/intermodal processing of perceptual streams results in multimodal representations and opens up new approaches to supporting motor learning and interpersonal coordination: by creating an additional perceptual stream, auditory movement information can be generated that is suitable for integration with information from other modalities, thereby modulating the resulting perceptuomotor representations without requiring attention or higher cognition. Here, the concept of movement-defined real-time acoustics is used to serve the auditory system with an additional movement-auditory stream. Before the computational approach of kinematic real-time sonification is described, special attention is given to the adaptation modules of the internal models. Furthermore, this concept is compared with other approaches to providing additional acoustic movement information. Finally, a perspective is given on a broad spectrum of new applications supporting motor control and learning in sports and motor rehabilitation, as well as joint action and interpersonal coordination between humans and human-robot interaction.
KW - Interpersonal coordination
KW - Motor control
KW - Motor learning
KW - Movement sonification
KW - Multimodal integration
KW - Multimodal perception
KW - Perceptuomotor representation
UR - http://www.scopus.com/inward/record.url?scp=85057351159&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-01692-0_20
DO - 10.1007/978-3-030-01692-0_20
M3 - Monograph
SN - 9783030016913
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
BT - Auditory Modulation of Multisensory Representations
ER -