Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Eike Jakubowitz
  • Thekla Feist
  • Alina Obermeier
  • Carina Gempfer
  • Christof Hurschler
  • Henning Windhagen
  • Max Heinrich Laves

Research Organisations

External Research Organisations

  • Hannover Medical School (MHH)
  • Hamburg University of Technology (TUHH)

Details

Original language: English
Article number: 5728
Number of pages: 17
Journal: Applied Sciences (Switzerland)
Volume: 13
Issue number: 9
Publication status: Published - 6 May 2023

Abstract

Human grasping is a relatively fast process and control signals for upper limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can already be pre-trained by base networks for fine-tuning with data of a target person. A support vector machine algorithm with spatial covariance matrices as EEG signal descriptors based on Riemannian geometry showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper limb prosthetics, it is possible to use them in a sufficiently timely manner and to predict the respective grasping task as a discrete category to kinematically prepare the prosthetic hand.
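
For readers who want a concrete sense of the decoding pipeline the abstract outlines (spatial covariance matrices as EEG descriptors, Riemannian geometry, a support vector machine, balanced accuracy over five grasp categories), the sketch below shows one common way such a pipeline is assembled with pyriemann and scikit-learn. It is an illustration under stated assumptions only: the library choice, the tangent-space projection, the channel and sample counts, and the synthetic data are not taken from the authors' implementation.

# Illustrative sketch: covariance + Riemannian tangent space + SVM classification
# of EEG epochs, evaluated with balanced accuracy. All data here are synthetic
# stand-ins; channel count, sampling rate, and hyperparameters are assumptions.
import numpy as np
from pyriemann.estimation import Covariances
from pyriemann.tangentspace import TangentSpace
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# EEG epochs shaped (n_trials, n_channels, n_samples). A 1.5 s window
# (1.0 s before to 0.5 s after movement onset) at an assumed 250 Hz
# sampling rate gives roughly 375 samples per epoch.
n_trials, n_channels, n_samples = 200, 32, 375
X = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 5, size=n_trials)  # five grasp categories (Cutkosky taxonomy)

# Covariances: one SPD spatial covariance matrix per epoch (the EEG descriptor).
# TangentSpace: maps SPD matrices into a Euclidean tangent space using the
# Riemannian metric, so a standard SVM can operate on the resulting vectors.
clf = make_pipeline(
    Covariances(estimator="oas"),
    TangentSpace(metric="riemann"),
    SVC(kernel="linear", C=1.0),
)

# Balanced accuracy is the metric reported in the abstract.
scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
print(f"balanced accuracy: {scores.mean():.2f} ± {scores.std():.2f}")

On random data this scores near chance (about 0.2 for five classes); the point of the sketch is only to make the covariance-descriptor and tangent-space steps concrete, not to reproduce the reported 0.91 ± 0.05 result.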

Keywords

    activities of daily living, brain–computer interface, electroencephalography, movement decoding, prosthetic control

Cite this

Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study. / Jakubowitz, Eike; Feist, Thekla; Obermeier, Alina et al.
In: Applied Sciences (Switzerland), Vol. 13, No. 9, 5728, 06.05.2023.

Jakubowitz, E., Feist, T., Obermeier, A., Gempfer, C., Hurschler, C., Windhagen, H., & Laves, M. H. (2023). Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study. Applied Sciences (Switzerland), 13(9), Article 5728. https://doi.org/10.3390/app13095728, https://doi.org/10.15488/14099
Jakubowitz E, Feist T, Obermeier A, Gempfer C, Hurschler C, Windhagen H et al. Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study. Applied Sciences (Switzerland). 2023 May 6;13(9):5728. doi: 10.3390/app13095728, 10.15488/14099
Jakubowitz, Eike; Feist, Thekla; Obermeier, Alina et al. / Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study. In: Applied Sciences (Switzerland). 2023; Vol. 13, No. 9.
BibTeX
@article{25fbffa0fa294c9288d1d243b2d79f20,
title = "Early Predictability of Grasping Movements by Neurofunctional Representations: A Feasibility Study",
abstract = "Human grasping is a relatively fast process and control signals for upper limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can already be pre-trained by base networks for fine-tuning with data of a target person. A support vector machine algorithm with spatial covariance matrices as EEG signal descriptors based on Riemannian geometry showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper limb prosthetics, it is possible to use them in a sufficiently timely manner and to predict the respective grasping task as a discrete category to kinematically prepare the prosthetic hand.",
keywords = "activities of daily living, brain–computer interface, electroencephalography, movement decoding, prosthetic control",
author = "Eike Jakubowitz and Thekla Feist and Alina Obermeier and Carina Gempfer and Christof Hurschler and Henning Windhagen and Laves, {Max Heinrich}",
note = "Funding Information: This study received funding from the European Union{\textquoteright}s Horizon 2020 Research and Innovation Programme under Grant Agreement No. 688857, called “SoftPro”. ",
year = "2023",
month = may,
day = "6",
doi = "10.3390/app13095728",
language = "English",
volume = "13",
journal = "Applied Sciences (Switzerland)",
issn = "2076-3417",
publisher = "Multidisciplinary Digital Publishing Institute",
number = "9",

}

RIS

TY - JOUR

T1 - Early Predictability of Grasping Movements by Neurofunctional Representations

T2 - A Feasibility Study

AU - Jakubowitz, Eike

AU - Feist, Thekla

AU - Obermeier, Alina

AU - Gempfer, Carina

AU - Hurschler, Christof

AU - Windhagen, Henning

AU - Laves, Max Heinrich

N1 - Funding Information: This study received funding from the European Union’s Horizon 2020 Research and Innovation Programme under Grant Agreement No. 688857, called “SoftPro”.

PY - 2023/5/6

Y1 - 2023/5/6

N2 - Human grasping is a relatively fast process and control signals for upper limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can already be pre-trained by base networks for fine-tuning with data of a target person. A support vector machine algorithm with spatial covariance matrices as EEG signal descriptors based on Riemannian geometry showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper limb prosthetics, it is possible to use them in a sufficiently timely manner and to predict the respective grasping task as a discrete category to kinematically prepare the prosthetic hand.

AB - Human grasping is a relatively fast process and control signals for upper limb prosthetics cannot be generated and processed in a sufficiently timely manner. The aim of this study was to examine whether discriminating between different grasping movements at a cortical level can provide information prior to the actual grasping process, allowing for more intuitive prosthetic control. EEG datasets were captured from 13 healthy subjects who repeatedly performed 16 activities of daily living. Common classifiers were trained on features extracted from the waking-state frequency and total-frequency time domains. Different training scenarios were used to investigate whether classifiers can already be pre-trained by base networks for fine-tuning with data of a target person. A support vector machine algorithm with spatial covariance matrices as EEG signal descriptors based on Riemannian geometry showed the highest balanced accuracy (0.91 ± 0.05 SD) in discriminating five grasping categories according to the Cutkosky taxonomy in an interval from 1.0 s before to 0.5 s after the initial movement. Fine-tuning did not improve any classifier. No significant accuracy differences between the two frequency domains were apparent (p > 0.07). Neurofunctional representations enabled highly accurate discrimination of five different grasping movements. Our results indicate that, for upper limb prosthetics, it is possible to use them in a sufficiently timely manner and to predict the respective grasping task as a discrete category to kinematically prepare the prosthetic hand.

KW - activities of daily living

KW - brain–computer interface

KW - electroencephalography

KW - movement decoding

KW - prosthetic control

UR - http://www.scopus.com/inward/record.url?scp=85159325216&partnerID=8YFLogxK

U2 - 10.3390/app13095728

DO - 10.3390/app13095728

M3 - Article

AN - SCOPUS:85159325216

VL - 13

JO - Applied Sciences (Switzerland)

JF - Applied Sciences (Switzerland)

SN - 2076-3417

IS - 9

M1 - 5728

ER -