
Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • Armin Dietz
  • Andreas Pösch
  • Eduard Reithmeier
  • Citations
    • Citation Indexes: 2
  • Captures
    • Readers: 8

Details

Original language: English
Title of host publication: Medical Imaging 2018
Subtitle of host publication: Imaging Informatics for Healthcare, Research, and Applications
Editors: Po-Hao Chen, Jianguo Zhang
Publisher: SPIE
Number of pages: 6
ISBN (electronic): 9781510616479
Publication status: Published - 6 Mar 2018
Event: Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications - Houston, United States
Duration: 13 Feb 2018 – 15 Feb 2018

Publication series

Name: Progress in Biomedical Optics and Imaging - Proceedings of SPIE
Volume: 10579
ISSN (Print): 1605-7422

Abstract

The number of healthcare-associated infections is increasing worldwide. Hand hygiene has been identified as one of the most crucial measures to prevent bacteria from spreading. However, compliance with recommended procedures for hand hygiene is generally poor, even in modern, industrialized regions. We present an optical assistance system, based on machine learning, for monitoring the hygienic hand disinfection procedure. First, each hand and forearm of a person is detected in a down-sampled 96 px × 96 px depth video stream by pixelwise classification using a fully convolutional network. To gather the required amount of training data, we present a novel approach to automatically labeling recorded data using colored gloves and a color video stream that is registered to the depth stream. The colored gloves are used to segment the depth data in the training phase; during inference, they are not required. The system detects and separates detailed hand parts of interacting, self-occluded hands within the observation zone of the sensor. Based on the location of the segmented hands, a full-resolution region of interest (ROI) is cropped. A second deep neural network classifies the ROI into ten separate process steps (gestures): nine based on the recommended hand disinfection procedure of the World Health Organization, plus an additional error class. The combined system is cross-validated with 21 subjects and predicts which gesture is currently executed with an accuracy of 93.37% (± 2.67%). Feedback is provided at 30 frames per second.
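The abstract describes a two-stage pipeline: a fully convolutional network first labels hand/forearm pixels in a down-sampled 96 px × 96 px depth frame, and the resulting mask is then used to crop a full-resolution ROI for the gesture classifier. A minimal sketch of the ROI-cropping step (the `hand_roi` helper, the 4× scale factor, and the toy frame sizes are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def hand_roi(seg_mask, full_frame, scale):
    """Crop the full-resolution region of interest indicated by a
    low-resolution segmentation mask (nonzero = hand/forearm pixel).
    Hypothetical helper for illustration only."""
    ys, xs = np.nonzero(seg_mask)
    if ys.size == 0:
        return None  # no hands detected in this frame
    # scale the low-res bounding box up to full-frame coordinates
    y0, y1 = ys.min() * scale, (ys.max() + 1) * scale
    x0, x1 = xs.min() * scale, (xs.max() + 1) * scale
    return full_frame[y0:y1, x0:x1]

# toy example: 96x96 mask, full frame 4x larger (384x384)
mask = np.zeros((96, 96), dtype=np.uint8)
mask[40:60, 30:70] = 1  # pretend the FCN labeled these pixels as hand
frame = np.arange(384 * 384).reshape(384, 384)
roi = hand_roi(mask, frame, scale=4)
print(roi.shape)  # (80, 160)
```

The cropped ROI would then be passed to the second network, which assigns one of the ten gesture classes (nine WHO disinfection steps plus an error class).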

Keywords

    Gesture recognition, Hand hygiene, Hand tracking, Machine learning, Segmentation


Cite this

Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks. / Dietz, Armin; Pösch, Andreas; Reithmeier, Eduard.
Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications. ed. / Po-Hao Chen; Jianguo Zhang. SPIE, 2018. 1057914 (Progress in Biomedical Optics and Imaging - Proceedings of SPIE; Vol. 10579).


Dietz, A, Pösch, A & Reithmeier, E 2018, Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks. in P-H Chen & J Zhang (eds), Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications., 1057914, Progress in Biomedical Optics and Imaging - Proceedings of SPIE, vol. 10579, SPIE, Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications, Houston, United States, 13 Feb 2018. https://doi.org/10.1117/12.2294047, https://doi.org/10.15488/3833
Dietz, A., Pösch, A., & Reithmeier, E. (2018). Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks. In P.-H. Chen, & J. Zhang (Eds.), Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications Article 1057914 (Progress in Biomedical Optics and Imaging - Proceedings of SPIE; Vol. 10579). SPIE. https://doi.org/10.1117/12.2294047, https://doi.org/10.15488/3833
Dietz A, Pösch A, Reithmeier E. Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks. In Chen PH, Zhang J, editors, Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications. SPIE. 2018. 1057914. (Progress in Biomedical Optics and Imaging - Proceedings of SPIE). doi: 10.1117/12.2294047, 10.15488/3833
Dietz, Armin ; Pösch, Andreas ; Reithmeier, Eduard. / Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks. Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications. editor / Po-Hao Chen ; Jianguo Zhang. SPIE, 2018. (Progress in Biomedical Optics and Imaging - Proceedings of SPIE).
@inproceedings{fee6b5c3427044a79d1c234c47f8deb0,
title = "Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks",
abstract = "The number of health-care associated infections is increasing worldwide. Hand hygiene has been identified as one of the most crucial measures to prevent bacteria from spreading. However, compliance with recommended procedures for hand hygiene is generally poor, even in modern, industrialized regions. We present an optical assistance system for monitoring the hygienic hand disinfection procedure which is based on machine learning. Firstly, each hand and underarm of a person is detected in a down-sampled 96 px x 96 px depth video stream by pixelwise classification using a fully convolutional network. To gather the required amount of training data, we present a novel approach in automatically labeling recorded data using colored gloves and a color video stream that is registered to the depth stream. The colored gloves are used to segment the depth data in the training phase. During inference, the colored gloves are not required. The system detects and separates detailed hand parts of interacting, self-occluded hands within the observation zone of the sensor. Based on the location of the segmented hands, a full resolution region of interest (ROI) is cropped. A second deep neural network classifies the ROI into ten separate process steps (gestures), with nine of them based on the recommended hand disinfection procedure of the World Health Organization, and an additional error class. The combined system is cross-validated with 21 subjects and predicts with an accuracy of 93.37% (± 2.67%) which gesture is currently executed. The feedback is provided with 30 frames per second.",
keywords = "Gesture recognition, Hand hygiene, Hand tracking, Machine learning, Segmentation",
author = "Armin Dietz and Andreas P{\"o}sch and Eduard Reithmeier",
year = "2018",
month = mar,
day = "6",
doi = "10.1117/12.2294047",
language = "English",
series = "Progress in Biomedical Optics and Imaging - Proceedings of SPIE",
publisher = "SPIE",
editor = "Po-Hao Chen and Jianguo Zhang",
booktitle = "Medical Imaging 2018",
address = "United States",
note = "Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications ; Conference date: 13-02-2018 Through 15-02-2018",

}


TY - GEN

T1 - Hand hygiene monitoring based on segmentation of interacting hands with convolutional networks

AU - Dietz, Armin

AU - Pösch, Andreas

AU - Reithmeier, Eduard

PY - 2018/3/6

Y1 - 2018/3/6

N2 - The number of health-care associated infections is increasing worldwide. Hand hygiene has been identified as one of the most crucial measures to prevent bacteria from spreading. However, compliance with recommended procedures for hand hygiene is generally poor, even in modern, industrialized regions. We present an optical assistance system for monitoring the hygienic hand disinfection procedure which is based on machine learning. Firstly, each hand and underarm of a person is detected in a down-sampled 96 px x 96 px depth video stream by pixelwise classification using a fully convolutional network. To gather the required amount of training data, we present a novel approach in automatically labeling recorded data using colored gloves and a color video stream that is registered to the depth stream. The colored gloves are used to segment the depth data in the training phase. During inference, the colored gloves are not required. The system detects and separates detailed hand parts of interacting, self-occluded hands within the observation zone of the sensor. Based on the location of the segmented hands, a full resolution region of interest (ROI) is cropped. A second deep neural network classifies the ROI into ten separate process steps (gestures), with nine of them based on the recommended hand disinfection procedure of the World Health Organization, and an additional error class. The combined system is cross-validated with 21 subjects and predicts with an accuracy of 93.37% (± 2.67%) which gesture is currently executed. The feedback is provided with 30 frames per second.

AB - The number of health-care associated infections is increasing worldwide. Hand hygiene has been identified as one of the most crucial measures to prevent bacteria from spreading. However, compliance with recommended procedures for hand hygiene is generally poor, even in modern, industrialized regions. We present an optical assistance system for monitoring the hygienic hand disinfection procedure which is based on machine learning. Firstly, each hand and underarm of a person is detected in a down-sampled 96 px x 96 px depth video stream by pixelwise classification using a fully convolutional network. To gather the required amount of training data, we present a novel approach in automatically labeling recorded data using colored gloves and a color video stream that is registered to the depth stream. The colored gloves are used to segment the depth data in the training phase. During inference, the colored gloves are not required. The system detects and separates detailed hand parts of interacting, self-occluded hands within the observation zone of the sensor. Based on the location of the segmented hands, a full resolution region of interest (ROI) is cropped. A second deep neural network classifies the ROI into ten separate process steps (gestures), with nine of them based on the recommended hand disinfection procedure of the World Health Organization, and an additional error class. The combined system is cross-validated with 21 subjects and predicts with an accuracy of 93.37% (± 2.67%) which gesture is currently executed. The feedback is provided with 30 frames per second.

KW - Gesture recognition

KW - Hand hygiene

KW - Hand tracking

KW - Machine learning

KW - Segmentation

UR - http://www.scopus.com/inward/record.url?scp=85047721885&partnerID=8YFLogxK

U2 - 10.1117/12.2294047

DO - 10.1117/12.2294047

M3 - Conference contribution

AN - SCOPUS:85047721885

T3 - Progress in Biomedical Optics and Imaging - Proceedings of SPIE

BT - Medical Imaging 2018

A2 - Chen, Po-Hao

A2 - Zhang, Jianguo

PB - SPIE

T2 - Medical Imaging 2018: Imaging Informatics for Healthcare, Research, and Applications

Y2 - 13 February 2018 through 15 February 2018

ER -