Details
| Field | Value |
| --- | --- |
| Original language | English |
| Pages (from-to) | 1961-1968 |
| Number of pages | 8 |
| Journal | International journal of computer assisted radiology and surgery |
| Volume | 18 |
| Issue number | 11 |
| Early online date | 2 Aug 2023 |
| Publication status | Published - Nov 2023 |
Abstract
Purpose: A basic task of a robotic scrub nurse is surgical instrument detection. Deep learning techniques could potentially address this task; nevertheless, their performance is subject to some degree of error, which could render them unsuitable for real-world applications. In this work, we aim to demonstrate how the combination of a trained instrument detector with an instance-based voting scheme that considers several frames and viewpoints is enough to guarantee a strong improvement in the instrument detection task. Methods: We exploit the typical setup of a robotic scrub nurse to collect RGB data and point clouds from different viewpoints. Using trained Mask R-CNN models, we obtain predictions from each view. We propose a multi-view voting scheme based on predicted instances that combines the gathered data and predictions to produce a reliable map of the location of the instruments in the scene. Results: Our approach reduces the number of errors by more than 82% compared with the single-view case. On average, the data from five viewpoints are sufficient to infer the correct instrument arrangement with our best model. Conclusion: Our approach can drastically improve an instrument detector’s performance. Our method is practical and can be applied during an actual medical procedure without negatively affecting the surgical workflow. Our implementation and data are made available for the scientific community (https://github.com/Jorebs/Multi-view-Voting-Scheme).
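To illustrate the core idea of the abstract's instance-based voting scheme, the sketch below shows a minimal, hypothetical instance-level majority vote in Python. It assumes each per-view Mask R-CNN detection has already been projected into a common table-plane frame (e.g. using the point cloud and known camera poses) so that detections of the same physical instrument can be grouped by position; the grid-cell clustering and all names are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch of an instance-level multi-view vote.
# Assumption: each detection has already been projected into a common
# table-plane coordinate frame (e.g. via the point cloud and known camera
# poses), so detections of the same instrument can be grouped by position.
from collections import Counter, defaultdict


def cluster_key(xy, cell_size=0.05):
    """Quantize a table-plane position (metres) into a grid cell that acts
    as a crude instance identifier across viewpoints."""
    return (round(xy[0] / cell_size), round(xy[1] / cell_size))


def multi_view_vote(detections_per_view):
    """detections_per_view: list over viewpoints; each viewpoint is a list
    of (xy_position, predicted_class) pairs. Returns {grid_cell: class}
    decided by simple majority over all viewpoints."""
    votes = defaultdict(Counter)
    for view in detections_per_view:
        for xy, cls in view:
            votes[cluster_key(xy)][cls] += 1  # one vote per view and instance
    return {cell: counts.most_common(1)[0][0] for cell, counts in votes.items()}


# Toy example: three viewpoints see two instruments; one view misclassifies
# the scalpel as scissors, but the majority vote recovers the correct label.
views = [
    [((0.10, 0.20), "scalpel"), ((0.40, 0.20), "forceps")],
    [((0.11, 0.21), "scissors"), ((0.41, 0.19), "forceps")],
    [((0.09, 0.20), "scalpel"), ((0.40, 0.21), "forceps")],
]
print(multi_view_vote(views))  # e.g. {(2, 4): 'scalpel', (8, 4): 'forceps'}
```

In the paper's setting the association step is richer (instrument positions come from the registered point clouds rather than a fixed grid), but the aggregation principle is the same: a label that is wrong in one view is outvoted by the other viewpoints.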
Keywords
- Mask R-CNN
- Multi-viewpoint inference
- Robot-assisted surgery
- Robotic scrub nurse
- Surgical instrument detection
ASJC Scopus subject areas
- Medicine(all): Surgery; Radiology, Nuclear Medicine and Imaging; Health Informatics
- Engineering(all): Biomedical Engineering
- Computer Science(all): Computer Vision and Pattern Recognition; Computer Science Applications; Computer Graphics and Computer-Aided Design
Cite this
Badilla-Solórzano, Jorge; Ihler, Sontje; Gellrich, Nils Claudius; Spalthoff, Simon. Improving instrument detection for a robotic scrub nurse using multi-view voting. In: International journal of computer assisted radiology and surgery, Vol. 18, No. 11, 11.2023, p. 1961-1968.
Research output: Contribution to journal › Article › Research › peer review
TY - JOUR
T1 - Improving instrument detection for a robotic scrub nurse using multi-view voting
AU - Badilla-Solórzano, Jorge
AU - Ihler, Sontje
AU - Gellrich, Nils Claudius
AU - Spalthoff, Simon
N1 - Funding Information: The main author offers his gratitude to the University of Costa Rica for providing the financial support that enabled the completion of the research presented here.
PY - 2023/11
Y1 - 2023/11
N2 - Purpose: A basic task of a robotic scrub nurse is surgical instrument detection. Deep learning techniques could potentially address this task; nevertheless, their performance is subject to some degree of error, which could render them unsuitable for real-world applications. In this work, we aim to demonstrate how the combination of a trained instrument detector with an instance-based voting scheme that considers several frames and viewpoints is enough to guarantee a strong improvement in the instrument detection task. Methods: We exploit the typical setup of a robotic scrub nurse to collect RGB data and point clouds from different viewpoints. Using trained Mask R-CNN models, we obtain predictions from each view. We propose a multi-view voting scheme based on predicted instances that combines the gathered data and predictions to produce a reliable map of the location of the instruments in the scene. Results: Our approach reduces the number of errors by more than 82% compared with the single-view case. On average, the data from five viewpoints are sufficient to infer the correct instrument arrangement with our best model. Conclusion: Our approach can drastically improve an instrument detector’s performance. Our method is practical and can be applied during an actual medical procedure without negatively affecting the surgical workflow. Our implementation and data are made available for the scientific community (https://github.com/Jorebs/Multi-view-Voting-Scheme).
AB - Purpose: A basic task of a robotic scrub nurse is surgical instrument detection. Deep learning techniques could potentially address this task; nevertheless, their performance is subject to some degree of error, which could render them unsuitable for real-world applications. In this work, we aim to demonstrate how the combination of a trained instrument detector with an instance-based voting scheme that considers several frames and viewpoints is enough to guarantee a strong improvement in the instrument detection task. Methods: We exploit the typical setup of a robotic scrub nurse to collect RGB data and point clouds from different viewpoints. Using trained Mask R-CNN models, we obtain predictions from each view. We propose a multi-view voting scheme based on predicted instances that combines the gathered data and predictions to produce a reliable map of the location of the instruments in the scene. Results: Our approach reduces the number of errors by more than 82% compared with the single-view case. On average, the data from five viewpoints are sufficient to infer the correct instrument arrangement with our best model. Conclusion: Our approach can drastically improve an instrument detector’s performance. Our method is practical and can be applied during an actual medical procedure without negatively affecting the surgical workflow. Our implementation and data are made available for the scientific community (https://github.com/Jorebs/Multi-view-Voting-Scheme).
KW - Mask R-CNN
KW - Multi-viewpoint inference
KW - Robot-assisted surgery
KW - Robotic scrub nurse
KW - Surgical instrument detection
UR - http://www.scopus.com/inward/record.url?scp=85166510779&partnerID=8YFLogxK
U2 - 10.1007/s11548-023-03002-0
DO - 10.1007/s11548-023-03002-0
M3 - Article
C2 - 37530904
AN - SCOPUS:85166510779
VL - 18
SP - 1961
EP - 1968
JO - International journal of computer assisted radiology and surgery
JF - International journal of computer assisted radiology and surgery
SN - 1861-6410
IS - 11
ER -