Details
Original language | English |
---|---|
Title of host publication | Proceedings of the 2005 IEEE International Conference on Robotics and Automation |
Pages | 2167-2172 |
Number of pages | 6 |
Publication status | Published - 2005 |
Externally published | Yes |
Event | 2005 IEEE International Conference on Robotics and Automation, Barcelona, Spain. Duration: 18 Apr 2005 → 22 Apr 2005 |
Publication series
Name | Proceedings - IEEE International Conference on Robotics and Automation |
---|---|
Volume | 2005 |
ISSN (Print) | 1050-4729 |
Abstract
Minimally invasive surgery in combination with ultrasound (US) imaging places high demands on the surgeon's hand-eye coordination. A possible way to reduce these demands is minimally invasive robotic surgery, in which the instrument is guided by visual servoing towards a goal defined by the surgeon in the US image. This approach requires robust tracking of the instrument in the US image sequences, which is known to be difficult because of poor image quality. This paper presents computer vision algorithms and results of visual servoing experiments. Adaptive thresholding according to Otsu's method copes with large intensity variations of the instrument echo. Subsequently applied morphological operations suppress noise and echo artefacts. A fast labelling algorithm based on run-length coding allows real-time labelling of the regions. A heuristic exploiting region size and region velocity helps to resolve ambiguities. The overall computation time is less than 10 ms per frame on a standard PC. The tracking algorithm requires no information about texture or shape, which are known to be very unreliable in US image sequences. Experimental results for different instrument materials (polyvinyl chloride, polyurethane, nylon, and plexiglas) are given, illustrating the performance of the proposed approach: when the appropriate material is chosen, the reconstructed trajectories are smooth and only a few outliers occur. As a consequence, the visual servoing loop proved to be robust and stable.
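The pipeline described in the abstract (Otsu thresholding, morphological filtering, region labelling, and a size/velocity heuristic) can be sketched roughly as follows. This is a minimal illustration in Python/OpenCV, not the authors' implementation: the function name `track_instrument`, the parameters `min_area` and `max_jump`, and the use of OpenCV's connected-component labelling in place of the paper's run-length-coding labeller are all assumptions made for the example.

```python
import cv2
import numpy as np

def track_instrument(frame_gray, prev_centroid=None, min_area=50, max_jump=40.0):
    """One tracking step on an 8-bit grayscale US frame (hypothetical parameters).

    Sketch of the pipeline from the abstract: Otsu thresholding, morphological
    noise suppression, connected-component labelling, and a size/velocity
    heuristic to pick the instrument region among the candidates.
    """
    # Global threshold chosen per frame by Otsu's method: adapts to
    # intensity variations of the instrument echo.
    _, binary = cv2.threshold(frame_gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Morphological opening then closing to suppress speckle noise and
    # small echo artefacts.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    cleaned = cv2.morphologyEx(cleaned, cv2.MORPH_CLOSE, kernel)

    # Connected-component labelling; OpenCV's implementation stands in for
    # the run-length-coding labeller described in the paper.
    n_labels, _, stats, centroids = cv2.connectedComponentsWithStats(cleaned)

    # Heuristic: among sufficiently large regions, prefer the one closest to
    # the previous instrument position (i.e. the smallest apparent velocity).
    best, best_cost = None, np.inf
    for label in range(1, n_labels):           # label 0 is the background
        area = stats[label, cv2.CC_STAT_AREA]
        if area < min_area:
            continue
        c = centroids[label]
        if prev_centroid is None:
            cost = -area                        # no history: take the largest region
        else:
            jump = np.linalg.norm(c - prev_centroid)
            if jump > max_jump:                 # implausible displacement: skip
                continue
            cost = jump
        if cost < best_cost:
            best, best_cost = c, cost
    return best                                 # (x, y) centroid or None
```

In a servoing loop, the returned centroid would be fed back as `prev_centroid` for the next frame and would serve as the image-space feature driving the controller.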
Keywords
- Minimally invasive surgery
- Ultrasound tracking
- Visual servoing
ASJC Scopus subject areas
- Computer Science(all)
- Software
- Engineering(all)
- Control and Systems Engineering
- Computer Science(all)
- Artificial Intelligence
- Engineering(all)
- Electrical and Electronic Engineering
Cite this
Ortmaier, T, Vitrani, MA, Morel, G & Pinault, S 2005, Robust real-time instrument tracking in ultrasound images for visual servoing. in Proceedings of the 2005 IEEE International Conference on Robotics and Automation. pp. 2167-2172, 1570434 (Proceedings - IEEE International Conference on Robotics and Automation; Vol. 2005).
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer-review
TY - GEN
T1 - Robust Real-Time Instrument Tracking in Ultrasound Images for Visual Servoing
AU - Ortmaier, T.
AU - Vitrani, M. A.
AU - Morel, G.
AU - Pinault, S.
PY - 2005
Y1 - 2005
N2 - Minimally invasive surgery in combination with ultrasound (US) imaging places high demands on the surgeon's hand-eye coordination. A possible way to reduce these demands is minimally invasive robotic surgery, in which the instrument is guided by visual servoing towards a goal defined by the surgeon in the US image. This approach requires robust tracking of the instrument in the US image sequences, which is known to be difficult because of poor image quality. This paper presents computer vision algorithms and results of visual servoing experiments. Adaptive thresholding according to Otsu's method copes with large intensity variations of the instrument echo. Subsequently applied morphological operations suppress noise and echo artefacts. A fast labelling algorithm based on run-length coding allows real-time labelling of the regions. A heuristic exploiting region size and region velocity helps to resolve ambiguities. The overall computation time is less than 10 ms per frame on a standard PC. The tracking algorithm requires no information about texture or shape, which are known to be very unreliable in US image sequences. Experimental results for different instrument materials (polyvinyl chloride, polyurethane, nylon, and plexiglas) are given, illustrating the performance of the proposed approach: when the appropriate material is chosen, the reconstructed trajectories are smooth and only a few outliers occur. As a consequence, the visual servoing loop proved to be robust and stable.
AB - Minimally invasive surgery in combination with ultrasound (US) imaging places high demands on the surgeon's hand-eye coordination. A possible way to reduce these demands is minimally invasive robotic surgery, in which the instrument is guided by visual servoing towards a goal defined by the surgeon in the US image. This approach requires robust tracking of the instrument in the US image sequences, which is known to be difficult because of poor image quality. This paper presents computer vision algorithms and results of visual servoing experiments. Adaptive thresholding according to Otsu's method copes with large intensity variations of the instrument echo. Subsequently applied morphological operations suppress noise and echo artefacts. A fast labelling algorithm based on run-length coding allows real-time labelling of the regions. A heuristic exploiting region size and region velocity helps to resolve ambiguities. The overall computation time is less than 10 ms per frame on a standard PC. The tracking algorithm requires no information about texture or shape, which are known to be very unreliable in US image sequences. Experimental results for different instrument materials (polyvinyl chloride, polyurethane, nylon, and plexiglas) are given, illustrating the performance of the proposed approach: when the appropriate material is chosen, the reconstructed trajectories are smooth and only a few outliers occur. As a consequence, the visual servoing loop proved to be robust and stable.
KW - Minimally invasive surgery
KW - Ultrasound tracking
KW - Visual servoing
UR - http://www.scopus.com/inward/record.url?scp=33750277596&partnerID=8YFLogxK
U2 - 10.1109/ROBOT.2005.1570434
DO - 10.1109/ROBOT.2005.1570434
M3 - Conference contribution
AN - SCOPUS:33750277596
SN - 078038914X
SN - 9780780389144
T3 - Proceedings - IEEE International Conference on Robotics and Automation
SP - 2167
EP - 2172
BT - Proceedings of the 2005 IEEE International Conference on Robotics and Automation
T2 - 2005 IEEE International Conference on Robotics and Automation
Y2 - 18 April 2005 through 22 April 2005
ER -