"The future of surgery is not about blood and guts; the future of surgery is about bits and bytes.”
/Dr. Richard Satava/

Friday, August 27, 2010

Patient motion tracking and compensation in IGS

A major focus of my research targets patient motion tracking. To my knowledge, very little data has been published on this issue (e.g., Westermann2000). This post is a short introduction to the topic, meant to raise interest and give a general overview. If you know of more than the papers referenced below, please let me know!

Image-guided surgery requires trackable markers, used as references. Patient motion occurs when the body moves relative to the base frame of the device executing the surgical plan. The fundamental problem with patient motion is that, without proper identification and compensation, the whole surgical plan may become obsolete and the treatment potentially harmful. From the clinical point of view, at most a few mm of error can be tolerated; depending on the speed of the tool, this leaves roughly 0.5–2 s to react. If the motion is noticed in time, re-registration is recommended to avoid damaging the patient. However, re-registration is usually time-consuming (and can be cumbersome), so it should be avoided whenever possible. From the technical point of view, many sources of error can be represented as patient motion. The main sources of external (i.e., excluding physiological) patient motion during surgery include:
  • large forces applied by surgeon (e.g., bone milling),
  • bumping into the operating table,
  • leaning against the patient,
  • inadequate fixation,
  • equipment failure.
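The basic detection logic behind such systems is simple to sketch. The snippet below is a hypothetical illustration, not any vendor's implementation; it flags patient motion when a tracked reference marker drifts beyond a clinical tolerance (the 2 mm figure is borrowed from the ROBODOC threshold mentioned below).

```python
import numpy as np

# Hypothetical tolerance: the clinically tolerable error is a few mm;
# 2 mm (the ROBODOC halt threshold) is used here as an example value.
MOTION_TOLERANCE_MM = 2.0

def patient_moved(registered_pos, tracked_pos, tol_mm=MOTION_TOLERANCE_MM):
    """Flag patient motion when the tracked reference marker drifts more
    than tol_mm from the position recorded at registration time.
    Positions are 3-vectors in the tracker frame, in millimetres."""
    drift = np.linalg.norm(np.asarray(tracked_pos) - np.asarray(registered_pos))
    return bool(drift > tol_mm)

# At registration the marker sat at the origin; later readings:
patient_moved([0.0, 0.0, 0.0], [1.5, 0.0, 0.0])   # 1.5 mm drift -> False
patient_moved([0.0, 0.0, 0.0], [1.5, 1.5, 0.0])   # ~2.12 mm drift -> True
```

In a real system the check would of course run on full 6-DOF poses and filtered tracker streams, but the decision rule is the same: drift beyond tolerance triggers a halt and re-registration.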
Dynamic correction for unforeseen events with typically deployed intraoperative navigation systems remains a significant challenge. While significant effort has been invested in describing the surgical workflow with mathematical models [Jannin2007, Lahiri2010], relatively few projects have dealt with modeling the OR setup and environment in general.
The robot’s position information and the tracking data must be kept consistent throughout the operation, especially in neurosurgical or orthopedic procedures, where accuracy is crucial. In practice, this can be achieved with a rigid mechanical fixation between the device and the patient. Smaller robots, such as the SmartAssist (Mazor Surgical Technologies Inc., Caesarea, Israel) [Plaskos2005] or the Mini Bone-Attached Robotic System (MBARS, ICAOS and Carnegie Mellon University, U.S.) [Wolf2005], may be bone-mounted. This requires more invasive fixation on the patient side (bone screws), and large forces may still cause relative motion between the patient and the tool. In orthopedics, the contact forces are significant, making stronger screws necessary.
Employing a large, powerful robot may lead to serious tissue damage. The ROBODOC system (Curexo Technology Corporation, Fremont, CA) [Kazanzides2008] was the first FDA-approved automated bone-milling robot for hip replacement; it relies on bone screws for fixation and a bone motion sensor to detect fixation failures. If the bone moves more than 2 mm despite the fixation, the system halts and calls for re-registration.
One option to reduce tissue trauma is to use multiple dynamic reference bases to follow the motion of the robot base and the patient separately. Unfortunately, not every tracking system supports this, and it may be difficult to maintain the line of sight without disturbing the physician. Extending the active workspace of a tracking system may also introduce larger inherent errors due to the inhomogeneity of its field. Some commercially available systems combine surface-mounted and in-body fiducials to track external and physiological organ motion, though the latter requires a separate procedure to place the markers. A successful example is the CyberKnife radiation therapy system (Accuray Inc., Sunnyvale, CA), which can track skin motion through a special suit and organ motion by taking bi-plane X-ray images and locating fiducials (gold beads) implanted pre-operatively [Saito2009]. Other groups have tried different filtering approaches with limited effectiveness [Baron2010].
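To see why tracking the robot base and the patient with separate reference bases helps, consider the transform arithmetic. The sketch below (an idealized, noise-free illustration with made-up numbers) composes the two tracker-measured poses; any motion of the tracker camera itself then cancels out of the patient-to-robot transform.

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def patient_in_robot_frame(T_tracker_robot, T_tracker_patient):
    """Pose of the patient reference base expressed in the robot-base frame.
    Both inputs are poses measured by the tracker; composing them removes
    the tracker's own frame from the result."""
    return np.linalg.inv(T_tracker_robot) @ T_tracker_patient

# If the tracker camera is bumped, both measured poses pick up the same
# left-multiplied disturbance D, but the relative pose is unaffected:
I = np.eye(3)
T_tr = make_transform(I, [100.0, 0.0, 0.0])    # robot base, seen by tracker
T_tp = make_transform(I, [100.0, 50.0, 0.0])   # patient base, seen by tracker
D = make_transform(I, [5.0, -3.0, 2.0])        # camera bump
before = patient_in_robot_frame(T_tr, T_tp)
after = patient_in_robot_frame(D @ T_tr, D @ T_tp)
assert np.allclose(before, after)
```

Genuine patient motion, by contrast, changes only the patient-side measurement, so it survives the composition and can be detected and compensated.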
Robotic setups could incorporate accelerometers and gyroscopes, primarily to detect sudden changes; however, these require electrical coupling, and their resolution might not be sufficient for proper compensation. Besides, they would increase the cost and complexity of the system. Charge-coupled device (CCD) cameras can survey the OR, and image processing techniques could solve the localization problem, but the resolution may not be high enough, and the hardware requirements can be significant. Dynamic registration and correction for patient motion have been implemented with PET/SPECT scans [Fulton2002, Bruyant2005, Rahmim2007] to improve image quality through compensated reconstruction. However, these setups only considered a rigid environment, where neither the camera nor the PET gantry moves.
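As a toy illustration of the accelerometer idea, the sketch below flags samples whose acceleration magnitude deviates from gravity by more than a threshold; both the threshold and the sample values are invented for the example, and a practical detector would need filtering and orientation handling well beyond this.

```python
import numpy as np

def detect_sudden_motion(accel_samples, threshold=0.5, g=9.81):
    """Return indices of samples whose acceleration magnitude deviates
    from gravity by more than threshold (m/s^2): a crude 'bump' detector.
    accel_samples: (N, 3) array of accelerometer readings in m/s^2."""
    mags = np.linalg.norm(np.asarray(accel_samples, dtype=float), axis=1)
    return np.flatnonzero(np.abs(mags - g) > threshold)

# A resting sensor reads ~1 g; sample 2 simulates a bump on the table:
samples = [[0.0, 0.0, 9.81],
           [0.0, 0.0, 9.80],
           [1.2, 0.4, 10.9],
           [0.0, 0.0, 9.82]]
detect_sudden_motion(samples)  # -> array([2])
```

This catches sudden events well, which is exactly the strength noted above; slow drift with near-constant acceleration would pass undetected, which is one reason inertial sensing alone is not sufficient for compensation.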

Wednesday, August 18, 2010

CIS & Interventional Robotics for Tutorial at ICABB

I'm happy to be the co-organizer of this promising event with Dr. Fichtinger. Location: ICABB conference, Venice, October 12, 2010. All those interested are welcome!
"An important aim is to help the ICABB participants recognize imminent opportunities for the employment of their existing disciplinary expertise and qualifications, hence widening the cadre of interdisciplinary experts working in the broader field of medical engineering. The tutorial will provide a general overview and survey of the state of the art. This talk will lay out the contextual blueprint for the rest of the tutorial. Following this layout, expert speakers will take a closer view at major areas of issues in their respective field of specialty. Each talk will cover relevant theoretical and practical considerations, thereby providing a comprehensive picture of the problems, available solutions, and future directions of this field. In addition, internationally prominent clinical researchers in computer and robot assisted interventions will provide unique perspectives on a wide range of challenges and opportunities relevant to the ICABB researchers."

Tentative schedule:
  • Gabor Fichtinger, PhD – Introduction to interventional robotics, robot-assisted prostate cancer treatment
  • Lena Maier-Hein, PhD – Computer-assisted laparoscopic surgery: challenges, perspectives and limitations
  • Franjo Pernuš, PhD – 3D/2D image registration, the enabling technology for image-guided medical interventions
  • Gernot Kronreif, PhD – Surgical robotic systems: abdominal needle placement
  • Wolfgang Birkfellner, PhD – Motion tracking in radio-oncology
  • Tamas Haidegger – Medical robot systems’ accuracy, safety and validation

Tuesday, August 10, 2010

CIS research at TUM

The Technische Universität München (TUM) is widely known for the high-quality CIS research it has been conducting for many years.
A great example is the da Vinci-replacement MIS system developed for research purposes at TUM, incorporating two KUKA robots and a Stäubli robot. TUM constructed a fully operational system using inexpensive standard components (including actuators, sensors, and software) with an open API (Knoll, ICRA 2009). As the control parameters and sensory information are accessible at any time, it is possible to implement semi-autonomous algorithms and perform experiments on human-machine skill transfer. Another setup, the EndoPAR, was built at the German Heart Center in Munich with three ceiling-mounted Mitsubishi manipulators.

They have several engineering and clinical groups, with a really impressive project list:
  • Bone Removal by Laser Ablation
  • Involvement in the MIROSURGE development
  • Micro Robot for Precision Instrument Manipulation
  • Mastoid Control
  • Soft Tissue Navigation System for Open Liver Surgery
  • Realistic Soft Tissue Liver Phantom development
  • Robot-assisted Milling in Neuro Surgical Applications
  • Graphics card based 3D algebraic reconstruction of x-ray projection images
  • Navigation and Instrument Control for Craniofacial Surgery
  • Computer Aided ENT Surgery
  • Computer Aided Dental Surgery
  • Shared control in MIS
  • Automated knot tying

Tuesday, August 3, 2010

3D Slicer

Recently, I attended Ron Kikinis' lecture on Slicer, and decided to advertise it a little. "Slicer, or 3D Slicer, is a free, open source software package for visualization and image analysis. 3D Slicer is natively designed to be available on multiple platforms, including Windows, Linux and Mac OS X." It is a very powerful tool for all kinds of medical imaging work, distributed freely under a BSD-style open-source license. Several groups are using it for various projects, and the National Alliance for Medical Image Computing (NA-MIC) coordinates the development efforts. "NA-MIC was funded in September of 2004 after submission of an application in response to an RFA issued by NIH. The algorithm core develops and implements medical image computing algorithms using the NA-MIC Kit. The engineering core develops and maintains the NA-MIC Kit, a software platform designed to enable research. The driving biological projects use the tools provided by the algorithm and engineering cores to develop software solutions that further their biomedical research. The training and dissemination cores work on both internal and external outreach. The service core supports the virtualized IT infrastructure that enables all these activities in a distributed environment. The leadership core is responsible for the overall direction of the alliance."
The stable 3.6 version of Slicer was released recently, featuring:
  • Improved Interactive Editor
  • New Color Module
  • Improved Volume Rendering
  • EM Segmenter, simple version
  • Fast Marching Segmentation
  • Robust Statistics Segmentation
  • New Registration module
  • New Slices module
  • Fiducial based tractography
  • Improved SceneSnapshot Screen Capture functionality
  • 4D Image Viewer
  • Compare View and Cross Hairs
You can download it from the Slicer website.