Many interesting applications integrating an optical tracker were presented. Probably the most exciting was a clinical system (developed at the Dartmouth Medical Center) that combines microscope imaging with navigation for complete removal of tumor tissue. The basic idea behind the project is that tumors appear different (more reddish) under fluorescence imaging when injected with d-AFA. Fluorescence-guided neurosurgery is important for the resection of certain types of cancerous cells where the tumor and normal tissue are similar in appearance and texture, and patient prognosis depends heavily on the completeness of resection. By selectively tagging tumor tissue with fluorescent dyes, it becomes possible to visually discriminate between normal and tumor tissue.

Multiple institutes are focusing on combining endoscopic views with stereoscopic 3D models to provide an augmented reality view for surgeons. One good place for this work is the VRVis center in Vienna. Most striking were the examples given by Samuel Kadoury from Montreal Polytechnique of how image-guided planning can help correct serious spinal deformities in a single surgery. Other sensor fusion techniques (US and optical), system frameworks (IGSTK, BIS), and robotic experiments were also presented. Our system, an image-guided neurosurgery robot, was introduced by Dr. Kazanzides.
It turned out that Medtronic is open to international research collaborations: they will donate a StealthStation surgical navigation system to groups that produce good research results with it. We might take good advantage of this.