“The future of surgery is not about blood and guts; the future of surgery is about bits and bytes.”
/Dr. Richard Satava/

Saturday, November 27, 2010

Teleoperation research at Keio University

Last week I was given the chance to visit Prof. Ohnishi's lab at Keio University. His laboratories stretch across three campuses of the University and address research on humanoid robotics, flying-robot control, complex haptic feedback projection and bilateral teleoperation. The latter includes a surgical robot setup for basic tissue manipulation. (See the video on the system.) The haptic feedback, derived purely from position encoders (with acceleration estimated from them), is very lifelike and offers great options for scalability.
"We recently launched the development of the surgical robot. In this project, we have successfully established the control law (Acceleration Based Control), which can reproduce the touch of environment in the operator side vividly. We also conducted an experiment using a couple of teleoperation robots over Internet with our partner institution in Europe. We apply our control method, which was initially developed mainly for the use of a surgical robot, to other teleoperation robots as well. For instance, the XY bilateral table pictured above allows an operator to feel “rubbing” and “cutting” actions. Combining various teleoperation robots mentioned above, we are eagerly conducting our research activity to achieve our aim, the development of a novel surgical robot system."
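The group's actual method is Acceleration Based Control with disturbance observers, which is considerably more sophisticated than anything that fits in a few lines. Still, the basic bilateral idea — the master and slave exchange position information, the operator's action and the environment's reaction cancel at steady state, and the operator "feels" the remote contact — can be illustrated with a toy 1-DOF position-exchange simulation. All gains, the environment stiffness and the operator force below are made-up illustrative values, not parameters from the lab:

```python
# Toy 1-DOF bilateral teleoperation sketch (NOT the Keio controller):
# two unit masses (master, slave) exchange positions/velocities; the
# slave presses into a stiff spring (the "environment") while the
# operator applies a constant force to the master.

dt, T = 1e-3, 5.0
Kp, Kd = 400.0, 40.0   # coupling gains between master and slave (illustrative)
B = 10.0               # viscous friction on each device
k_env = 1000.0         # environment stiffness felt by the slave
f_op = 1.0             # constant operator force on the master

xm = vm = xs = vs = 0.0
f_env = 0.0

for _ in range(int(T / dt)):
    # Environment reacts only while the slave presses into it.
    f_env = -k_env * xs if xs > 0 else 0.0
    # Master feels the coupling spring/damper to the slave (reflected force);
    # the slave is driven by the mirrored coupling term toward the master.
    am = f_op - Kp * (xm - xs) - Kd * (vm - vs) - B * vm
    a_s = f_env + Kp * (xm - xs) + Kd * (vm - vs) - B * vs
    # Semi-implicit Euler integration.
    vm += am * dt; xm += vm * dt
    vs += a_s * dt; xs += vs * dt

# At steady state the environment reaction balances the operator force
# (f_op + f_env -> 0), so the operator feels the remote contact.
print(f"xm={xm:.4f}  xs={xs:.4f}  f_env={f_env:.2f}")
```

At equilibrium the slave sinks into the environment until the reaction force equals the operator's push, while the master settles a small coupling-spring deflection away from the slave — which is exactly the "reproducing the touch of the environment on the operator side" property the quote describes, here in its crudest possible form.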
You can find more information about their work in their recent publications.

Friday, November 19, 2010

Extend the control of your brain!

I have long wanted to write about the mind-blowing presentation "Towards a Whole-Body Neuroprosthetic" by Dr. Miguel Nicolelis that I attended back in September (a plenary talk at EMBC 2010). Dr. Nicolelis is a neuroscientist at Duke University who has become famous for the brain–computer interface research his group conducts using rhesus monkeys. This is not strictly connected to CIS, but it is interesting enough to address here at SurgRob.

“Dr. Nicolelis launched a new field of neurophysiological investigation which aims at measuring the concurrent activity and interactions of large populations of single neurons throughout the brain. Thus, for the past 20+ years, Dr. Nicolelis has devoted his career to the search for the physiological principles that govern the operation of key brain circuits in the mammalian brain. For the past decade, Dr. Nicolelis is best known for his pioneering studies of Brain Machine Interfaces (BMI) and neuroprosthetics in human patients and non-human primates.”
His webpage contains a wealth of information about their work, and you can learn more through their publications.
Or see these videos: [vid1], [vid2].

In his talk at EMBC, he "reviewed a series of recent experiments demonstrating the possibility of using real-time computational models to investigate how ensembles of neurons encode motor information. These experiments have revealed that brain-machine interfaces can be used not only to study fundamental aspects of neural ensemble physiology, but they can also serve as an experimental paradigm aimed at testing the design of a whole-body neuroprosthetic. I will also describe evidence indicating that continuous operation of a closed-loop brain machine interface, which utilizes a whole-body neuroprosthetic as its main actuator, can induce significant changes in the physiological properties of neurons located in multiple motor and sensory cortical areas. This raises the hypothesis of whether the properties of a whole-body neuroprosthetic, or any other tool, can be assimilated by neuronal representations as if they were simple extensions of the subject's own body."
Recent experiments showed that a monkey had no problem controlling a third, artificial (and remote) arm with its brain, generating the same kind of brain patterns as for its own limbs.
  • The plasticity of the brain allows for the control of any external device as if it were an extension to the body. 
  • The brain patterns recorded during these experiments (with amazing resolution: a 64x64 electrode array) seem to underline that we DO NOT HAVE discrete functional areas; the whole cortex is multimodal, taking part in generating the action. What’s more, the brain pattern for the same task is never the same. The brain keeps reinventing the way it directs the body, and therefore learns new skills very quickly.
  • The brain is a simulator that constantly reaches out through its sensory connections to see what peripheral devices are available. In theory, this means that some people can learn to map external objects as part of their own body: “feeling” them. Pelé might have had the ball mapped onto his legs, allowing him to control it by instinct.
All this might enable us one day to extend the reach of our brain to control full-body avatars through brain-machine interfaces, while actually living our everyday life. Sounds scary, but also exciting!
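The core computational step behind these demonstrations — extracting motor information from the concurrent activity of a neural population — was historically done with linear (Wiener-filter-style) decoders. As a rough, self-contained illustration on synthetic data (the tuning model, dimensions and noise level are invented for the example, not taken from the actual experiments), a minimal population decoder can be fit by least squares:

```python
import numpy as np

# Toy linear population decoder on synthetic data, in the spirit of
# early BMI work (illustrative only; not the lab's actual pipeline).

rng = np.random.default_rng(0)
n_neurons, n_bins = 64, 5000

# Hidden 2-D "hand velocity" the subject intends, one sample per time bin.
true_vel = rng.standard_normal((n_bins, 2))

# Each neuron's firing rate is a noisy linear mix of the velocity
# components (a crude stand-in for cosine tuning).
W_true = rng.standard_normal((2, n_neurons))
rates = true_vel @ W_true + 0.5 * rng.standard_normal((n_bins, n_neurons))

# Fit the decoder by least squares on the first half, evaluate on the rest.
half = n_bins // 2
W_dec, *_ = np.linalg.lstsq(rates[:half], true_vel[:half], rcond=None)
pred = rates[half:] @ W_dec

corr = np.corrcoef(pred[:, 0], true_vel[half:, 0])[0, 1]
print(f"decoded-vs-true velocity correlation: {corr:.2f}")
```

Real decoders additionally use several time-lagged bins of each neuron's activity and run in real time in a closed loop, which is what lets the brain adapt its own patterns to the decoder — the plasticity effect described above.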

Learn more about it at Dr. Nicolelis’ research page.

The other major monkey–computer interface project is run at the University of Pittsburgh.

Saturday, November 13, 2010

IEEE Visualization Contest

"The IEEE Visualization Contest targets the field of multimodal visualization for neurosurgical planning. The primary challenge in planning neurosurgical interventions lies in the identification of the various structures at risk and understanding how they relate and interact with each other." This year the finals were held at IEEE Visualization 2010 in Salt Lake City in late October.
  • Pre-Operative Planning of Brain Tumor Resections
Honorable mentions:
  • An Exploration and Planning Tool for Neurosurgical Interventions
  • Neurosurgical Intervention Planning with VolV
  • A Fiber Navigator for Neurosurgical Planning (NeuroPlanningNavigator)
  • The 3D Mousing Interaction Technique
  • Multivariate Beam Ray Obstacle Visualization for Brain Tumor Resection
  • Rendering data using three different methods
  • Quantitative Visualization of Access Paths for Preoperative Planning in Neurosurgery
  • Camera Control for Brain-Tumor Visualization
  • Distance Functions in Multimodal Volume Rendering