Events: May 12, 2008 First Monday Seminar
Location and Time
- Date: Monday, May 12, 2008
- Time: 12:00 - 1:00 pm. Individual Q&A with the speakers: 1:00 - 1:30 pm.
- Location: Anesthesia Conference Room, L1, Brigham and Women's Hospital, 75 Francis St, Boston, MA 02115
- Directions: Enter at the 75 Francis Street entrance. Bear to your right down the Brigham and Women's "ground pike". You will see the Connors Center for Women and Newborns elevator on your left. If you walk past the copper statues, you have gone too far. Take the elevator down to L-1. Exit the elevator; you will see a sign pointing to the Anesthesia Conference Room. The room is located at the end of the hall on the right-hand side. Directions to the 75 Francis Street entrance of Brigham and Women's Hospital can be found here.
Action- and Workflow-driven Augmented Reality for Computer Aided Medical Procedures
by Dr. Nassir Navab of Technical University of Munich, Germany
Abstract
A principal difference between Augmented Reality (AR) and Virtual Reality is that AR does not require the user to immerse themselves in a virtual world. However, AR systems often require additional equipment and displays that, for short periods, force the user to immerse into a new environment in order to see the augmentations. A solution is to create user interfaces that take advantage of AR only when required. One key to the success of such a user interface is its ability to automatically recognize different phases of a workflow, each of which may require various levels of augmentation. It is also important for the AR system to be transparent to the user during the rest of the procedure. These issues have greater importance when dealing with computer-aided surgery applications. In most of these applications, a surgeon needs augmentation for only quite brief periods, such as choosing the ports for a laparoscopic intervention or localizing the major arteries before starting a liver resection. These augmentations, however, can play an important role in the overall procedure's success. During the past three years, CAMP has worked toward the development of such integrated AR solutions in the context of minimally invasive surgery. We chose to move into hospital laboratories, creating an environment where surgeons and computer scientists could model the medical procedures and work together to invent new solutions. In this talk, Dr. Navab presents an overview of CAMP's activities and some recent results. The focus is on recovering the workflow, modeling medical procedures, and intelligently integrating advanced data fusion, navigation, and visualization into such procedures. Professor Navab will present different solutions for trauma surgery, nuclear medicine, and interventional radiology applications. Phantom, ex-vivo, and in-vivo experiments demonstrate the advantages of these new solutions.
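As a rough illustration of the phase-dependent augmentation idea in the abstract above, the following minimal Python sketch shows how an AR overlay might be enabled only during workflow phases that benefit from it. The phase names, confidence threshold, and display interface are illustrative assumptions, not the actual CAMP system.

```python
# Illustrative sketch only: phase names, threshold, and display interface are
# assumptions, not the CAMP implementation described in the talk.

AUGMENTED_PHASES = {"port_placement", "vessel_localization"}  # phases that benefit from AR

class ARDisplay:
    """Toy stand-in for an AR overlay that can be switched on and off."""
    def show_overlay(self, phase):
        print(f"AR overlay ON for phase: {phase}")
    def hide_overlay(self):
        print("AR overlay OFF (system stays transparent to the surgeon)")

def update_display(recognized_phase, confidence, display, threshold=0.8):
    """Enable augmentation only when a relevant phase is recognized with confidence."""
    if recognized_phase in AUGMENTED_PHASES and confidence >= threshold:
        display.show_overlay(recognized_phase)
    else:
        display.hide_overlay()

# Example: recognizer output for two successive workflow phases
display = ARDisplay()
update_display("port_placement", 0.93, display)  # augmentation shown
update_display("suturing", 0.97, display)        # AR stays out of the way
```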
Dr. Nassir Navab is a full professor and director of the Institute for Computer Aided Medical Procedures and Augmented Reality (CAMP) at the Technical University of Munich (http://campar.in.tum.de). He also holds a secondary faculty appointment at the Medical School of TU Munich. Before joining the Computer Science Department at TUM, he was a distinguished member at Siemens Corporate Research (SCR) in Princeton, New Jersey. He received the prestigious Siemens Inventor of the Year Award in 2001 for the body of his work in interventional imaging. He received his PhD from INRIA and the University of Paris XI in France and spent two years as a postdoctoral fellow at the MIT Media Laboratory before joining SCR in 1994. In November 2006, he was elected as a member of the board of directors of the MICCAI Society. He has been serving on the Steering Committee of the IEEE Symposium on Mixed and Augmented Reality since 2001. He has served on the program committee of over 30 international conferences. He is the author of hundreds of peer-reviewed scientific papers and over 40 US and international patents. His main fields of interest include Medical Augmented Reality, Computer Aided Surgery, and Medical Image Registration.
http://wwwnavab.in.tum.de/WebHome
Integration of Robotics and Biomedical Measurements for Computer Aided Surgery
by Dr. Ichiro Sakuma of The University of Tokyo, Japan
Abstract
Intra-operative imaging devices such as CT and MRI are now installed in operating rooms. They provide detailed anatomical structure of the surgical field and enable image-guided operation. However, the data acquired by these devices still have limited temporal and spatial resolution. There is also demand for functional information that reveals the pathological state of patient tissue. Thus, it is necessary to integrate various types of information obtained by biomedical measurements with those obtained by intra-operative imaging devices. Examples of such measurement devices are an electrophysiological recording system and a spectrophotometric measurement device. When integrated into an image-guidance system for surgery, these devices must be able to determine their three-dimensional position relative to the patient in addition to acquiring their primary measurement. Integration of advanced surgical instruments such as surgical robots is also being investigated. As one example of such integration, we combined an intra-operative fluorescence detection system with a laser ablation system. We have been investigating the application of 5-aminolevulinic acid (5-ALA) induced fluorescence detection to intra-operative detection of brain tumor. 5-ALA is a natural chemical substance found in the human body. 5-ALA induced PpIX is produced intracellularly and accumulates selectively in tumor cells. When a patient with a brain tumor is administered 5-ALA before surgery, tumor fluorescence around 635 nm is observed under excitation light around 405 nm. We have developed an optical pickup device mounted on a motor-driven X-Y stage. This 5-ALA induced fluorescence detection system with a mechanical scanning system was integrated with a surgical navigation system. The fluorescence spectra were registered to the corresponding locations in the surgical navigation map. The surgeon can thus utilize both anatomical information and functional information derived from tissue fluorescence. In addition, we integrated a surgical laser ablation system with the fluorescence detection system, realizing intra-operative, fluorescence-data-driven laser ablation of brain tissue. We tested the system on porcine brain stained with PpIX induced from 5-ALA and could ablate the brain surface using pre-scanned fluorescence data.
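To make the registration step concrete, here is a minimal, hypothetical Python sketch of how a fluorescence reading from the scanned pickup could be mapped into navigation-map coordinates and used to flag candidate ablation targets. The tracked pose, the 635 nm intensity threshold, and the data layout are assumptions for illustration, not the actual system described in the abstract.

```python
import numpy as np

# Hypothetical sketch: transform a fluorescence measurement taken at a probe-local
# position into patient/navigation coordinates and flag tumor-like readings.
# The 4x4 pose, the intensity threshold, and the data layout are assumptions.

def probe_to_navigation(pose_probe_to_patient, point_in_probe):
    """Apply a rigid 4x4 transform (from the tracking system) to a 3D point."""
    p = np.append(point_in_probe, 1.0)            # homogeneous coordinates
    return (pose_probe_to_patient @ p)[:3]

def mark_ablation_targets(measurements, pose, intensity_threshold=0.5):
    """Return navigation-space points whose 635 nm fluorescence exceeds a threshold."""
    targets = []
    for point_in_probe, intensity_635nm in measurements:
        point_in_patient = probe_to_navigation(pose, np.asarray(point_in_probe))
        if intensity_635nm > intensity_threshold:  # tumor-like PpIX fluorescence
            targets.append(point_in_patient)
    return targets

# Example: two grid positions scanned by the X-Y stage (probe coordinates, mm)
pose = np.eye(4)
pose[:3, 3] = [10.0, -5.0, 120.0]                 # placeholder tracked probe pose
scan = [((0.0, 0.0, 0.0), 0.8), ((1.0, 0.0, 0.0), 0.2)]
print(mark_ablation_targets(scan, pose))          # only the first point is flagged
```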
Dr. Sakuma's Bio
Work Experience
- April 2006 - now
- Professor, Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo
- April 1999 - Mar. 2006
- Associate Professor (April 1999 - Oct. 2001), Professor (Nov. 2001 - Mar. 2006), Institute of Environmental Studies, Graduate School of Frontier Sciences, The University of Tokyo
- April 1998 - Mar 1999
- Associate Professor, Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo
- Aug 1990 - Sept. 1991
- Research Instructor, Department of Surgery, Baylor College of Medicine, Houston, Texas, U.S.A.
- April 1987 - Mar. 1998
- Research Associate (1987-1991), Instructor (1991-1992), Associate Professor (1992-1998), Department of Applied Electronic Engineering, Faculty of Science and Engineering, Tokyo Denki University
- April 1985 - Mar 1987
- Research Associate, Department of Precision Engineering, Graduate School of Engineering, The University of Tokyo
Education
- March 1989
- Ph.D. in Precision Engineering, Graduate School of Engineering, The University of Tokyo
- March 1984
- Master of Engineering, Department of Precision Machinery Engineering, Graduate School of Engineering, The University of Tokyo
- Mar 1982
- Bachelor of Engineering, Department of Precision Machinery Engineering, Faculty of Engineering, The University of Tokyo
Membership of Academic Societies
- Board member of Japan Society of Computer Aided Surgery
- Vice president of Japanese Society for Medical and Biological Engineering
- Board member of Japanese Society for Electrocardiogram
- Fellow, The Japan Society for Mechanical Engineers
- Member, The Japan Society for Precision Engineering
- Member, Japanese Circulation Society
- Member, IEEE
- Member, International Society for Computer Aided Surgery
Research Interest
Computer Aided Surgery, Surgical Robotics, Biomedical Instrumentation. He served as the project leader of the JSPS-funded research project on “Development of Surgical Robotic System” from FY2001-2003. He is one of the sub-project leaders of the research project on “Intelligent Surgical Devices” funded by METI (NEDO) (FY2007-2011). He actively participated in the “Translational Systems Biology and Medicine Initiative” at the University of Tokyo, funded by JST (FY2007-2009; 1st Phase).