Events: May 12, 2008 First Monday Seminar
Location and Time
- Monday, May 12, 2008
- 12:00 - 1:00pm
- Anesthesia Conference Room, L1, 75 Francis St, Boston, MA 02115
- Directions: Enter at the 75 Francis Street entrance. Bear right down the Brigham and Women's "ground pike." You will see the Connors Center for Women and Newborns elevator on your left; if you walk past the copper statues, you have gone too far. Take the elevator down to L-1. When you exit the elevator, you will see a sign pointing to the Anesthesia Conference Room, which is located at the end of the hall on the right-hand side.
Action- and Workflow-Driven Augmented Reality for Computer Aided Medical Procedures
Abstract
A principal difference between Augmented Reality and Virtual Reality is that the former does not require the user to be immersed in a virtual world. However, AR systems often require additional equipment and displays that, for short periods, force the user to immerse into a new environment in order to see the augmentations. One solution is to create user interfaces that take advantage of AR only when it is required. A key to the success of such a user interface is its ability to automatically recognize the different phases of a workflow, each of which may require a different level of augmentation. It is equally important for the AR system to be transparent to the user during the rest of the procedure. These issues take on greater importance in computer-aided surgery applications, where a surgeon typically needs augmentation for only brief periods, such as when choosing the ports for a laparoscopic intervention or localizing the major arteries before starting a liver resection. These augmentations, however, can play an important role in the overall success of the procedure. Over the past three years, CAMP has worked toward the development of such integrated AR solutions in the context of minimally invasive surgery. We chose to move into hospital laboratories, creating an environment where surgeons and computer scientists could model the medical procedures and work together to invent new solutions. In this talk, Dr. Navab presents an overview of CAMP's activities and some recent results. The focus is on recovering the workflow, modeling medical procedures, and intelligently integrating advanced data fusion, navigation, and visualization into such procedures. He will present different solutions for trauma surgery, nuclear medicine, and interventional radiology applications. Phantom, ex-vivo, and in-vivo experiments demonstrate the advantages of these new solutions.
Dr. Nassir Navab is a full professor and the director of the Institute for Computer Aided Medical Procedures and Augmented Reality (CAMP) at the Technical University of Munich (http://campar.in.tum.de), with a secondary faculty appointment at the Medical School of TU Munich. Before joining the Computer Science Department at TUM, he was a distinguished member at Siemens Corporate Research (SCR) in Princeton, New Jersey, where he received the prestigious Siemens Inventor of the Year Award in 2001 for the body of his work in interventional imaging. He received his PhD from INRIA and the University of Paris XI in France and spent two years as a postdoctoral fellow at the MIT Media Laboratory before joining SCR in 1994. In November 2006, he was elected to the board of directors of the MICCAI Society. He has served on the Steering Committee of the IEEE Symposium on Mixed and Augmented Reality since 2001 and on the program committees of over 30 international conferences. He is the author of hundreds of peer-reviewed scientific papers and over 40 US and international patents. His main fields of interest include medical augmented reality, computer-aided surgery, and medical image registration.