2009 Summer Project Week Transrectal Prostate biopsy

<gallery>
Image:PW2009-v3.png|[[2009_Summer_Project_Week#Projects|Project Week Main Page]]
Image:ProstateNavRework.png|Overview of the ProstateNav / TRProstateBiopsy module rework.
Image:TRPB_ProstateSegmentation.JPG|Prostate segmentation integrated inside the TR Prostate biopsy interactive module.
Image:Capture_20090625_205134.png|Result of a needle insertion experiment on a phantom, using Slicer with a transrectal prostate biopsy robot.
Image:ShapeBasePstSegSlicer.png|Shape-based prostate segmentation in Slicer through a command-line module.
</gallery>

Key Investigators

  • Gabor Fichtinger, Andras Lasso (lasso@cs.queensu.ca), Siddharth Vikal; Queen’s University
  • Allen Tannenbaum, Yi Gao; Georgia Tech
  • Nobuhiko Hata, Junichi Tokuda; BWH

Objective

  • Rework the current prostate biopsy modules (TRProstateBiopsy and ProstateNav)
    • Create a new standalone module (ProstateSeg) for prostate segmentation, with the latest version of the algorithms, that can be used from the robot modules or on its own.
    • Merge the two prostate robot modules: this will reduce the total amount of code, make it easier to reuse features developed by different teams for different robots, make testing and bug fixing more efficient, and make it possible to support new robots, scanners, and procedures in the future.

Approach, Plan

  • Prostate segmentation: there are two algorithms; integrate both of them into a standalone module.
    • Algorithm 1: shape-based segmentation. The shapes of prostates are learned from training data, and a new image is then segmented using the learned shapes.
    • Algorithm 2: based on the random walks segmentation algorithm (Grady 2006). It needs more human input, but the result can be interactively improved arbitrarily close to the user's expectation (a toy illustration of the idea follows this list).
  • Prostate robotics: merge the existing modules (select one prostate robotics module as a base, clean it up, design generic robot and scanner interfaces, and integrate the functions and the specific robot/scanner support parts from the other module), so that all functionality is available for all robots.
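
As a toy illustration of the random walks idea referenced above (Grady 2006), the sketch below computes, on a synthetic 1D signal, the probability that a random walker starting from each voxel reaches the foreground seed before the background seed, and labels voxels by thresholding that probability at 0.5. This is only a didactic sketch under assumed parameters (the beta value and the toy intensities), not the ProstateSeg implementation, which operates on 3D volumes.

<pre>
// Toy 1D illustration of the random-walks idea (Grady 2006): solve the
// combinatorial Dirichlet problem by Gauss-Seidel iteration, with seeded
// voxels held fixed and edge weights derived from intensity differences.
#include <cmath>
#include <cstdio>
#include <vector>

int main()
{
    // Synthetic 1D "image": a dark region, an edge, then a bright region.
    std::vector<double> intensity = {10, 11, 10, 12, 80, 82, 81, 80, 79, 81};
    const int n = static_cast<int>(intensity.size());
    const double beta = 0.01;  // edge-weight sensitivity (assumed value)

    // Edge weights between neighbouring voxels: w = exp(-beta * (Ii - Ij)^2).
    std::vector<double> w(n - 1);
    for (int i = 0; i < n - 1; ++i)
    {
        const double d = intensity[i] - intensity[i + 1];
        w[i] = std::exp(-beta * d * d);
    }

    // Seeds: voxel 0 is background (probability 0), the last voxel is foreground (1).
    std::vector<double> prob(n, 0.5);
    prob[0] = 0.0;
    prob[n - 1] = 1.0;

    // Gauss-Seidel iterations: each unseeded voxel becomes the weighted
    // average of its neighbours.
    for (int iter = 0; iter < 500; ++iter)
        for (int i = 1; i < n - 1; ++i)
            prob[i] = (w[i - 1] * prob[i - 1] + w[i] * prob[i + 1]) / (w[i - 1] + w[i]);

    // Threshold the probabilities to obtain the segmentation.
    for (int i = 0; i < n; ++i)
        std::printf("voxel %d: P(foreground) = %.3f -> %s\n",
                    i, prob[i], prob[i] > 0.5 ? "prostate" : "background");
    return 0;
}
</pre>

Adding more seeds simply fixes more entries of the probability vector, which is why the result can be refined interactively until it matches the user's expectation.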

Our goals for the project week are:

  • Integrate the latest developments of the prostate segmentation algorithms into a standalone Slicer module (ProstateSeg)
  • Create a tutorial for the ProstateSeg module
  • Prepare the merge of the two prostate robotics Slicer modules: design of the wizard steps, design of generic robot/scanner interfaces (a hypothetical interface sketch follows this list)
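
One possible shape for the generic robot/scanner interfaces is sketched below. All class and method names (RobotInterface, ScannerInterface, NeedlePose, and the individual calls) are illustrative assumptions, not the existing ProstateNav or TRProstateBiopsy API; the point is that the wizard steps would program only against these abstract bases, while each physical device supplies its own subclass.

<pre>
#include <string>
#include <vector>

// Needle tip position/orientation in RAS patient coordinates (mm / unit quaternion).
struct NeedlePose
{
    double position[3];
    double orientation[4];
};

// Abstract robot interface: wizard steps (calibration, targeting, manual
// control) would use only these calls, regardless of the concrete robot.
class RobotInterface
{
public:
    virtual ~RobotInterface() {}
    virtual bool Connect(const std::string& address) = 0;     // e.g. OpenIGTLink host:port
    virtual bool Calibrate() = 0;                              // robot-specific procedure
    virtual bool MoveToTarget(const NeedlePose& target) = 0;   // used by the targeting step
    virtual NeedlePose GetCurrentPose() const = 0;             // polled for visualization
};

// Abstract scanner interface: image acquisition and transfer, whether through
// a direct OpenIGTLink connection or a DICOM directory/server.
class ScannerInterface
{
public:
    virtual ~ScannerInterface() {}
    virtual bool Connect(const std::string& address) = 0;
    virtual bool StartScan(const std::string& protocolName) = 0;
    virtual bool FetchLatestImage(std::vector<short>& voxels, int dims[3]) = 0;
};

// Each supported device (e.g. the transrectal biopsy robot) derives from the
// corresponding base class; the merged module only ever holds base pointers.
</pre>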

Progress

  • During the 2009 Winter Project Week a preliminary version of the segmentation algorithms was already integrated.
  • Design discussions with the ProstateNav developers and porting of some functionality (secondary monitor support) have already started.

Detailed plan of the rework

  • ProstateSeg module
    • Update the tutorial:
      • with a clinical data set
      • detailed instructions about how to draw seed and background points
      • use fiducial points to define segmentation seed (generate small spheres)?
      • define background voxels using a ROI?
    • Determine seed points and background points automatically: assume the image center is inside the prostate, determine an approximate bounding sphere for the prostate, then use a shrunken sphere as the seed region and an expanded sphere as the background region (see the sketch after this list)
    • Performance improvement
      • Reuse the current result volume as an initial value for the solution, to speed up computations after small changes (adding more seed points or background points)
      • Use a multiresolution technique to get a quick approximate result, then use that result as the initial value for the full-resolution computation (a sketch follows this list)
  • ProstateNav module
    • Strip down ProstateNav; we need a skeleton of the module before we start merging the two modules.
    • Modify the wizard mechanism; we need a way to configure the wizard so that the module can be used for different procedures/devices.
    • Once the wizard mechanism is fixed, start copying the functions from both modules. Since the skeleton is based on ProstateNav, and the TRProstateBiopsy module is based on similar code, this should not be a difficult task.
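
A minimal sketch of the automatic seed/background initialization described in the ProstateSeg plan above, assuming the image center lies inside the prostate and an approximate bounding-sphere radius is available: a shrunken sphere is marked as foreground seed, everything outside an expanded sphere as background. The function name, label values, and the 0.5/1.5 scaling factors are assumptions for illustration.

<pre>
#include <cmath>
#include <vector>

enum SeedLabel { UNKNOWN = 0, FOREGROUND = 1, BACKGROUND = 2 };

// Build a seed label map the same size as the input volume.
std::vector<unsigned char> CreateSeedLabelMap(const int dims[3],
                                              const double spacing[3],
                                              double boundingRadiusMm)
{
    std::vector<unsigned char> labels(dims[0] * dims[1] * dims[2], UNKNOWN);

    const double centre[3] = {dims[0] / 2.0, dims[1] / 2.0, dims[2] / 2.0};
    const double seedRadius = 0.5 * boundingRadiusMm;        // shrunken sphere
    const double backgroundRadius = 1.5 * boundingRadiusMm;  // expanded sphere

    for (int k = 0; k < dims[2]; ++k)
        for (int j = 0; j < dims[1]; ++j)
            for (int i = 0; i < dims[0]; ++i)
            {
                const double dx = (i - centre[0]) * spacing[0];
                const double dy = (j - centre[1]) * spacing[1];
                const double dz = (k - centre[2]) * spacing[2];
                const double r = std::sqrt(dx * dx + dy * dy + dz * dz);

                unsigned char& voxel = labels[(k * dims[1] + j) * dims[0] + i];
                if (r < seedRadius)
                    voxel = FOREGROUND;   // assumed to be inside the prostate
                else if (r > backgroundRadius)
                    voxel = BACKGROUND;   // assumed to be outside the prostate
            }
    return labels;
}

// The resulting label map can be handed to the segmentation solver as its
// seed input; UNKNOWN voxels are the ones the solver actually labels.
</pre>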
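
A sketch of the multiresolution idea from the performance-improvement item: segment a 2x block-averaged copy of the volume first, then use the upsampled coarse result as the initial value for the full-resolution solve. Only the downsampling helper is spelled out; the solver and upsampling calls in the usage outline are placeholders, and even volume dimensions are assumed for simplicity.

<pre>
#include <vector>

// 2x downsampling by block averaging (assumes nx, ny, nz are even).
std::vector<float> Downsample2x(const std::vector<float>& img, int nx, int ny, int nz)
{
    const int cx = nx / 2, cy = ny / 2, cz = nz / 2;
    std::vector<float> coarse(cx * cy * cz, 0.0f);
    for (int k = 0; k < cz; ++k)
        for (int j = 0; j < cy; ++j)
            for (int i = 0; i < cx; ++i)
            {
                float sum = 0.0f;
                for (int dk = 0; dk < 2; ++dk)
                    for (int dj = 0; dj < 2; ++dj)
                        for (int di = 0; di < 2; ++di)
                            sum += img[((2 * k + dk) * ny + (2 * j + dj)) * nx + (2 * i + di)];
                coarse[(k * cy + j) * cx + i] = sum / 8.0f;
            }
    return coarse;
}

// Usage outline (solver calls are placeholders for the segmentation code):
//   coarseImage  = Downsample2x(image, nx, ny, nz);
//   coarseResult = Segment(coarseImage, coarseSeeds);        // fast, approximate
//   initialGuess = UpsampleNearestNeighbour(coarseResult);   // back to full size
//   finalResult  = Segment(image, seeds, initialGuess);      // converges quickly
</pre>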

Design notes for ProstateNav rework

(See Image:ProstateNavRework.png in the gallery above for an overview of the ProstateNav / TRProstateBiopsy module rework.)
  • Configuration: configure the module (mostly the wizard steps) so that it can be used for different procedures/devices
  • Wizard steps:
    • Start up: select a configuration XML file and check devices and connections. The XML file contains (see the configuration sketch after this list):
      • Robot model
      • OpenIGTLink address, DICOM directory/server
      • Needles
      • Screen configuration
    • Calibration
    • Targeting: driving the needle to reach targets
    • Manual: manual robot/scanner control
    • Verification
  • Classes
    • ProstateBiopsyRobotNode (MRMLNode): holds all robot data (current position, current needle, available needles, status, visualization options?, calibration data); sends commands to the robot
    • RobotDisplayWidget: observes ProstateBiopsyRobotNode and displays the robot in the viewer (a simplified sketch of this node/widget relationship follows this list)
    • ProstateBiopsyNode (MRMLNode): contains all configuration data, OpenIGTLink and DICOM links, screen configuration, and a link to the target list (fiducial list), plus additional properties for each target (which needle, already completed, etc.; one common superclass to hold the data for one target)
    • SetupStep
    • CalibrationStep: robot specific, there should be a common superclass
    • TargetingStep
    • VerificationStep
    • ManualStep: robot specific
    • RobotWidget: specific for each robot; show/hide arm, needle, coverage, calibration object
  • User interface
    • Secondary monitor support
  • Communication
    • OpenIGTLink for direct scanner connection
    • OpenIGTLink for DICOM communication with the Scanner
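
To make the contents of the start-up configuration XML concrete, the sketch below holds the same information in a plain C++ structure. All field names are assumptions for illustration; the actual XML schema would be defined as part of the module merge.

<pre>
#include <string>
#include <vector>

// One needle type the procedure can use.
struct NeedleDescription
{
    std::string name;     // e.g. "biopsy"
    double lengthMm;
    double diameterMm;
    double tipOffsetMm;   // offset between targeted point and actual tip
};

// Everything the start-up wizard step reads from the configuration XML file.
struct ProcedureConfiguration
{
    std::string robotModel;             // which robot the module should drive
    std::string openIGTLinkAddress;     // host:port of the robot/scanner link
    std::string dicomDirectoryOrServer; // where images arrive from
    std::vector<NeedleDescription> needles;
    int secondaryMonitorIndex;          // screen configuration
};
</pre>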
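
The relationship between ProstateBiopsyRobotNode and RobotDisplayWidget in the class list above follows the usual MRML observer pattern: the node stores the robot state, the widget observes it and redraws whenever the state changes. The simplified stand-in below illustrates the idea without the actual vtkMRML classes; apart from the two class names taken from the list, all names and calls are assumptions.

<pre>
#include <cstdio>
#include <vector>

class Observer
{
public:
    virtual ~Observer() {}
    virtual void OnNodeModified() = 0;
};

class ProstateBiopsyRobotNode          // stand-in for the MRML node
{
public:
    void SetNeedleTipPosition(double r, double a, double s)
    {
        tip_[0] = r; tip_[1] = a; tip_[2] = s;
        NotifyObservers();             // a real MRML node fires a ModifiedEvent
    }
    const double* GetNeedleTipPosition() const { return tip_; }
    void AddObserver(Observer* o) { observers_.push_back(o); }

private:
    void NotifyObservers()
    {
        for (Observer* o : observers_)
            o->OnNodeModified();
    }
    double tip_[3] = {0.0, 0.0, 0.0};
    std::vector<Observer*> observers_;
};

class RobotDisplayWidget : public Observer   // refreshes the 3D viewer
{
public:
    explicit RobotDisplayWidget(ProstateBiopsyRobotNode* node) : node_(node)
    {
        node_->AddObserver(this);
    }
    void OnNodeModified()
    {
        const double* p = node_->GetNeedleTipPosition();
        std::printf("Redraw robot, needle tip at RAS (%.1f, %.1f, %.1f)\n",
                    p[0], p[1], p[2]);
    }
private:
    ProstateBiopsyRobotNode* node_;
};

int main()
{
    ProstateBiopsyRobotNode robotNode;
    RobotDisplayWidget display(&robotNode);
    robotNode.SetNeedleTipPosition(12.5, -4.0, 30.2);  // e.g. after a robot status update
    return 0;
}
</pre>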

References

  • L. Grady, "Random Walks for Image Segmentation", IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006.
  • S. Vikal, S. Haker, C. Tempany, G. Fichtinger, "Prostate contouring in MRI guided biopsy", SPIE Medical Imaging 2009: Image Processing, Proc. SPIE, Vol. 7259, 72594A, 2009.
  • S. Vikal, S. Haker, C. Tempany, G. Fichtinger, "Prostate contouring in MRI guided biopsy", Workshop on Prostate Image Analysis and Computer-Assisted Intervention, held in conjunction with the 11th International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI), September 2008.
  • Singh AK, Guion P, Sears Crouse N, Ullman K, Smith S, Albert PS, Fichtinger G, Choyke PL, Xu S, Kruecker J, Wood BJ, Krieger A, Ning H, "Simultaneous Integrated Boost of Biopsy Proven, MRI Defined Dominant Intra-prostatic Lesions to 95 Gray with IMRT: Early Results of a Phase I NCI Study", Radiat Oncol. 2007 Sep 18;2(1).
  • Singh AK, Krieger A, Lattouf JB, Guion P, Grubb III RL, Albert PS, Metzger G, Ullman K, Fichtinger G, Ocak I, Choyke PL, Ménard C, Coleman J, "Patient Selection Appears To Determine Prostate Cancer Yield Of Dynamic Contrast Enhanced MRI Guided Transrectal Biopsies In A Closed 3 Tesla Scanner", British Journal of Urology, 2007 Oct 8.