NAC Grid Enabled ITK
Latest revision as of 21:45, 24 September 2007
Back to ITK_Registration_Optimization
Workplan
- Extend ITK to support batch processing in distributed environments. We will provide a framework for wrapping ITK’s filters so that they can run as services in a distributed processing environment. Specific examples will also be developed to demonstrate the use of that framework to run ITK segmentation and registration methods on a Condor grid (www.cs.wisc.edu/condor). The Condor grid computations will be directed by the open-source, freely available BatchMake system (www.batchmake.org).
- Provide a development environment that integrates method compilation, testing, monitoring, and comparison. The CMake and CTest tools of ITK ensure cross-platform compilation and consistent test results during method development. Consistency testing for an ITK method, however, is typically accomplished by determining success or failure on a single dataset. Such consistency testing does not promote incremental method improvement or quantitative method comparison. We propose to leverage the framework proposed in Aim 1 and to integrate BatchMake with CMake so that routine testing on hundreds of datasets becomes an inherent part of the ITK method development process. In that manner, developers will be able to track the subtle as well as the acute effects of code changes. Furthermore, by processing massive collections of data, statistically significant comparisons between methods can be made.
- Demonstrate the use and utility of the technologies of Aims 1 and 2 by developing user-friendly grid modules for ITK method parameter tuning. To effectively use a method, it is essential to understand its parameter space in the context of the application domain. Such understanding is gained via factor analysis. Experience with ITK and other toolkits has shown that factor analysis is particularly important when applying deformable registration strategies, including the parallelized registration methods that were developed during the previous (8/1/2006 to 7/31/2007) funding period on this grant. We propose to develop stand-alone modules as well as 3D Slicer-based modules that provide intuitive interfaces for conducting factor analysis experiments using batch processing. We will apply those modules to the registration methods developed during the previous funding period in the domain of inter-subject head MRI registration for atlas-based segmentation.
- Document. We will document the framework, code, classes, modules, and applications to meet the rigorous standards of ITK. Material will be added to the Insight Software Guide textbook. Articles will be submitted to the Insight Journal.
- Final Report. We will submit a final report describing the technologies, their uses, and their benefits.
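The Condor-backed batch runs in Aim 1 and the factor-analysis sweeps in Aim 3 would both be driven by BatchMake scripts. The sketch below is BatchMake-style pseudocode only; the wrapped application name, its option names, and the data paths are hypothetical placeholders:

```
# Hypothetical sweep of a wrapped ITK registration over a directory of cases
SetApp(reg @RegisterImages)               # wrapped ITK method (placeholder name)
ListFileInDir(cases ${DataDir} *.mha)     # collect input volumes
ForEach(case ${cases})
  SetAppOption(reg.fixed ${Atlas})
  SetAppOption(reg.moving ${DataDir}/${case})
  SetAppOption(reg.iterations 100)        # parameter under study
  Run(output ${reg})                      # dispatched to the grid by BatchMake
EndForEach(case)
```

The same loop structure generalizes to parameter tuning: iterating over parameter values instead of (or in addition to) input files yields the factor-analysis experiments described in Aim 3.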
Tasks
In no particular order at this time
- Batch processing from within Slicer
- BatchMake updated to use the same XML command description as Slicer
- BatchMake for specific Slicer modules (Otsu, ...)
- BatchMake atlas building with Martin Styner, Brad D., Paulina, ...
- BatchMake documentation
- "hello world" example
- "hello world" example in Slicer
- Tutorials
- General batch processing modules in Slicer
- Enable parameter analysis and batch processing of current and future Slicer execution modules
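The "same XML command description as Slicer" refers to Slicer's execution-model descriptors, which declare a module's command-line parameters so that tools like BatchMake can drive the module generically. A minimal descriptor might look roughly like this (element names follow the Slicer3 execution model; the module shown is illustrative):

```
<executable>
  <category>Filtering</category>
  <title>Otsu Threshold</title>
  <description>Binary threshold via Otsu's method.</description>
  <parameters>
    <label>IO</label>
    <image>
      <name>inputVolume</name>
      <channel>input</channel>
      <index>0</index>
      <label>Input Volume</label>
    </image>
    <image>
      <name>outputVolume</name>
      <channel>output</channel>
      <index>1</index>
      <label>Output Volume</label>
    </image>
  </parameters>
</executable>
```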
- MIDAS data collection use from within Slicer
- Search
- Initiate from within Slicer
- Provide PACS-like interface to collections returned by search
- Access
- SSH mount from within Slicer?
- Download from within Slicer
- Process
- Automatically provide BatchMake script/params for accessing downloaded/mounted data (search results)
- Data provenance (see below)
- Search
- Slicer Validation
- STAPLE
- Parallelize EM framework of ITK
- Automated validation of Slicer modules
- Segmentation and registration accuracy metrics
- STAPLE
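STAPLE (Simultaneous Truth and Performance Level Estimation) fuses several candidate segmentations into a probabilistic reference via EM. As a rough illustration of the underlying algorithm — not the ITK implementation — here is a minimal binary STAPLE in Python/NumPy, with hypothetical initial performance guesses:

```python
import numpy as np

def staple(D, prior=0.5, iters=50, tol=1e-6):
    """Simplified binary STAPLE via EM.
    D: (N voxels, R raters) binary array of candidate segmentations.
    Returns per-voxel posterior W, per-rater sensitivity p, specificity q."""
    D = np.asarray(D, dtype=float)
    N, R = D.shape
    p = np.full(R, 0.9)   # initial sensitivity guesses (assumed, not from ITK)
    q = np.full(R, 0.9)   # initial specificity guesses
    W = np.full(N, prior)
    for _ in range(iters):
        # E-step: posterior probability that each voxel is foreground
        a = prior * np.prod(p**D * (1 - p)**(1 - D), axis=1)
        b = (1 - prior) * np.prod((1 - q)**D * q**(1 - D), axis=1)
        W_new = a / (a + b)
        # M-step: re-estimate each rater's performance parameters
        p = (W_new @ D) / W_new.sum()
        q = ((1 - W_new) @ (1 - D)) / (1 - W_new).sum()
        if np.max(np.abs(W_new - W)) < tol:
            W = W_new
            break
        W = W_new
    return W, p, q
```

With three raters where two agree, the fused posterior follows the majority while down-weighting the dissenting rater; this per-voxel independence across the E-step is also what makes the EM framework amenable to the parallelization task listed above.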
- Registration
- Integrate threaded registration modules from 2006-2007 NAC with ITK
- Group-wise registration
- Fluid registration
- Registration for EMSegmenter
- Data handling
- IO streaming (input and output)
- Data provenance
- Characterize search results for future processing
- Record search results for experiment replication
- Record processing steps for journaling, publication, and experiment replication
- Characterize data for upload to repository and future access/processing
- Chosen provenance paradigm: MBIRN/XCede: http://www.nbirn.net/tools/bxh_tools/index.shtm
Administration
- Supplement to the 2007-2008 BWH NAC grant
- Subcontract to Kitware
- Contact: Stephen Aylward