ITK Registration Optimization/2007-04-06-tcon
Agenda
Tests
- Two types of tests
  - Baseline: LinearInterp is useful for profiling
    - Profile reports in the Reports subdir
  - Optimization: OptMattesMI is a self-contained test
    - Submit to the dashboard (see the sketch below):
      - test name / method
      - non-optimized speed
      - optimized speed
      - optimized error (difference from the non-optimized results)
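A minimal sketch of a self-contained timing test of this kind, assuming a hypothetical EvaluateMetric() stand-in for the real non-optimized and optimized MattesMI evaluations; it prints the four quantities a dashboard submission would need (test name, non-optimized speed, optimized speed, optimized error).

<pre>
// Minimal sketch of a self-contained optimization test. EvaluateMetric() is a
// hypothetical stand-in; the real test would run the non-optimized and the
// optimized metric and compare their timings and results.
#include <chrono>
#include <cmath>
#include <cstdio>

static double EvaluateMetric()
{
  double v = 0.0;
  for (int i = 0; i < 2000000; ++i) { v += std::sin(i * 1e-6); }
  return v;
}

int main()
{
  const auto t0 = std::chrono::steady_clock::now();
  const double baselineValue = EvaluateMetric();   // non-optimized path
  const auto t1 = std::chrono::steady_clock::now();
  const double optimizedValue = EvaluateMetric();  // optimized path
  const auto t2 = std::chrono::steady_clock::now();

  const double baselineSec  = std::chrono::duration<double>(t1 - t0).count();
  const double optimizedSec = std::chrono::duration<double>(t2 - t1).count();

  // The four values listed above, in a form a dashboard script could parse.
  std::printf("Test: OptMattesMI\n");
  std::printf("NonOptimizedSpeed: %g s\n", baselineSec);
  std::printf("OptimizedSpeed: %g s\n", optimizedSec);
  std::printf("OptimizedError: %g\n", std::fabs(optimizedValue - baselineValue));
  return 0;
}
</pre>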
Timing
- Priority (raise the process priority so timing runs are not preempted)
- Thread affinity (pin the benchmark to a fixed CPU for repeatable timings; see the sketch below)
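A Linux-specific sketch, assuming the benchmarks run on Linux: pin the process to one CPU with sched_setaffinity and raise its priority with setpriority before the timing loop. Other platforms need different calls, and raising priority usually requires sufficient privileges.

<pre>
// Linux-specific sketch: pin the benchmark process to CPU 0 and raise its
// scheduling priority so repeated timing runs are less noisy.
// Build with g++ on Linux (which defines _GNU_SOURCE, needed for CPU_SET).
#include <sched.h>
#include <sys/resource.h>
#include <cstdio>

int main()
{
  // Restrict this process (pid 0 = self) to CPU 0.
  cpu_set_t mask;
  CPU_ZERO(&mask);
  CPU_SET(0, &mask);
  if (sched_setaffinity(0, sizeof(mask), &mask) != 0)
  {
    std::perror("sched_setaffinity");
  }

  // Raise priority (negative nice value); typically needs extra privileges.
  if (setpriority(PRIO_PROCESS, 0, -10) != 0)
  {
    std::perror("setpriority");
  }

  std::printf("pinned to CPU 0; run the timing loop here\n");
  return 0;
}
</pre>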
Performance Dashboard
- Ranking computers?
  - CPU, memory, etc. reported in the dashboard
  - MFlops as measured by Whetstone? (see the sketch below)
- Public submission of performance results
  - Logins and passwords configured in CMake
  - Encryption in CMake?
- Organization of experiments/dashboards
  - When should a new experiment be created?
- Appropriate summary statistics
  - Per machine: batch vs. speed/error
  - Per test: MFlops vs. speed/error
  - All machines: batch vs. % change in performance
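If an MFlops figure is wanted for ranking machines, the sketch below shows the idea without being the Whetstone benchmark itself: time a fixed number of dependent floating-point operations and divide by the elapsed time. The iteration count and operation mix are arbitrary placeholders.

<pre>
// Rough MFlops-style figure of merit (not the Whetstone benchmark itself):
// time a fixed number of dependent floating-point operations and divide.
#include <chrono>
#include <cstdio>

int main()
{
  const long n = 50000000L;        // iterations
  double a = 1.0;
  const double b = 1.000000001;

  const auto t0 = std::chrono::steady_clock::now();
  for (long i = 0; i < n; ++i)
  {
    a = a * b + 1e-9;              // 2 floating-point operations per iteration
  }
  const auto t1 = std::chrono::steady_clock::now();

  const double seconds = std::chrono::duration<double>(t1 - t0).count();
  const double mflops  = (2.0 * n) / (seconds * 1.0e6);

  // Printing 'a' keeps the loop from being optimized away.
  std::printf("approx %.1f MFlops (a = %g)\n", mflops, a);
  return 0;
}
</pre>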
Review of OptMattesMI
- Lessons learned (illustrated in the sketch below)
  - Mutex bad: locking a shared accumulator serializes the threads
  - Memory good: spending memory on per-thread copies avoids the locking
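A generic illustration of that trade-off, not the actual ITK code: instead of every thread locking a mutex to update one shared sum, each thread accumulates into its own slot and the partial sums are combined at the end. std::thread is used here only for brevity; the ITK metrics use their own threading infrastructure. Build with -pthread.

<pre>
// Generic illustration of "mutex bad, memory good": one accumulator per
// thread ("spend memory") instead of a mutex-protected shared sum.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

int main()
{
  const unsigned numThreads = std::max(1u, std::thread::hardware_concurrency());
  const std::size_t samplesPerThread = 1000000;

  std::vector<double> partial(numThreads, 0.0);  // one accumulator per thread
  std::vector<std::thread> pool;

  for (unsigned t = 0; t < numThreads; ++t)
  {
    pool.emplace_back([t, samplesPerThread, &partial]() {
      double local = 0.0;
      for (std::size_t i = 0; i < samplesPerThread; ++i)
      {
        local += 1.0 / (1.0 + static_cast<double>(i + t));  // stand-in for a per-sample term
      }
      partial[t] = local;  // distinct slots, so no mutex is needed
    });
  }
  for (auto & th : pool) { th.join(); }

  double total = 0.0;
  for (double p : partial) { total += p; }
  std::printf("metric sum = %g (threads = %u)\n", total, numThreads);
  return 0;
}
</pre>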
CTest suite
- The ideal set of tests is machine specific
  - e.g., the number of threads and the image size (see the sketch below)
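A sketch of how a test driver might enumerate machine-specific (thread count, image size) combinations; in practice this selection would more likely live in the CMake/CTest configuration, and the image sizes here are arbitrary placeholders.

<pre>
// Sketch: choose (thread count, image size) combinations per machine, since
// the ideal test set depends on the hardware it runs on.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

int main()
{
  const unsigned maxThreads = std::max(1u, std::thread::hardware_concurrency());

  // Sweep thread counts in powers of two up to the number of cores reported.
  std::vector<unsigned> threadCounts;
  for (unsigned t = 1; t <= maxThreads; t *= 2) { threadCounts.push_back(t); }

  // Hypothetical cubic image sizes to pair with each thread count.
  const unsigned imageSizes[] = { 64, 128, 256 };

  for (unsigned t : threadCounts)
  {
    for (unsigned s : imageSizes)
    {
      std::printf("test: threads=%u imageSize=%u^3\n", t, s);
    }
  }
  return 0;
}
</pre>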
Tasks
Julien
1. Work with Seb to get reports from Amber2
   - Result: Amber2 is allocated to another project, so this work will transition to machines at the SPL
2. Define the role of experiments and batches
   - Work with Seb to integrate with the CMake dashboard
   - New experiment = new CVS tag
   - New batch = nightly run (possibly only if CVS has changed)
3. Make CMake aware of the number of CPUs and CPU cores (see the sketch below)
4. Make CMake aware of the memory available
5. Implement BMDashboards
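Items 3 and 4 would ultimately be CMake features; the sketch below only shows the underlying POSIX/Linux queries such a feature might wrap (sysconf alone cannot distinguish physical CPUs from cores, so that distinction would need something like /proc/cpuinfo).

<pre>
// POSIX/Linux sketch of the queries behind "number of CPUs" and "memory
// available". _SC_PHYS_PAGES is not strictly POSIX but is available on Linux.
#include <unistd.h>
#include <cstdio>

int main()
{
  const long cpus     = sysconf(_SC_NPROCESSORS_ONLN);  // online processors
  const long pages    = sysconf(_SC_PHYS_PAGES);        // physical memory pages
  const long pageSize = sysconf(_SC_PAGESIZE);          // bytes per page

  if (cpus > 0)
  {
    std::printf("processors online: %ld\n", cpus);
  }
  if (pages > 0 && pageSize > 0)
  {
    // Use floating point to avoid overflowing long on 32-bit systems.
    const double megabytes =
      static_cast<double>(pages) * static_cast<double>(pageSize) / (1024.0 * 1024.0);
    std::printf("physical memory: %.0f MB\n", megabytes);
  }
  return 0;
}
</pre>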
Brad
1. Continue to develop registration pipelines
   - Commit them into CVS
   - Implement them as CTest tests
2. Optimize the mean-squared-difference image-to-image metric (see the sketch below)
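For reference, a minimal plain-array version of the mean-squared-difference computation that is the optimization target; the real ITK metric iterates over the fixed image, maps each sample through a transform, and interpolates the moving image, all of which is omitted here.

<pre>
// Minimal plain-array mean-squared-difference between two equally sized
// pixel buffers; the transform and interpolation of the real metric are omitted.
#include <cstddef>
#include <cstdio>
#include <vector>

static double MeanSquaredDifference(const std::vector<float> & fixed,
                                    const std::vector<float> & moving)
{
  double sum = 0.0;
  const std::size_t n = fixed.size();
  for (std::size_t i = 0; i < n; ++i)
  {
    const double d = static_cast<double>(fixed[i]) - static_cast<double>(moving[i]);
    sum += d * d;
  }
  return (n > 0) ? sum / static_cast<double>(n) : 0.0;
}

int main()
{
  // Two small synthetic "images" flattened to 1-D buffers.
  const std::vector<float> fixed(256 * 256, 1.0f);
  const std::vector<float> moving(256 * 256, 1.25f);
  std::printf("MSD = %g\n", MeanSquaredDifference(fixed, moving));  // prints 0.0625
  return 0;
}
</pre>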
Seb
1. Set up the CMake dashboard
2. Add an MD5 hash function to CMake for BatchMake passwords (see the sketch below)
3. Work with Julien on the BatchMake dashboard designs
4. Investigate other opportunities for optimization
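The plan is to add the MD5 function to CMake itself; purely as an illustration of what would be stored, the sketch below computes a password's MD5 hex digest with OpenSSL's legacy MD5() routine. Note that MD5 is a hash rather than encryption, and it is weak by modern standards.

<pre>
// Illustration only, not the planned CMake change: compute the MD5 hex digest
// of a password so that only the digest, not the clear-text password, would
// need to be stored or submitted.
// Build: g++ md5_demo.cxx -lcrypto   (MD5() is deprecated in OpenSSL 3.x)
#include <openssl/md5.h>
#include <cstdio>
#include <cstring>

int main()
{
  const char * password = "example-password";  // hypothetical value
  unsigned char digest[MD5_DIGEST_LENGTH];

  MD5(reinterpret_cast<const unsigned char *>(password),
      std::strlen(password), digest);

  std::printf("md5: ");
  for (int i = 0; i < MD5_DIGEST_LENGTH; ++i)
  {
    std::printf("%02x", digest[i]);
  }
  std::printf("\n");
  return 0;
}
</pre>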
Stephen
1. Get Seb and Brad access to the SPL machines
2. Continue to optimize the MattesMI metric
3. Determine the BMDashboard table structure
4. Have programs switch between baseline, optimized, and combined (both) testing/reporting