ITK Registration Optimization/Testing And Backward Forward Compatibility
Backward Forward Compatibility
This page describes the testing framework used for verifying the backward/forward compatibility of the optimized classes developed in this project with the standard classes currently in ITK.
Compatibility
- Backward Compatibility
- A modified class is backward compatible with a given version "V" when applications that used version "V" of the class can adopt the new, modified version of the class without having to change their source code in order to compile it, and obtain identical results when using the class at run-time.
- Forward Compatibility
- A modified class is forward compatible with a given version "V" when interactions based on the old API of the class are safely translated to the new API of the class.
Testing
The ONLY way to tell for sure if a class is backward/forward compatible is to have a test for it.
In order to verify the backward/forward compatibility of the optimized classes, a testing framework was put in place. This framework generates tests based on combinations of the following components:
- Transforms
  - Translation
  - Rigid (2D/3D)
  - Affine
  - BSplineDeformable
- Metrics
  - MeanSquares
  - MutualInformation
  - MattesMutualInformation
- Interpolators
  - NearestNeighbor
  - Linear
  - BSpline
These components were mixed in their optimized and non-optimized (standard) versions in order to create regression tests comparing the results of the non-optimized versions with the results of the optimized versions.
Since this is a combinatorial problem, the source code of the tests was generated using CMake macros that combine the components in different ways and prepare their corresponding initializations.
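For illustration, the body of one generated test might look roughly like the sketch below, which pairs the Translation transform with the MeanSquares metric, the Linear interpolator and the RegularStepGradientDescent optimizer. This is only a schematic reconstruction, not the actual generated source; the image types and the omitted driver logic are placeholders.

```cpp
// Hypothetical sketch of one generated combination:
// Translation transform + MeanSquares metric + Linear interpolator,
// driven by the RegularStepGradientDescent optimizer.
#include "itkImage.h"
#include "itkImageRegistrationMethod.h"
#include "itkTranslationTransform.h"
#include "itkMeanSquaresImageToImageMetric.h"
#include "itkLinearInterpolateImageFunction.h"
#include "itkRegularStepGradientDescentOptimizer.h"

int main( int, char *[] )
{
  const unsigned int Dimension = 2;
  typedef itk::Image< float, Dimension >                   ImageType;
  typedef itk::TranslationTransform< double, Dimension >   TransformType;
  typedef itk::MeanSquaresImageToImageMetric<
                          ImageType, ImageType >           MetricType;
  typedef itk::LinearInterpolateImageFunction<
                          ImageType, double >              InterpolatorType;
  typedef itk::RegularStepGradientDescentOptimizer         OptimizerType;
  typedef itk::ImageRegistrationMethod<
                          ImageType, ImageType >           RegistrationType;

  TransformType::Pointer    transform    = TransformType::New();
  MetricType::Pointer       metric       = MetricType::New();
  InterpolatorType::Pointer interpolator = InterpolatorType::New();
  OptimizerType::Pointer    optimizer    = OptimizerType::New();

  RegistrationType::Pointer registration = RegistrationType::New();
  registration->SetTransform( transform );
  registration->SetMetric( metric );
  registration->SetInterpolator( interpolator );
  registration->SetOptimizer( optimizer );
  registration->SetInitialTransformParameters( transform->GetParameters() );

  // The fixed/moving images, optimizer scales and step lengths would be
  // set here identically in the 000 and 111 tests, so that the only
  // difference between the two is whether the optimized or the standard
  // classes are compiled in.

  // registration->StartRegistration();
  return 0;
}
```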
The choice of optimizers was mostly driven by the Transform, according to the following table:
| Transform | Optimizer |
|---|---|
| Translation | RegularStepGradientDescent |
| Rigid2D | RegularStepGradientDescent |
| Rigid3D | VersorRigid3DTransformOptimizer |
| Affine | RegularStepGradientDescent |
| BSplineDeformable | LBFGSB |
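As a point of reference, pairing the BSplineDeformable transform with the LBFGSB optimizer typically involves configuring the optimizer's bound constraints and convergence settings. The snippet below is only an illustrative sketch; the helper function and the specific parameter values are placeholders, not those used by the generated tests.

```cpp
// Illustrative configuration of the LBFGSB optimizer for a high-dimensional
// transform such as BSplineDeformableTransform. Parameter values are
// placeholders for illustration only.
#include "itkLBFGSBOptimizer.h"

void ConfigureLBFGSB( unsigned int numberOfParameters )
{
  typedef itk::LBFGSBOptimizer OptimizerType;
  OptimizerType::Pointer optimizer = OptimizerType::New();

  // LBFGSB supports box constraints on the parameters; selecting
  // "0" for every parameter means unconstrained optimization.
  OptimizerType::BoundSelectionType boundSelect( numberOfParameters );
  OptimizerType::BoundValueType     lowerBound( numberOfParameters );
  OptimizerType::BoundValueType     upperBound( numberOfParameters );
  boundSelect.Fill( 0 );
  lowerBound.Fill( 0.0 );
  upperBound.Fill( 0.0 );
  optimizer->SetBoundSelection( boundSelect );
  optimizer->SetLowerBound( lowerBound );
  optimizer->SetUpperBound( upperBound );

  // Convergence criteria (illustrative values only).
  optimizer->SetCostFunctionConvergenceFactor( 1e+7 );
  optimizer->SetProjectedGradientTolerance( 1e-6 );
  optimizer->SetMaximumNumberOfIterations( 200 );
  optimizer->SetMaximumNumberOfEvaluations( 200 );
  optimizer->SetMaximumNumberOfCorrections( 5 );
}
```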
However, inconsistencies in the current API of some metrics and some optimizers led to the following variations:
| Transform | MeanSquares | MutualInformation | MattesMutualInformation |
|---|---|---|---|
| Translation | RegularStepGradientDescent | RegularStepGradientDescent | RegularStepGradientDescent |
| Rigid2D | RegularStepGradientDescent | RegularStepGradientDescent | RegularStepGradientDescent |
| Rigid3D | VersorRigid3DTransformOptimizer | VersorRigid3DTransformOptimizer | VersorRigid3DTransformOptimizer |
| Affine | RegularStepGradientDescent | RegularStepGradientDescent | RegularStepGradientDescent |
| BSplineDeformable | LBFGSB | RegularStepGradientDescent | LBFGSB |
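The MutualInformation column differs for BSplineDeformable because LBFGSB only minimizes, whereas the MutualInformation metric must be maximized (see the issues below); RegularStepGradientDescent exposes the optimization direction explicitly. A minimal sketch of how a test could select the direction (the helper function and flag are hypothetical):

```cpp
// Sketch of selecting the optimization direction when the
// RegularStepGradientDescent optimizer is paired with different metrics.
#include "itkRegularStepGradientDescentOptimizer.h"

void SelectDirection( bool metricIsMutualInformation )
{
  typedef itk::RegularStepGradientDescentOptimizer OptimizerType;
  OptimizerType::Pointer optimizer = OptimizerType::New();

  if ( metricIsMutualInformation )
    {
    optimizer->MaximizeOn();   // MutualInformation: larger is better
    }
  else
    {
    optimizer->MinimizeOn();   // MeanSquares: smaller is better
    }
}
```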
Issues to Probe Further
- Dashboard
- MattesMutualInformation + Translation + NearestNeighbor fails to register in both 000 and 111; this seems to indicate a problem in the computation of derivatives when a nearest-neighbor interpolator is used.
- Dissimilar convergence rates: in the combination MeanSquares + Affine + Linear, the metric values per iteration of the optimizer converge at about half the speed in the 111 test versus the 000 test. This seems to indicate a problem in the computation of derivatives. GNUplots of the convergence rates have been added to the nightly tests. These plots should probably be converted to BatchMake plots.
- Butterfly effect: a slight variation of a parameter changes the result of a registration. With MeanSquares + Affine + BSpline interpolator and a step length of 1.0, the optimizer gets stuck at a point in the parametric space. See the CVS log of Code/Testing/itkOptimized/CMakeLists.txt, revision 1.57.
- The MutualInformation metric requires a different number of samples in 000 than in 111. Is that an unfair comparison, knowing that computation time is a function of the number of samples?
- The LBFGSB optimizer doesn't have the MaximizeOn()/MinimizeOn() methods in its API; it minimizes by default. Therefore it can't be used with the MutualInformation metric, since this metric must be maximized.
- The LBFGS optimizer doesn't have the methods GetValue(), GetDerivative(), or GetCurrentIteration(); therefore it cannot be used from the standard observer of the optimizer.
- The MutualInformation metric "could" have a method for negating its output, so that it could be used by minimizing optimizers (see the sketch below).
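One possible way to realize that idea, sketched here purely as an illustration (this adaptor does not exist in ITK), would be a thin subclass that negates the metric value and its derivative so that a minimizing optimizer such as LBFGSB could drive it:

```cpp
// Hypothetical adaptor illustrating the "negated output" idea:
// wrap a maximizing metric so it can be driven by a minimizing optimizer.
#include "itkMutualInformationImageToImageMetric.h"

template< class TFixedImage, class TMovingImage >
class NegatedMutualInformationMetric
  : public itk::MutualInformationImageToImageMetric< TFixedImage, TMovingImage >
{
public:
  typedef NegatedMutualInformationMetric Self;
  typedef itk::MutualInformationImageToImageMetric<
            TFixedImage, TMovingImage >  Superclass;
  typedef itk::SmartPointer< Self >      Pointer;
  itkNewMacro( Self );

  typedef typename Superclass::MeasureType    MeasureType;
  typedef typename Superclass::ParametersType ParametersType;
  typedef typename Superclass::DerivativeType DerivativeType;

  // Return the negated metric value so that minimizing this adaptor
  // is equivalent to maximizing the original mutual information.
  MeasureType GetValue( const ParametersType & parameters ) const
    {
    return -this->Superclass::GetValue( parameters );
    }

  // Flip the gradient as well; a complete adaptor would also have to
  // negate GetValueAndDerivative(), omitted here for brevity.
  void GetDerivative( const ParametersType & parameters,
                      DerivativeType & derivative ) const
    {
    this->Superclass::GetDerivative( parameters, derivative );
    derivative *= -1.0;
    }
};
```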