Projects:ARRA:SlicerQA

From NAMIC Wiki
Revision as of 02:48, 25 October 2009 by Swallace (talk | contribs)

Back to Slicer ARRA home page

Motivation

A core design feature of the Slicer 3 medical image data analysis and visualization platform, cross-platform compatibility, enables the widest possible user community to benefit from this valuable research tool. To support this key functionality, all software code in the core and modules (built-in, loadable, scripted, and command line) must be fully tested on each of the supported platforms (Windows, Solaris, 64- and 32-bit Linux, and Intel and PowerPC Mac OS X). All documentation and training materials (tutorials, demonstrations, practice data sets, etc.) must also be developed and tested on each supported platform.

With the recent release of Slicer 3 (Stable Release version 3.4), the number of available modules that need testing and training materials has substantially increased. This is compounded by the fact that an increasing number of users are contributing new modules to the consortium. Unlike NA-MIC engineers, most members of the user community develop their algorithms and/or workflows on a single software platform, and thus never encounter the bugs or other cross-platform incompatibilities that must be resolved.

Funding of this supplement will enable the NA-MIC project team to more effectively and rapidly disseminate Slicer modules and capabilities to the user community and to minimize sources of errors in the use of the software.

Research Plan

The new Q/A engineer will work with the team of Core 2 Engineers overseeing Slicer 3 development to devise an efficient protocol for testing each type of module (built-in, loadable, scripted, and command line) on each supported platform. This testing forms a baseline so we can confirm that new versions of the code can be used to accomplish the same tasks. This workflow will be implemented for all existing Slicer 3.4 modules and applied to new modules as they become part of the standard distribution.

The testing protocol will leverage the engineering methodologies adopted by the NA-MIC effort, which emphasize automated testing to the extent technically possible, augmented by human spot checking of functionality that cannot be efficiently automated. Specifically, the Slicer 3 effort relies heavily on core NA-MIC Kit components such as VTK and ITK that include extensive automated testing frameworks to ensure that fundamental operations are successfully completed on all supported platforms. These automatically testable operations include: compiling the code, reading and writing files, performing a defined set of image processing operations, etc. For each of these operations the testing system can confirm that a correct result is obtained.

For some aspects of the software, no fully automated testing solutions exist; for example, the addition of a new button on a dialog box may make one or more other buttons impossible to access. For this type of functionality, we must rely on human spot checking of core functions as the code is developed. Our approach in these situations is to capture the essential functionality through the tutorials that describe the module. By assuring that the tutorial can be accomplished successfully on all platforms we develop confidence in proper functionality.
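To illustrate the kind of fully automated check described above, the following is a minimal sketch in Python of a file read/write round-trip test. It is a hypothetical example for illustration only, not part of the actual Slicer, VTK, or ITK test suites; the file name and payload are invented stand-ins for real image data.

```python
import os
import tempfile
import unittest


class FileRoundTripTest(unittest.TestCase):
    """Illustrative automated check: data written to disk must read back
    byte-for-byte identical on every supported platform.
    (Hypothetical example; not part of the Slicer test suite.)"""

    def test_write_then_read(self):
        payload = bytes(range(256))  # stand-in for real image data
        with tempfile.TemporaryDirectory() as tmp:
            path = os.path.join(tmp, "volume.raw")  # hypothetical file name
            # Write the data out, then read it back in.
            with open(path, "wb") as f:
                f.write(payload)
            with open(path, "rb") as f:
                result = f.read()
            # The test system confirms a correct result was obtained.
            self.assertEqual(result, payload)


if __name__ == "__main__":
    unittest.main()
```

Tests of this shape can be registered with a nightly build-and-test system so that every supported platform runs them automatically and reports failures, reserving human effort for the interface behaviors that cannot be checked this way.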

The Q/A engineer will then work with the Training Core team (R. Gollub and S. Pujol) to develop new tutorials or extend existing ones to all supported software platforms for each module as soon as it has reached a level of stability and maturity such that the interface and input/output values are not being rapidly revised.

Key Personnel

Progress

(most recent on top)


Week ending 10/16/09

  • Continued work on SfN 2009 DTI Workshop tutorials. Confirmed that the DTI and 3D Visualization datasets worked in the Slicer 3.4 stable version across all platforms. Final revisions to tutorial slides completed.
  • Continued work on Slicer 3.4+ Stochastic Tractography Module (Slicer Daemon).

Week ending 10/23/09

  • EM Segmentation testing, including review and revision of tutorials, bug tracking, and introduction to vervet data (work done with Andriy F.).
  • Follow-up from the SfN 2009 DTI Workshop.
  • 2009 RSNA Conference preparation.