2007 Materials for NCBC Program Review
Contents
- 1 Materials requested for NCBC Program Review
- 2 Q1: A copy of two parts of your most recent progress report: the summary section and the highlights section.
- 3 Q2
- 3.1 Q2.1 To what extent does the vision and direction of the NCBC initiative promote biomedical computing?
- 3.2 Q2.2 In what ways has the NCBC initiative advanced biomedical computing?
- 3.3 Q2.3 Are the NCBCs interfacing appropriately? (recommended by RICC)
- 3.4 Q2.4. What new collaborations have been formed through the NCBC initiative?
- 3.5 Q2.5. What new training opportunities have the centers provided?
- 3.6 Q2.6. What changes could make the program more effective in the future?
- 3.7 Q2.7. What lessons have been learned from the NCBC initiative that can guide future NIH efforts in biomedical computing?
- 4 Q3: A list of publications and/or software tools produced by the Center. If this information is provided in your progress report or is available on your website, a link will be sufficient. We are especially interested in your assessment of the maturity of your software tools and the impact they are having on the scientific community.
- 5 Logistics
Materials requested for NCBC Program Review
These are due to Gwen Jacobs by Friday, June 08, 2007.
Q1: A copy of two parts of your most recent progress report: the summary section and the highlights section.
(Tina/some edits from Marty --Tina--you had schizophrenia listed as a NEW DBP, I put in VCFS--if this was in the annual report, it needs to be fixed. Also, under highlights--- I see what Core 1 was able to do because of data provided by core 3, but there are no highlights of what core 3 did with what Core 1 gave them --- don't you think this should be fixed? That is, there are NO HIGHLIGHTS listed for CORE 3 --)
Summary: http://www.na-mic.org/Wiki/index.php/2007_Annual_Scientific_Report#1._Introduction
The National Alliance for Medical Imaging Computing (NA-MIC) is now in its third year. This Center is comprised of a multi-institutional, interdisciplinary team of computer scientists, software engineers, and medical investigators who have come together to develop and apply computational tools for the analysis and visualization of medical imaging data. A further purpose of the Center is to provide infrastructure and environmental support for the development of computational algorithms and open source technologies, and to oversee the training and dissemination of these tools to the medical research community. The driving biological projects (DBPs) for the first three years of the Center came from schizophrenia, although the methods and tools developed are clearly applicable to many other diseases.
In the first year of this endeavor, our main focus was to develop alliances among the many cores to increase awareness of the kinds of tools needed for specific imaging applications. Our first annual report and all-hands meeting reflected this emphasis on cores, which was necessary to bring together members of an interdisciplinary team of scientists with such diverse expertise and interests. In the second year of the center our emphasis shifted from the integration of cores to the identification of themes that cut across cores and are driven by the requirements of the DBPs. We saw this shift as a natural evolution, given that the development and application of computational tools became more closely aligned with specific clinical applications. This change in emphasis was reflected in the Center's four main themes, which included Diffusion Tensor Analysis, Structural Analysis, Functional MRI Analysis, and the integration of newly developed tools into the NA-MIC Tool Kit. In the third year of the center, collaborative efforts have continued along each of these themes among computer scientists, clinical core counterparts, and engineering partners. We are thus quite pleased with the focus on themes, and we also note that our progress has not only continued but that more projects have come to fruition with respect to publications and presentations from NA-MIC investigators, which are listed on our publications page.
Below, in the next section (Section 2) we summarize our progress over the last year using the same four central themes to organize the progress report. These four themes include: Diffusion Image analysis (Section 2.1), Structural analysis (Section 2.2), Functional MRI analysis (Section 2.3), and the NA-MIC toolkit (Section 2.4). Section 3 highlights four important accomplishments of the third year: advanced algorithm development in Shape and DTI analysis, the newly architected open source application platform, Slicer 3, and our outreach and technology transfer efforts. Section 4 summarizes the impact and value of our work to the biocomputing community at three different levels: within the center, within the NIH-funded research community, and externally to a national and international community. The final section of this report, Section 5, provides a timeline of Center activities.
In addition, the end of the first three years of the center marks a transition from the first set of DBPs that were focused entirely on Schizophrenia to a new set that span a wider range of biological problems. The new DBPs include a focus on Systemic Lupus Erythematosus (MIND Institute, University of New Mexico), Velocardiofacial Syndrome (Harvard), and Autism (University of North Carolina, Chapel Hill), along with a direction that is new but synergistic for NA-MIC: Prostate Interventions (Johns Hopkins University). Funding for the second round of DBPs starts in the next cycle, but the PIs were able to attend the recent All-hands meeting and start developing plans for their future research in NA-MIC.
Finally, we note that Core 3.1 (Shenton and Saykin) is in the process of applying for a Collaborative R01 to expand current research with NA-MIC, which ends on July 31, 2007. Both Drs. Shenton and Saykin have spent three years driving tool development for shape measures, DTI tools, and path analysis measures for fMRI as part of the driving biological project in NA-MIC. They now plan to expand this research in a Collaborative R01, working closely with Drs. Westin, Miller, Pieper, and Wells to design, assess, implement, and apply tools that will enable the integration of MRI, DTI, and fMRI in individual subjects. They also plan to develop an atlas of functional networks and circuits based on a DTI atlas (i.e., structural connectivity), which will be integrated with a network of functional connectivity identified from fMRI probes of attention, memory, emotion, and semantic processing. We mention this here because it will be, to our knowledge, the first DBP to apply for further funding to continue critical work begun with NA-MIC.
Highlights: http://www.na-mic.org/Wiki/index.php/2007_Annual_Scientific_Report#3._Highlights
The third year of the NA-MIC project saw continued development and dissemination of medical image analysis software. The current progress is clearly characterized by a significant increase in the application of Core-1 and Core-2 tools to image data provided by the Core-3 DBP groups. The reasons for this progress are two-fold: first, several new methods are now out of the prototype stage and ready to be applied to large imaging datasets; second, new high-resolution imaging data from high-field scanners are more appropriate for the new tools than historical data, which often had very coarse slice resolution.
With the release of the first version of Slicer3, the transfer of this technology is accelerating. Because of NA-MIC's strong ties with several large open source communities, such as ITK, VTK, and CMake, NA-MIC continues to make a significant impact on the nation's broader biocomputing infrastructure. The following are just a few of the many highlights from the third year of the NAMIC effort.
- Advanced Algorithms
Core 1 continues to lead the biomedical community in DTI and shape analysis.
- NA-MIC published an open source framework for shape analysis, including providing access to the open source software repository. Shape analysis has become of increasing relevance to the neuroimaging community due to its potential to precisely locate morphological changes between healthy and pathological structures. The software has been downloaded many times since the first online publication in October 2006, and is now used by several prestigious image analysis groups.
- The spherical-wavelet-based shape analysis package has been contributed to ITK, and in the next few months the multiscale segmentation work will be incorporated as well.
- The NA-MIC community has implemented a very fast method for the optimal transport approach to elastic image registration which is currently being added to ITK.
- The NA-MIC toolkit includes a comprehensive set of modules for analysis of diffusion weighted images (DWI), including improved calculation of tensors, interpolation, nonlinear deformation and statistics on tensor fields, novel methods for tractography and for optimal path finding, and clustering of sets of streamlines.
- A quantitative tractography package for user-guided geometric parametrization and statistical analysis of fiber bundles (FiberViewer) has been contributed to the NAMIC toolbox. This tool is used in several ongoing clinical DTI studies.
- The conformal flattening algorithm has been implemented as an ITK filter and is in the NA-MIC Sandbox in preparation for formal acceptance into the NA-MIC Kit.
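The improved tensor calculation mentioned in the DWI bullet above can be illustrated with a minimal sketch. This is not the NA-MIC/ITK implementation; it is a generic log-linear least-squares fit of the Stejskal-Tanner signal model, with illustrative function names, shown here only to make the technique concrete.

```python
import numpy as np

def fit_tensor(signals, s0, gradients, b):
    """Log-linear least-squares fit of a single diffusion tensor.

    signals   : (N,) DWI intensities for N gradient directions
    s0        : baseline (b=0) intensity
    gradients : (N, 3) unit gradient directions
    b         : b-value (s/mm^2)
    """
    g = np.asarray(gradients, float)
    # Design matrix for the 6 unique tensor elements Dxx,Dyy,Dzz,Dxy,Dxz,Dyz
    A = np.column_stack([g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
                         2*g[:, 0]*g[:, 1], 2*g[:, 0]*g[:, 2], 2*g[:, 1]*g[:, 2]])
    # Stejskal-Tanner: S_i = S0 exp(-b g_i^T D g_i)  ->  linear in D
    y = -np.log(np.asarray(signals, float) / s0) / b
    d, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.array([[d[0], d[3], d[4]],
                     [d[3], d[1], d[5]],
                     [d[4], d[5], d[2]]])

def fractional_anisotropy(D):
    """Standard FA scalar computed from the tensor eigenvalues."""
    ev = np.linalg.eigvalsh(D)
    md = ev.mean()
    return np.sqrt(1.5) * np.sqrt(((ev - md)**2).sum()) / np.sqrt((ev**2).sum())
```

With at least six non-collinear gradient directions the fit recovers the tensor exactly for noise-free data; real pipelines add noise handling and positive-definiteness constraints.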
- Technology Deployment Platform: Slicer3
Core 2, in conjunction with Algorithms (Core 1) and the DBPs (Core 3), is creating new tools to accelerate the transition of technology to the biomedical imaging community.
- One of the year's major achievements was the release of the first viable version of the Slicer3 application, which evolved from concept to a full-featured application. The second beta version of Slicer3 was released in April 2007. The application provides a full range of functionality for loading, viewing, editing, and saving models, volumes, transforms, fiducials, and other common medical data types. Slicer3 also includes a powerful execution model that enables Core 1 developers (and others in the NA-MIC community) to easily deploy algorithms to Core 2 and other biocomputing clients.
- Slicer3's execution model supports plug-in modules. These modules can be run stand alone or integrated into the Slicer3 framework. When integrated, the GUI to the module can be automatically generated from an associated XML file describing input parameters to the module. A variety of modules were created, ranging from simple image processing algorithms, to complex, multi-step segmentation procedures.
- To stress test Slicer3's architecture and demonstrate its capabilities, the EM Segment module (http://wiki.na-mic.org/Wiki/index.php/Slicer3:EM) was created and added to Slicer's library of modules. EM Segment is an automatic segmentation algorithm for medical images and represents a collaborative effort between the NAMIC engineering, algorithms, and biological problem cores. The EM Segment module enables users to quickly configure the algorithm to a variety of imaging protocols as well as anatomical structures through a wizard-style, workflow interface. The workflow tools have been integrated into the NA-MIC Kit, and are now available to all other modules built on the Slicer3 framework.
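The execution-model idea above (a module interface generated automatically from an XML parameter description) can be sketched in a few lines of Python. The descriptor below and its element names are invented for illustration and do not reproduce the actual Slicer3 schema; the sketch generates a command-line interface, by analogy with the auto-generated GUI.

```python
import argparse
import xml.etree.ElementTree as ET

# Hypothetical, simplified module descriptor. The real Slicer3 schema
# is richer; names here are illustrative only.
DESCRIPTOR = """
<module name="SmoothImage">
  <parameter name="inputVolume" type="string" help="input image file"/>
  <parameter name="iterations" type="int" default="5"
             help="number of smoothing iterations"/>
  <parameter name="timeStep" type="float" default="0.0625"
             help="diffusion time step"/>
</module>
"""

TYPES = {"string": str, "int": int, "float": float}

def build_cli(xml_text):
    """Generate an argparse interface from the XML description,
    mirroring how a GUI could be auto-generated from the same file."""
    root = ET.fromstring(xml_text)
    parser = argparse.ArgumentParser(prog=root.get("name"))
    for p in root.findall("parameter"):
        kind = TYPES[p.get("type")]
        kwargs = {"type": kind, "help": p.get("help")}
        if p.get("default") is not None:
            kwargs["default"] = kind(p.get("default"))
        parser.add_argument("--" + p.get("name"), **kwargs)
    return parser
```

The point of the design is that one declarative file drives both the stand-alone command line and the integrated GUI, so module authors never write interface code by hand.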
- Outreach and Technology Transfer
Cores 4-5-6 continue to support, train and disseminate to the NA-MIC community, and the broader biomedical computing community.
- NA-MIC continues to practice the best of collaborative science through its bi-annual Project Week events. These events, which gather key representatives from Cores 1-7 and external collaborators, are organized to gather experts from a variety of domains to address current research problems. This year's first Project Week was held in January and hosted by the University of Utah. It saw several significant accomplishments including the first beta release of the next generation Slicer3 computing platform. The second Project Week is scheduled for June in Boston, MA.
- Twelve NA-MIC-supported papers were published in high-quality peer reviewed conference proceedings (four papers in MICCAI alone). Another paper on the NAMIC software process was published in IEEE Software. All three DTI papers presented at MICCAI last year were NAMIC associated.
- Several workshops were held at various institutions throughout the year, including the DTI workshop at UNC, the MICCAI Open Source Workshop, and the NA-MIC Training Workshop at the Harvard Center for Neurodegeneration and Repair. Training and dissemination will continue with a DTI workshop at the forthcoming Human Brain Mapping meeting and activities at MICCAI 2007, among others.
Q2
A brief statement - (one page per question, max) addressing each of the questions listed below. These are the questions that we have been asked to address in our report. Our goal in asking for this information is to be able to produce a report that reviews the program as a whole. Your view, from the vantage point of the center you direct, is critical to our work. In addition, your answers will provide us with more information that we can use in our discussion with program staff on June 11th. We know that some of this information can be found on your websites, so in those cases a link to the information would be most helpful.
Q2.1 To what extent does the vision and direction of the NCBC initiative promote biomedical computing?
(Eric/Polina)
(From Ross/edits Marty--THIS SECTION I THINK NEEDS MORE WORK AS I DON'T UNDERSTAND HOW IT ANSWERS THE QUESTION BEING ASKED) The fields of biological and medical imaging are exploding. Moreover, the combination of new acquisition and reconstruction techniques, new computing resources, and diverse biological and clinical applications has resulted in a massive proliferation of imaging data. These data have the potential to make an important impact on basic biological science, on the development of new drugs and medical technologies, and on direct clinical practice, in both diagnosis and treatment. Thus, biomedical image analysis is one of the most important applications of biomedical computing.
Realizing this potential will require new computational tools for image analysis. These tools will rely on a diverse set of technical knowledge, including physics, systems, computer science, mathematics, and statistics. The development of these tools will also incorporate in-depth knowledge of clinical and biological applications in diverse areas such as neuroscience, psychiatry, oncology, cardiology, and biochemistry.
These computational tools must be scalable in several ways. First, they must be computationally scalable. Tools for image analysis must be suitable for quantitative analyses on large sets of 3D data, and many current software packages for image processing are not appropriate for this. In order to serve computational needs across NIH, image analysis tools must also scale across application domains and address a variety of clinical and biomedical problems. In the spirit of the National Center, these tools must also scale across institutions and research groups; that is, they must be usable and sustainable outside the context of the Center itself.
I DON'T REALLY UNDERSTAND WHAT POINT IS BEING MADE IN THE PARAGRAPH BELOW??? (MARTY) The NAMIC NCBC addresses these issues through a variety of mechanisms. First, it is truly a national center, with researchers from seven (?) institutions across the US who represent expertise in the wide range of disciplines described above. The distributed nature of the project provides technical and clinical expertise, but it also enforces openness of the resource: the infrastructure must be accessible in order for the center to operate effectively. NAMIC also includes a set of industrial partners in the engineering core (Core 2), who provide a set of scalable tools for developing, maintaining, and distributing software. The software is universally manageable and maintainable, and it does not belong to any one research group --- it belongs to the community. The tools are also scalable across application domains. HUH??? While the initial DBPs focused on neuroscience, the new DBPs include oncology, and associated R01s include biomechanics/orthopedics. Users of the software are even more diverse and even include fields outside of medicine and biology. The strategy of NAMIC is to make sure that the effort scales over time as well. The licensing agreement, which is enforced on all of our development activities, does not restrict use; we thus anticipate (and promote) commercial use of the NAMIC software so that the impact of the project can be realized beyond the lifetime of the center.
Q2.2 In what ways has the NCBC initiative advanced biomedical computing?
(Tina)
Q2.3 Are the NCBCs interfacing appropriately? (recommended by RICC)
(Will Schroeder)
Q2.4. What new collaborations have been formed through the NCBC initiative?
(Jim Miller)
- discuss what is new: interactions inside cores and between cores.
Q2.5. What new training opportunities have the centers provided?
(Randy Gollub)
- The entire Slicer tutorial portfolio would not exist without NA-MIC. Explain what this portfolio consists of.
Q2.6. What changes could make the program more effective in the future?
(Steve Pieper)
- National Visibility
The overall objectives of the NCBC program call for building computational infrastructure for the nation's biomedical research efforts. This ambitious goal involves both computational and biomedical science, areas in which the existing centers have extensive experience, but also nationwide "marketing" efforts, where the scientific community is less adept. Several centers, including NA-MIC, have developed novel approaches to this problem through their training and dissemination cores, but as the output of the centers grows along with the number of potential users in the community and the potential impact, these critical resources will be increasingly strained. There are several approaches the overall program could take to address this issue, including supplemental funding to host workshops or conferences, providing small travel grants for researchers or students to visit the centers, or actively encouraging a wider range of NIH-funded researchers to adopt the tools generated by the NCBCs. In particular, the collaboration PAR has been very effective as a mechanism for encouraging busy scientists to consider adopting the NCBC tools, and it is an excellent example of how the program can extend the impact of the basic investment in scientific infrastructure.
- Local Autonomy
The program should avoid adding extra layers of uniformity to what are fundamentally unique centers. The NCBC program has successfully established a distributed network of centers drawing on the expertise of some of the nation's leading researchers, who were attracted to the program by the opportunity to develop and apply their know-how to this ambitious effort. This rich environment, predictably, yields a diversity of approaches and organizational structures as each of the centers works to implement its particular vision of how to fulfill the overall mission of the NCBC program. Preserving the vitality of the effort depends on retaining this autonomy as each center strives to meet the individual objectives suited to its community. The program needs well-defined goals that each center must meet, but the overall program should facilitate the individual solutions of each center's leadership.
Q2.7. What lessons have been learned from the NCBC initiative that can guide future NIH efforts in biomedical computing?
(Martha Shenton)
- "Greater Start Up Time and Steeper Learning Curve Than Anticipated"
Bringing together computer scientists, engineers, and biomedical researchers from very different backgrounds to work on a core set of biological problems is no easy feat and, in fact, takes longer than we would have anticipated. In retrospect this is understandable, as the initial time was spent defining the Cores and forming the Core identities that would be brought to bear on the challenges presented by the driving biological problems. As described in the first year progress report,
There are several lessons from the initial phases of this NCBC that might guide future NIH efforts in biomedical computing. First, in bringing together computer scientists, engineers, and application scientists, the learning curve is initially much steeper than many of us would have anticipated. The different skill sets of such a multidisciplinary enterprise require that the application scientists who are driving the biological problems learn to communicate with the computer scientists and engineers. Coming from different disciplines, this is not always easy, and it takes time to define the problems succinctly, concretely, and with sufficient clarity that the computer scientists and engineers understand what tasks are needed to solve the challenges posed by the application scientists, and how they might best develop algorithms to address those challenges. If the scientists have not worked with each other before and come from different scientific disciplines, a period is needed to define the problems and to discuss how they might be approached. This can take anywhere from a year to a year and a half.
Q3: A list of publications and/or software tools produced by the Center. If this information is provided in your progress report or is available on your website, a link will be sufficient. We are especially interested in your assessment of the maturity of your software tools and the impact they are having on the scientific community.
(Will Schroeder, Allen Tannenbaum--THE PUBLICATION LIST NEEDS UPDATING. The O'Donnell paper, for example, appeared in AJNR a while ago--things are missing from our group as well)
A3:
- NA-MIC Publications are available here: http://www.na-mic.org/Wiki/index.php/Publications.
- NA-MIC Software Tools are available here: http://www.na-mic.org/Wiki/index.php/NA-MIC-Kit
The Center has created and extended a number of software tools to handle some of the key problems in medical imaging analysis and to deliver these computational technologies via a suite of applications, toolkits, and infrastructure facilities. A summary description of these tools includes:
- Slicer3 (application) - an application platform for deploying imaging technologies, newly architected with an execution model that facilitates integration, workflow, and large-scale computing. While Slicer3 is the newest addition to the NAMIC Kit, it is built on pre-existing, mature toolkits, so the application is relatively mature and already in use. Because Slicer3 supports plug-in modules, active development is proceeding to create and package various modules for dissemination to the NAMIC community.
- ITK (toolkit) - a mature system for image analysis, registration and segmentation (initially created in 1999). ITK is in use worldwide for medical imaging research and development.
- VTK (toolkit) - a mature system for visualization, graphics, volume rendering, and interaction (initially created in 1993). VTK is used worldwide for research, teaching and commercial development.
- DART (computational infrastructure) - a key component of the NAMIC quality control process, DART is used to coordinate and report the software testing process. It was created in the first year of NAMIC and has been in constant use since, and is therefore a mature system.
- CMake/CTest/CPack (computational infrastructure) - CMake and CTest are relatively mature systems used to manage the building and testing of software across diverse computer platforms. CMake is used worldwide by some of the largest open source projects, such as KDE. CPack, a recent addition to the NAMIC Kit, is used to simplify the packaging and dissemination of software across platforms. Thus in NAMIC we can easily deploy our software across Windows, Linux, Unix, and Mac platforms.
- Other tools (computational infrastructure) - Many other software tools are used to support the development of advanced imaging applications, and to assist with large scale computing, including
- Teem - image processing tools (mature)
- KWWidgets - Open source, cross platform GUI toolkit (mature, but development continues to support workflow).
- BatchMake - Support large-scale computing, including grid computing, for performing large population studies and statistical analysis (under active development).
These tools address key problems in imaging, including segmentation, registration, visualization, and shape analysis, or provide facilities supporting researchers and developers who wish to create advanced software applications. One of the key characteristics of NAMIC is that we treat the development of advanced medical image analysis software holistically; that is, the complete cycle of algorithm design, efficient implementation, quality control, and dissemination is needed to effectively address the challenges provided by the driving biological problems. Examples of how these tools are being used include the following:
(a) Segmentation: Here there are a variety of tools of varying degrees of maturity, ranging from the EM Segmentor (a widely distributed, mature algorithm included in Slicer3 as an application plug-in) to more recent work on DTI segmentation based on directional flows, which is Matlab- and C++-based. Powerful mature tools such as Bayesian segmentation have recently been included in Slicer (and have been available in ITK for some time now), and these can be combined with very recent work on the semi-automated segmentation of the DPFC done in collaboration with Core 3 researchers. Further, widely distributed tools previously developed by NAMIC researchers have been put into Slicer. For example, geometry-based segmentation methods (some of which were included in packages marketed by GE) were tailored for cortical segmentation, included in Slicer, and in fact improved with the inclusion of statistics-based approaches.
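To make the EM approach concrete, here is a minimal sketch of expectation-maximization for a one-dimensional Gaussian mixture over voxel intensities. This is a toy stand-in, not the EM Segmentor itself: the real module adds spatial priors, atlas alignment, and multichannel data, and all names here are illustrative.

```python
import numpy as np

def em_segment(intensities, k=2, iters=50):
    """EM for a 1-D Gaussian mixture over voxel intensities --
    the intensity model at the heart of EM-style segmenters."""
    x = np.asarray(intensities, float)
    mu = np.percentile(x, np.linspace(10, 90, k))  # spread initial means
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior class probabilities per voxel
        resp = pi * np.exp(-0.5 * (x[:, None] - mu)**2 / var) \
               / np.sqrt(2 * np.pi * var)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters from the posteriors
        n = resp.sum(axis=0)
        mu = (resp * x[:, None]).sum(axis=0) / n
        var = (resp * (x[:, None] - mu)**2).sum(axis=0) / n
        pi = n / x.size
    return resp.argmax(axis=1), mu
```

Each voxel ends up labeled with its most probable tissue class, and the per-class means and variances come out of the same loop.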
(b) Registration: Similar remarks can be made for registration, where we have a spectrum ranging from very mature methods to very recent ones that are still being tested. In particular, mature, widely distributed methodologies for rigid registration are now included in ITK, as are spline-based registration methodologies. These are well-tested methods that have been made accessible to the general imaging community. Newer methodologies, such as those based on optimal transport for elastic registration, are being included in ITK. NAMIC has also pushed for fast implementations of its algorithms that run on inexpensive, widely available platforms. Taking a cue from the game industry, some algorithms have been ported to GPUs (graphics cards), which are now being employed as computing devices. This has led to a speed-up of almost two orders of magnitude on some of the registration algorithms being tested.
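The core idea of intensity-based registration, searching over transforms to minimize an image-similarity cost, can be sketched with the simplest possible case: integer translations scored by sum of squared differences (SSD). This toy stands in for the mature rigid and spline methods described above, which optimize over rotations and deformations with far better metrics and optimizers.

```python
import numpy as np

def register_translation(fixed, moving, max_shift=10):
    """Exhaustive integer-translation registration by minimizing SSD.

    Returns the (dy, dx) shift that best aligns `moving` onto `fixed`.
    Circular shifts (np.roll) are used for simplicity.
    """
    best, best_ssd = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            ssd = ((fixed - shifted)**2).sum()
            if ssd < best_ssd:
                best, best_ssd = (dy, dx), ssd
    return best
```

The exhaustive double loop is exactly the kind of embarrassingly parallel cost evaluation that maps well onto GPUs, which is where the two-orders-of-magnitude speed-ups mentioned above come from.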
(c) Shape Analysis: Again, a number of methodologies have been developed and implemented with varying levels of maturity. Shape methodologies based on spherical harmonics are quite mature, are available in pipelines developed by NAMIC researchers, and have been distributed to the general community. A newer spherical-wavelet-based shape analysis package has been put into ITK, and it also drives a novel shape-based segmentation procedure. More globally based spherical harmonic ideas have been combined with the multi-resolution spherical wavelet approach in a statistical shape-analysis package for schizophrenia. This general technique may be used for other purposes as well and is presently being adapted for work on the prostate. Work has also been done on particle-based approaches to this important problem area, with the code put into ITK. Often we work with an initial Matlab/C++ version of our code, then move to ITK, and finally to Slicer. However, even at the Matlab/C++ stage, algorithms have been distributed and used in a clinical setting (for example, rule-based brain segmentation approaches).
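The spherical-harmonic idea behind these shape pipelines is a least-squares expansion of a surface (here, a radius function sampled over the sphere) in a harmonic basis. The sketch below truncates at degree 1 and uses real harmonics written out by hand, so it is only an illustration of the principle; production SPHARM tools go to high degree and establish point correspondence across subjects.

```python
import numpy as np

def real_sh_basis(theta, phi):
    """Real spherical harmonics up to degree 1: Y00, Y1-1, Y10, Y11.
    theta: polar angle, phi: azimuth."""
    return np.column_stack([
        np.full_like(theta, 0.5 / np.sqrt(np.pi)),           # Y0,0
        np.sqrt(3 / (4*np.pi)) * np.sin(theta) * np.sin(phi),  # Y1,-1
        np.sqrt(3 / (4*np.pi)) * np.cos(theta),                # Y1,0
        np.sqrt(3 / (4*np.pi)) * np.sin(theta) * np.cos(phi),  # Y1,1
    ])

def spharm_coeffs(radii, theta, phi):
    """Least-squares SPHARM coefficients of a sampled radius function."""
    B = real_sh_basis(theta, phi)
    c, *_ = np.linalg.lstsq(B, radii, rcond=None)
    return c
```

Group studies then compare these coefficient vectors (or the surfaces they reconstruct) statistically between, say, patient and control populations.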
(d) Diffusion Weighted Image (DWI) Analysis: A number of tools for diffusion tensor estimation, fiber tractography, and geometric and statistical analysis of fiber bundles have been contributed to the NAMIC toolkit. Some of these tools have already been integrated into a dedicated diffusion ITK package with a GUI as part of the NAMIC toolkit (e.g., FiberViewer-UNC) and into the Slicer platform (BWH tractography, clustering). The impact of NAMIC DWI analysis activities is best characterized by recent journal articles and articles in press. Applications of the NAMIC FiberViewer tool (UNC) in large clinical studies at UNC and Duke are in press (Taylor et al., Gilmore et al., Cascio et al.). Clinical application of the Slicer DTI package by BWH/MIT is reported in O'Donnell et al., Kuroki et al., and Nakamura et al. The research by MGH is found in two journal publications by Tuch et al. The methodologies are also described in papers that have appeared or will appear in peer-reviewed journals (Corouge et al., Fletcher et al., Corouge et al.). New methods in development (Finsler metric, GT; volumetric PDE-based path analysis, Utah; stochastic tractography, BWH; path of interest, MGH) are currently being tested on Core-3 DBP data, with the goal of giving recommendations on which type of solution is appropriate for specific clinical analysis questions.
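Deterministic streamline tractography, the baseline method the tools above build on, can be sketched as Euler integration along the principal eigenvector of the local tensor. This is a toy with nearest-neighbor tensor lookup and a fixed step count, not any of the named packages; real tractography adds interpolation, FA-based stopping criteria, and curvature constraints.

```python
import numpy as np

def principal_direction(D):
    """Unit eigenvector of the largest eigenvalue of a 3x3 tensor."""
    w, v = np.linalg.eigh(D)
    return v[:, np.argmax(w)]

def track_streamline(tensor_field, seed, step=0.5, n_steps=100):
    """Euler-integration streamline: repeatedly step along the
    principal eigenvector of the nearest voxel's tensor."""
    pos = np.array(seed, float)
    pts = [pos.copy()]
    prev = None
    shape = tensor_field.shape[:3]
    for _ in range(n_steps):
        idx = tuple(np.round(pos).astype(int))
        if any(i < 0 or i >= s for i, s in zip(idx, shape)):
            break  # stepped outside the volume
        d = principal_direction(tensor_field[idx])
        if prev is not None and np.dot(d, prev) < 0:
            d = -d  # eigenvectors have arbitrary sign; keep heading forward
        pos = pos + step * d
        prev = d
        pts.append(pos.copy())
    return np.array(pts)
```

Clustering methods like those in Slicer then group many such streamlines into anatomically meaningful fiber bundles.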
(e) Visualization: Core 2 researchers now involved with NAMIC (e.g., the founders of Kitware) were at the forefront of developing VTK (and, of course, ITK). Thus here we are considering technologies that are at a commercial level of development and used at thousands of sites. Algorithms developed at NAMIC have driven new directions for these packages. Newer visualization methods, for example the conformal flattening procedure, have been ported to an ITK filter and are in the NAMIC Sandbox. Quasi-isometric methods for brain flattening from the MGH FreeSurfer have become part of the NAMIC enterprise as well. These flattening procedures are very easy to use and may also be employed for registration. Code for controlling the area distortion in flattening has been incorporated, which gives area preservation with minimal distortion. The techniques may also be used for several other purposes, including automatic fly-throughs in endoscopy (incorporated into Slicer) and texture mappings for general visualization purposes.
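The area-distortion control mentioned above can be made concrete with a small sketch that measures, per triangle, how much a flattening changes relative area. This is an evaluation metric only, not the flattening algorithm itself, and the function names are illustrative.

```python
import numpy as np

def triangle_areas(vertices, faces):
    """Areas of mesh triangles; vertices may be 3-D (original surface)
    or 2-D (flattened); faces is an (M, 3) index array."""
    v = np.asarray(vertices, float)
    if v.shape[1] == 2:                        # embed 2-D points in 3-D
        v = np.column_stack([v, np.zeros(len(v))])
    a, b, c = v[faces[:, 0]], v[faces[:, 1]], v[faces[:, 2]]
    return 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1)

def area_distortion(verts3d, verts2d, faces):
    """Per-triangle relative-area ratio of a flattening, normalized so
    that an area-preserving map yields ratios of exactly 1."""
    a3 = triangle_areas(verts3d, faces)
    a2 = triangle_areas(verts2d, faces)
    return (a2 / a2.sum()) / (a3 / a3.sum())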
Logistics
When completed, the information should be sent to:
Gwen Jacobs, PhD
Professor of Neuroscience
Asst. CIO and Director of Academic Computing
1 Lewis Hall
Montana State University
Bozeman, MT 59717
406-994-7334 - phone
406-994-7077 - FAX
gwen@cns.montana.edu