2007 Materials for NCBC Program Review
Contents
- 1 Materials requested for NCBC Program Review
- 2 Q1: A copy of two parts of your most recent progress report: the summary section and the highlights section.
- 3 Q2
- 3.1 Q2.1 To what extent does the vision and direction of the NCBC initiative promote biomedical computing?
- 3.2 Q2.2 In what ways has the NCBC initiative advanced biomedical computing?
- 3.3 Q2.3 Are the NCBCs interfacing appropriately?
- 3.4 Q2.4. What new collaborations have been formed through the NCBC initiative?
- 3.5 Q2.5. What new training opportunities have the centers provided?
- 3.6 Q2.6. What changes could make the program more effective in the future?
- 3.7 Q2.7. What lessons have been learned from the NCBC initiative that can guide future NIH efforts in biomedical computing?
- 4 Q3: A list of publications and/or software tools produced by the Center. If this information is provided in your progress report or is available on your website, a link will be sufficient. We are especially interested in your assessment of the maturity of your software tools and the impact they are having on the scientific community.
- 5 Logistics
Materials requested for NCBC Program Review
These are due to Gwen Jacobs by Friday, June 08, 2007.
Q1: A copy of two parts of your most recent progress report: the summary section and the highlights section.
Summary: http://www.na-mic.org/Wiki/index.php/2007_Annual_Scientific_Report#1._Introduction
The National Alliance for Medical Imaging Computing (NA-MIC) is now in its third year. The Center comprises a multi-institutional, interdisciplinary team of computer scientists, software engineers, and medical investigators who have come together to develop and apply computational tools for the analysis and visualization of medical imaging data. A further purpose of the Center is to provide infrastructure and environmental support for the development of computational algorithms and open source technologies, and to oversee the training and dissemination of these tools to the medical research community. The driving biological projects (DBPs) for the first three years of the Center focused on schizophrenia, although the methods and tools developed are clearly applicable to many other diseases.
In the first year of this endeavor, our main focus was to develop alliances among the many cores to increase awareness of the kinds of tools needed for specific imaging applications. Our first annual report and all-hands meeting reflected this emphasis on cores, which was necessary to bring together members of an interdisciplinary team of scientists with such diverse expertise and interests. In the second year of the center our emphasis shifted from the integration of cores to the identification of themes that cut across cores and are driven by the requirements of the DBPs. We saw this shift as a natural evolution, given that the development and application of computational tools became more closely aligned with specific clinical applications. This change in emphasis was reflected in the Center's four main themes, which included Diffusion Tensor Analysis, Structural Analysis, Functional MRI Analysis, and the integration of newly developed tools into the NA-MIC Tool Kit. In the third year of the center, collaborative efforts have continued along each of these themes among computer scientists, clinical core counterparts, and engineering partners. We are thus quite pleased with the focus on themes, and we also note that our progress has not only continued but that more projects have come to fruition with respect to publications and presentations from NA-MIC investigators, which are listed on our publications page.
Below, in the next section (Section 2) we summarize our progress over the last year using the same four central themes to organize the progress report. These four themes include: Diffusion Image analysis (Section 2.1), Structural analysis (Section 2.2), Functional MRI analysis (Section 2.3), and the NA-MIC toolkit (Section 2.4). Section 3 highlights four important accomplishments of the third year: advanced algorithm development in Shape and DTI analysis, the newly architected open source application platform, Slicer 3, and our outreach and technology transfer efforts. Section 4 summarizes the impact and value of our work to the biocomputing community at three different levels: within the center, within the NIH-funded research community, and externally to a national and international community. The final section of this report, Section 5, provides a timeline of Center activities.
In addition, the end of the first three years of the center marks a transition from the first set of DBPs that were focused entirely on Schizophrenia to a new set that span a wider range of biological problems. The new DBPs include a focus on Systemic Lupus Erythematosus (MIND Institute, University of New Mexico), Velocardiofacial Syndrome (Harvard), and Autism (University of North Carolina, Chapel Hill), along with a direction that is new but synergistic for NA-MIC: Prostate Interventions (Johns Hopkins University). Funding for the second round of DBPs starts in the next cycle, but the PIs were able to attend the recent All-Hands Meeting and start developing plans for their future research in NA-MIC.
Finally, we note that Core 3.1 (Shenton and Saykin) is in the process of applying for a Collaborative R01 to expand current research with NA-MIC, which ends on July 31, 2007. Both Drs. Shenton and Saykin have worked for three years driving tool development for shape measures, DTI tools, and path analysis measures for fMRI as part of the driving biological project in NA-MIC. They now plan to expand this research in a Collaborative R01 by working closely with Drs. Westin, Miller, Pieper, and Wells to design, assess, implement, and apply tools that enable the integration of MRI, DTI, and fMRI in individual subjects, and to develop an atlas of functional networks and circuits based on a DTI atlas (i.e., structural connectivity), integrated with a network of functional connectivity identified from fMRI probes of attention, memory, emotion, and semantic processing. We mention this here because this will be, to our knowledge, the first DBP to apply for further funding to continue critical work begun with NA-MIC.
Highlights: http://www.na-mic.org/Wiki/index.php/2007_Annual_Scientific_Report#3._Highlights
The third year of the NA-MIC project saw continued development and dissemination of medical image analysis software. The current progress is clearly characterized by a significant increase in the application of Core-1 and Core-2 tools to image data provided by the Core-3 DBP groups. The reasons for this progress are two-fold: first, several new methods are now out of the prototype stage and ready to be applied to large sets of imaging data; second, new high-resolution imaging data from high-field scanners are more appropriate for the new tools than historical data, which often have very coarse slice resolution.
With the release of the first version of Slicer3, the transfer of this technology is accelerating. Because of NA-MIC's strong ties with several large open source communities, such as ITK, VTK, and CMake, NA-MIC continues to make a significant impact on the nation's broader biocomputing infrastructure. The following are just a few of the many highlights from the third year of the NA-MIC effort.
- Advanced Algorithms
Core 1 continues to lead the biomedical community in DTI and shape analysis.
- NA-MIC published an open source framework for shape analysis, including providing access to the open source software repository. Shape analysis has become of increasing relevance to the neuroimaging community due to its potential to precisely locate morphological changes between healthy and pathological structures. The software has been downloaded many times since the first online publication in October 2006, and is now used by several prestigious image analysis groups.
- The spherical wavelet based shape analysis package has been contributed to ITK, and in the next few months the multiscale segmentation work will be incorporated as well.
- The NA-MIC community has implemented a very fast method for the optimal transport approach to elastic image registration, which is currently being added to ITK.
- The NA-MIC toolkit includes a comprehensive set of modules for analysis of diffusion weighted images (DWI), including improved calculation of tensors, interpolation, nonlinear deformation and statistics on tensor fields, novel methods for tractography and for optimal path finding, and clustering of sets of streamlines.
- A quantitative tractography package for user-guided geometric parametrization and statistical analysis of fiber bundles (FiberViewer) has been contributed to the NAMIC toolbox. This tool is used in several ongoing clinical DTI studies.
- The conformal flattening algorithm has been implemented as an ITK filter and is in the NA-MIC Sandbox in preparation for formal acceptance into the NA-MIC Kit.
- Technology Deployment Platform: Slicer3
Core 2, in conjunction with Algorithms (Core 1) and the DBPs (Core 3), is creating new tools to accelerate the transition of technology to the biomedical imaging community.
- One of the year's major achievements was the release of the first viable version of the Slicer3 application, which evolved from concept to a full-featured application. The second beta version of Slicer3 was released in April 2007. The application provides a full range of functionality for loading, viewing, editing, and saving models, volumes, transforms, fiducials, and other common medical data types. Slicer3 also includes a powerful execution model that enables Core 1 developers (and others in the NA-MIC community) to easily deploy algorithms to Core 2 and other biocomputing clients.
- Slicer3's execution model supports plug-in modules. These modules can be run standalone or integrated into the Slicer3 framework. When integrated, the GUI for a module can be generated automatically from an associated XML file describing the module's input parameters. A variety of modules have been created, ranging from simple image processing algorithms to complex, multi-step segmentation procedures (an illustrative sketch of such a module appears after this list).
- Slicer3's execution model also allows researchers to access large computational resources. Two applications that support the use of such resources, BatchMake and GridWizard, have recently been integrated with Slicer3 for specific modules. The GUI for modules run via these computational support applications allows for the creation of large population studies or parametric studies of an algorithm.
- To stress test Slicer3's architecture and demonstrate its capabilities, the EM Segment module (http://wiki.na-mic.org/Wiki/index.php/Slicer3:EM) was created and added to Slicer's library of modules. EM Segment is an automatic segmentation algorithm for medical images and represents a collaborative effort between the NAMIC engineering, algorithms, and biological problem cores. The EM Segment module enables users to quickly configure the algorithm to a variety of imaging protocols as well as anatomical structures through a wizard-style, workflow interface. The workflow tools have been integrated into the NA-MIC Kit, and are now available to all other modules built on the Slicer3 framework.
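To make the execution model concrete, the sketch below illustrates the general shape of a Slicer3 command-line module: a standalone executable that emits an XML description of its parameters when asked (Slicer3 generates the module GUI from that description) and otherwise runs with ordinary command-line arguments. This is an illustrative sketch, not code from the NA-MIC repository; the module name, parameters, and the abbreviated XML are assumptions.

```cpp
// Minimal sketch of a hypothetical Slicer3 command-line module.
// Invoked with --xml it prints a description of its parameters, from which
// Slicer3 can generate the module GUI; otherwise it runs standalone.
#include <cstdlib>
#include <cstring>
#include <iostream>
#include <string>

static const char *moduleDescription =
    "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n"
    "<executable>\n"
    "  <category>Filtering</category>\n"
    "  <title>Example Smoothing Module</title>\n"
    "  <description>Illustrative module description.</description>\n"
    "  <parameters>\n"
    "    <label>Parameters</label>\n"
    "    <double>\n"
    "      <name>sigma</name><longflag>sigma</longflag>\n"
    "      <label>Sigma</label><default>1.0</default>\n"
    "    </double>\n"
    "    <image><name>inputVolume</name><channel>input</channel>\n"
    "      <index>0</index><label>Input Volume</label></image>\n"
    "    <image><name>outputVolume</name><channel>output</channel>\n"
    "      <index>1</index><label>Output Volume</label></image>\n"
    "  </parameters>\n"
    "</executable>\n";

int main(int argc, char *argv[])
{
  // When the host application discovers the module, it asks for the XML.
  if (argc > 1 && std::strcmp(argv[1], "--xml") == 0)
    {
    std::cout << moduleDescription;
    return 0;
    }

  // Otherwise the same executable runs with ordinary command-line arguments,
  // either from a shell or when launched by Slicer3.
  double sigma = 1.0;
  std::string input, output;
  for (int i = 1; i < argc; ++i)
    {
    if (std::strcmp(argv[i], "--sigma") == 0 && i + 1 < argc)
      {
      sigma = std::atof(argv[++i]);
      }
    else if (input.empty())  { input = argv[i]; }
    else if (output.empty()) { output = argv[i]; }
    }

  // ... read the input volume, apply a filter with the given sigma,
  // and write the output volume (omitted in this sketch) ...
  std::cout << "smoothing " << input << " -> " << output
            << " (sigma=" << sigma << ")" << std::endl;
  return 0;
}
```

The key design point is that the parameter description travels with the executable itself, so the same binary can be scripted, run on a cluster, or presented with an automatically generated GUI inside Slicer3.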
- Outreach and Technology Transfer
Cores 4, 5, and 6 continue to provide support, training, and dissemination to the NA-MIC community and the broader biomedical computing community.
- NA-MIC continues to practice the best of collaborative science through its bi-annual Project Week events. These events, which gather key representatives from Cores 1-7 and external collaborators, are organized to gather experts from a variety of domains to address current research problems. This year's first Project Week was held in January and hosted by the University of Utah. It saw several significant accomplishments including the first beta release of the next generation Slicer3 computing platform. The second Project Week is scheduled for June in Boston, MA.
- Twelve NA-MIC-supported papers were published in high-quality, peer-reviewed conference proceedings (four papers in MICCAI alone). Another paper on the NAMIC software process was published in IEEE Software. All three DTI papers presented at MICCAI last year were NA-MIC associated.
- Several workshops were held at various institutions throughout the year, including the DTI workshop at UNC, the MICCAI Open Source Workshop, and the NA-MIC Training Workshop at the Harvard Center for Neurodegeneration and Repair. Training and dissemination will continue with a DTI workshop at the forthcoming Human Brain Mapping meeting and activities at MICCAI 2007, among others.
Q2
A brief statement (one page per question, max) addressing each of the questions listed below. These are the questions that we have been asked to address in our report. Our goal in asking for this information is to be able to produce a report that reviews the program as a whole. Your view, from the vantage point of the center you direct, is critical to our work. In addition, your answers will provide us with more information that we can use in our discussion with program staff on June 11th. We know that some of this information can be found on your websites, so in those cases a link to the information would be most helpful.
Q2.1 To what extent does the vision and direction of the NCBC initiative promote biomedical computing?
The RFA for the creation of the NCBC program laid out a very explicit vision.
The NCBCs are to be the core of the networked national effort to build the computational infrastructure for biomedical computing in the US. From the NIH website in 2004: "Four new National Centers for Biomedical Computing (NCBC) will develop and implement the core of a universal computing infrastructure that is urgently needed to speed progress in biomedical research. The centers will create innovative software programs and other tools that enable the biomedical community to integrate, analyze, model, simulate, and share data on human health and disease." The original RFA stated: (1) the software should be freely available to biomedical researchers and educators in the non-profit sector, and (2) the terms of software availability should permit the commercialization of enhanced or customized versions of the software.
As the amount of data produced by biomedical researchers increases at an ever accelerating pace, the relative importance of computing as an integral part of analysis is increasing as well. In addition, both data and analyses are becoming more and more complex. Increasingly, science is performed by interdisciplinary teams at multiple locations. Industrial-strength research platforms are one of the emerging needs of biomedical research. This need had only been partially addressed before the creation of the NCBC program. The NCBC centers are in an outstanding position to develop stable, maintainable, and expandable software to address these needs.
In addition to building the infrastructure for biomedical computing, the NCBCs play an important role in fostering a community of computational scientists dedicated to solving problems in the biomedical domain. In its short three-year life span, the NA-MIC NCBC has contributed to this vision by holding numerous workshops and tutorials on open source software for biomedical image analysis, by supporting the Insight Journal, a peer-reviewed venue for publications accompanied by open-source implementations, and by engaging a large number of graduate students in Computer Science programs in biomedical computing research. By creating high-profile bio-computation programs, the NCBC initiative brings biomedical computing as a field of research to the attention of the computational community and actively promotes collaborations between the computational and biological sciences.
The participants in our center signed on for this project because they believe in the vision of developing a universal computing infrastructure for medical image computing. NA-MIC is creating an open source platform that embodies this vision for the field of medical image computing. We have settled on a very liberal open source license for our software platform. All the components of that platform, called the NA-MIC Kit, are distributed under a BSD-style license without restrictions on commercial use. Other centers have adopted different strategies to address the requirement for open source and to enable commercial use, as required in the original RFA; we believe the NA-MIC liberal license approach minimizes barriers to wide adoption and will maximize the return on the NIH investment. The availability of software platforms commoditizes infrastructure for research and allows individual researchers to spend more time on their core research. Over time, this will promote biomedical computing by lowering the hurdle for scientists to use the technology.
Q2.2 In what ways has the NCBC initiative advanced biomedical computing?
Ron:
It is too early to say how the NCBC initiative has advanced biomedical computing. Four of the NCBCs have been funded since October 2004 (less than three years). Funding for the other three began in late 2005. Center efforts of this size need significant time and effort to get organized and to synchronize the activities of the participants. This startup effort is where the primary focus of the centers has been until now. There are early signs that some of the centers are beginning to emerge from this phase of their evolution and are turning toward activities aimed at the field at large. However, it will be several years before the full impact of this program becomes visible.
Below is a more detailed discussion of the specifics of this evolution from the vantage point of NA-MIC.
(Tina)
In NA-MIC's third year, it is evident that NA-MIC is developing a culture, environment, and resources that foster collaborative research in medical image analysis, drawing together mathematicians, computer scientists, software engineers, and clinical researchers. These elements shape how NA-MIC operates, make NA-MIC a fulcrum for NIH-funded research, and draw new collaborators from across the country and around the world to NA-MIC.
- Impact within the Center
- Within the center, the NA-MIC organization, NA-MIC processes, and the NA-MIC calendar have permeated the research. The organization is nimble, forming ad hoc distributed teams within and between cores to address specific biocomputing tasks. Information is shared freely on the NA-MIC Wiki, on the weekly Engineering telephone conferences, and in the NA-MIC Subversion source code repository. The software engineering tools of CMake, Dart 2 and CTest, CPack, and KWWidgets facilitate a cross-platform software environment for medical image analysis that can be easily built, tested, and distributed to end-users. Core 2 has provided a platform, Slicer 3, that allows Core 1 to easily integrate new technology and deliver this technology in an end-user application to Core 3. Core 1 has developed a host of techniques for structural and diffusion analysis which are under evaluation by Core 3. Major NA-MIC events, such as the annual All Hands Meeting, the Summer Project Week, the Spring Algorithms meeting, and the Engineering Teleconferences, are avidly attended by NA-MIC researchers as opportunities to foster collaborations.
- Impact within NIH Funded Research
- Within NIH-funded research, NA-MIC continues to forge relationships with other large NIH-funded projects such as BIRN, caBIG, NAC, and IGT. Here, we are sharing the NA-MIC culture, engineering practices, and tools. The BIRN infrastructure, built on widely accepted grid middleware, allows NA-MIC researchers to share data and access computational resources, and provides a rich collaborative environment through a science portal. caBIG lists the 3D Slicer among the applications available on the National Cancer Imaging Archive. NAC and IGT use the NA-MIC infrastructure and are involved in the development of the 3D Slicer. BIRN recently held an event modeled after the NA-MIC Project Week. NA-MIC has become a resource on open source licensing to the medical image analysis community.
NA-MIC is also attracting NIH funded collaborations. Two grants have been funded under PAR-05-063 to collaborate with NA-MIC: Automated FE Mesh Development and Measuring Alcohol and Stress Interactions with Structural and Perfusion MRI. Five additional applications to collaborate with NA-MIC via the NCBC collaborative grant mechanism are under consideration. Additional grant applications submitted under other calls are planning to use and extend the NA-MIC tools.
- National and International Impact
- NA-MIC events and tools garner national and international interest. There were nearly 100 participants at the NA-MIC All Hands Meeting in January 2007, with many of these participants from outside of NA-MIC. Several researchers from outside the NA-MIC community have attended the Summer Project Weeks and the Winter Project Half-Weeks to gain access to the NA-MIC tools and people. These external researchers are contributing ideas and technology back into NA-MIC.
- Components of the NA-MIC kit are used globally. The software engineering tools of CMake, Dart 2 and CTest are used by many open source projects and commercial applications. For example, the K Desktop Environment (KDE) for Linux and Unix workstations uses CMake and Dart. KDE is one of the largest open source projects in the world. Many open source projects and commercial products are benefiting from the NA-MIC related contributions to ITK and VTK. Finally, Slicer 3 is being used as an image analysis platform in several fields outside of medical image analysis, in particular, biological image analysis, astronomy, and industrial inspection.
- NA-MIC co-sponsored the Workshop on Open Science at the Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2006 conference. The proceedings of the workshop are published on the electronic Insight Journal, another NIH-funded activity.
- Over 50 NA-MIC related publications have been produced since the inception of the center.
Q2.3 Are the NCBCs interfacing appropriately?
(Will Schroeder)
At this point in the project cycle, the NCBCs have focused on interactions with the DBPs, developing requirements, creating foundational technology, and integrating existing technologies within each Center. In many cases, such as with NA-MIC, the resulting technologies are actively being disseminated to the research community. It is only recently that outreach across NCBCs has become an effective use of resources. There have been initial efforts toward collaboration, such as cross-Center use of software tools (for example, SimBios/SimTk actively uses NA-MIC's VTK toolkit), and the promising Software and Data Integration Working Group (SDIWG). In that sense, the answer is yes, the NCBCs are interfacing appropriately at this time. Within the next few years there will be a need to increase the extent of interfacing significantly. That will require either reallocation of resources or the allocation of new, additional resources. Due to the large variety of approaches adopted by the different centers, it will likely be necessary to allocate significant amounts of time, both by senior leadership at the centers to identify the objectives of such interface efforts and by the engineering cores to execute the plans. A serious effort in this direction will probably require the allocation of multiple FTEs.
Ron:
As discussed in other sections of this response, it is too early to say. Most of the NCBCs are still in the startup phase and have not yet finished developing their core portfolios. Their focus of intellectual attention is aimed inward. While there have been some interactions and concerted efforts (SDIWG), it is too early in the life cycle of the centers to engage with each other in a more involved way. As noted above, the answer is therefore yes, the NCBCs are interfacing appropriately at this time; deepening that interfacing over the next few years will require the reallocation of resources, or new additional resources, along the lines described in the preceding response.
Q2.4. What new collaborations have been formed through the NCBC initiative?
(Jim Miller)
NA-MIC's structure and organization have facilitated many new collaborations. NA-MIC is a distributed center, bringing together mathematicians, computer scientists, software engineers, and clinicians from multiple sites. This distributed structure has produced two types of new collaborations within NA-MIC: new collaborations between cores and new collaborations within cores. For between-core collaborations, many of the algorithm and engineering core researchers had not previously collaborated with the researchers in either the first or second round of Driving Biological Projects (DBPs). Thus, the NCBC provided a unique opportunity for the algorithm and engineering core researchers to gain clinical insight and to adapt and tune their algorithms and tools to new clinical contexts. Conversely, the DBPs gained access to algorithms and tools that they had not previously utilized. Similarly, many of the algorithm core researchers and engineering core researchers had not previously collaborated. Thus, the NCBC exposed the researchers in the algorithm core to the tools and engineering practices of the engineering core and exposed the researchers in the engineering core to the computational techniques and data structures utilized by the algorithm core. For within-core collaborations, many of the researchers within the algorithm core had not previously collaborated. Through NA-MIC, these researchers have been able to cooperate, and also to compete amicably, to address the issues brought forth by the DBPs.
Below is a list of new collaborations within NA-MIC. This list was compiled from the complete list of NA-MIC Collaborations and the project lists from the five NA-MIC Project Week events. A best effort was made to filter these lists down to just the new collaborations (groups of researchers) formed under NA-MIC.
- Georgia Tech + UC Irvine – Rule based segmentation algorithm for DLPFC
- Georgia Tech + Kitware - Knowledge-based Bayesian classification and segmentation
- BWH + MIT + Kitware - Brain tissue classification and subparcellation of brain structures
- Georgia Tech + UNC + BWH - Multiscale shape segmentation techniques
- BWH + Dartmouth + UNC + Georgia Tech - Shape analysis of the caudate and corpus callosum
- Georgia Tech + UNC + BWH - Spherical wavelet based shape analysis for Caudate
- Georgia Tech + UNC + BWH - Multiscale shape analysis of the hippocampus
- Dartmouth + UNC + BWH - Shape analysis of the hippocampus
- Utah + UNC + BWH - Automated shape model construction
- Dartmouth + Isomics - Neural substrates of apathy in schizophrenia
- Georgia Tech + GE Research + Kitware - Spherical Wavelet Transforms
- Georgia Tech + UNC - Shape analysis with Spherical Wavelets
- Utah + UNC - Adaptive, particle-based sampling for shapes and complexes
- UNC + Utah + Harvard - Tensor estimation and Monte-Carlo simulation
- Harvard + MIT + UNC - Corpus Callosum Regional FA analysis in Schizophrenia
- Dartmouth + MGH + Isomics + BWH - Integrity of Fronto-Temporal Circuitry in Schizophrenia using Path of Interest Analysis
- MGH + Isomics - ITK implementation of POIStats, and Integration into Slicer3
- UC Irvine + MGH + UNC + MIT - DTI Validation
- Utah + UNC + GE Research - DTI Software and Algorithm Infrastructure
- Utah + BWH - Tensor based statistics
- Utah + BWH - Diffusion tensor image filtering
- MGH + Dartmouth + Kitware + GE Research - Non-rigid EPI registration
- Dartmouth + BWH - Neural Substrates of Working Memory in Schizophrenia: A Parametric 3-Back Study
- Dartmouth + BWH - Brain Activation during a Continuous Verbal Encoding and Recognition Task in Schizophrenia
- Dartmouth + BWH - Fronto-Temporal Connectivity in Schizophrenia during Semantic Memory
- UC Irvine + Toronto - Imaging Phenotypes in Schizophrenics and Controls
- MIT + Isomics + GE Research + Kitware - fMRI statistics software
- MIND + Isomics + MGH - Analysis of Brain Lesions in Neuropsychiatric Systemic Lupus Erythematosus
- JHU + Queen's + BWH + Georgia Tech - Segmentation and Registration Tools for Robotic Prostate Interventions
- UNC + GE Research - Longitudinal MRI study of early brain development
- BWH + Kitware + MIT - Velocardiofacial Syndrome (VCFS) as a genetic model for schizophrenia
- UNC + GE Research + BWH - DTI population analysis
- Georgia Tech + BWH - Geodesic tractography
- BWH + Queen's + GE Research - Display optimization
- UCSD + Isomics - Dendritic Spine Morphometrics
- <others to be filled in>
NA-MIC has also attracted researchers from the field who were not originally part of NA-MIC. Some of these new collaborations are formally organized using the NIH NCBC Collaborative R01 program. But other collaborations are being driven solely by the opportunity to share resources, techniques, capabilities, and ideas.
Below is a list of new collaborations between external researchers and NA-MIC. Again, best effort was made to only list the new collaborations with external parties.
- Mario Negri + GE Research – Integration of vmtk with Slicer 3
- Iowa + Isomics - Finger Bone Biomechanics
- CalTech + Kitware - Systems Biology and Genomic Science
- BWH + Wake Forest + Virginia Tech - Alcohol Stress in Primates
- BWH + MGH - Radiation Treatment Planning
- Iowa + Kitware + BWH - Non Linear Registration Tools
- Northwestern + Isomics - Radiology Translation Station
- Harvard IIC + Isomics + GE Research - Astronomy Analysis and Visualization
- Virginia Tech + BWH - Applying EMSegmenter to nonhuman primate neuroimaging
- JHU + Queen's + BWH - Brachytherapy needle positioning robot integration
- Iowa + Kitware + BWH - Nonrigid registration
- Iowa + BWH - Developing electronic atlas
- Iowa + Kitware - GUI for nonrigid image registration
- Canary Islands Technological Institute + Isomics + GE Research - DICOM Query/Retrieve
- Canary Islands Technological Institute + GE Research - Block matching registration
- UNC + Duke University Medical Center - DTI tractography analysis in depression study
- <others to be filled in>
The collaborative nature of NA-MIC is exemplified by the attendance at the NA-MIC All Hands Meeting and the NA-MIC Summer Project Week. Researchers from within and external to NA-MIC come together at these two events to forge collaborations. At the 2007 NA-MIC All Hands Meeting alone, there were 96 attendees: 56 NA-MIC researchers, 32 NA-MIC collaborators from 13 institutions, and 8 members of the External Advisory Board and NIH. At the Project Half-Week run in conjunction with the All Hands Meeting, there were 38 projects: 16 initiated from the algorithm core, 10 specific to the engineering core, and 11 from external collaborators.
More detailed information on collaborations as well as Project Week events can be found at:
- http://www.na-mic.org/Wiki/index.php/NA-MIC_Collaborations
- http://www.na-mic.org/Wiki/index.php/Engineering:Programming_Events
Q2.5. What new training opportunities have the centers provided?
(Randy Gollub/Ron Kikinis)
- The requirement for all NCBCs to dedicate funds for training provides the opportunity to develop a targeted and deep portfolio of training resources. The NCBC Program fosters the unique perspective of providing training opportunities and resources specifically targeted to a multi-disciplinary audience of basic and clinical biomedical scientists, computer scientists, and medical imaging scientists. Traditional funding by NIH research grants results in the allocation of all funds to the primary research; NIH training grants allocate funding for the support of trainees; and the few education grants are restricted in budget and overhead, thus severely limiting the quality and quantity of educational resources that can be offered. The NCBCs have created a new opportunity for our cadre of experienced clinician scientists, computer scientists, and medical image analysis experts affiliated with our large centers to be supported to work on outreach activities.
- Within NA-MIC, this perspective has given rise to a thriving training program that supports the biomedical research community within NAMIC, across the NIH community, and around the world. The strong demand for our training resources is evident from the large number of hits to our training web pages, from the rapid enrollment in all offered workshops, and from the positive feedback from participants. We believe, and our belief is supported by the documented backgrounds of our workshop attendees, that a key aspect of our training materials that makes them useful to the community is that they are learner-centered, goal-oriented, and targeted to bridge the gaps in technical knowledge and language that exist between basic and clinical biomedical scientists, computer scientists, and medical imaging scientists. For example, a tutorial that teaches how to use Slicer to register two images includes not only the necessary details of how to run the algorithm, but also the conceptual framework for the registration approach, the mathematical underpinnings of the algorithm, and a detailed anatomical approach for visually inspecting and refining the registration. This rich but simple approach provides a consistently educational experience for every new user of the NAMIC toolkit.
- NAMIC-supported new training opportunities are developed to maximize impact on the wider scientific community. The primary vehicle for this is "Slicer 101", our portfolio of Slicer training tutorials (http://www.na-mic.org/Wiki/index.php/Slicer:Workshops:User_Training_101). We have focused our efforts on making all our tutorial materials available via the NA-MIC Wiki as downloadable PowerPoint presentations with accompanying curated, anonymized datasets. The tutorials are all carefully tested by our team on multiple computer platforms before being used in live Workshops (http://www.na-mic.org/Wiki/index.php/Training:Events_Timeline). Refinements are made based on the feedback of the audience and our experience during the teaching sessions. The final product of our work allows any new user, regardless of educational background, not only to use the NAMIC tools and algorithms, but to understand what they are doing and why. To date we have had over 7,880 hits to the Slicer 101 webpage.
- NAMIC-supported Workshops are another unique venue for multi-disciplinary training. In addition to all the points made regarding the content of the training materials, the 14 Workshops run by the NAMIC Training core over the past 3 years have each provided the opportunity for new connections to be made among basic and clinical biomedical scientists, computer scientists, and medical imaging scientists. All our Workshops provide opportunities for formal and informal discussions among attendees of diverse backgrounds and strengths. These hands-on, interactive workshops allow participants to translate concepts of medical image processing into skills through instructor-led training. The simplicity of our approach, and the exceptional quality of the NAMIC toolkit, ensures a very high success rate for knowledge and skill acquisition. We estimate that 370 people from 52 different universities and companies attended our Workshops between 2005 and 2006.
- We are currently focusing our efforts on reaching a wider community by delivering a more didactic Workshop in conjunction with the upcoming Organization for Human Brain Mapping meeting in Chicago next week. We capped enrollment at 50 so that we could offer the same hands-on interactive training experience to the attendees, and our registration filled within a few weeks of the offering being posted. Tentatively, we anticipate that more than 12 countries and 14 US states will be represented at this upcoming Workshop.
- A final point is that this commitment and focus on training permeates all aspects of the NAMIC program. All large gatherings of NAMIC personnel including All Hands Meetings, Programming/Project Weeks, and Core meetings provide venues for our culture of training to be expressed. Each gathering creates an opportunity to build bridges between our participating disciplines and to improve the communication skills of each member. This culture includes implicit aspects such as a supportive and collegial environment that encourages questions and critical feedback, as well as explicit aspects such as encouraging junior level participants to make presentations and scheduling educational presentations from domain experts within and outside of the community. We believe that the positive attitude towards sharing knowledge and skills is fostered in all who are associated with our Project.
Q2.6. What changes could make the program more effective in the future?
(Steve Pieper)
- National Visibility
The overall objective statement of the NCBC program calls for building computational infrastructure for the nation's biomedical research efforts. This ambitious goal involves not only computational and biomedical science, areas in which the existing centers have extensive experience, but also nationwide "marketing" efforts, where the scientific community is less adept. Several centers, including NA-MIC, have developed novel approaches to this problem through their training and dissemination cores, but as the output of the centers grows, along with the number of potential users in the community and the potential impact, these critical resources will be increasingly strained. There are several approaches the overall program could take to address this issue, including supplemental funding to host workshops or conferences, providing small travel grants for researchers or students to visit the centers, or actively encouraging a wider range of NIH-funded researchers to adopt the tools generated by the NCBCs. In particular, the collaboration PAR has been very effective as a mechanism for encouraging busy scientists to consider adopting the NCBC tools and is an excellent example of how the program can extend the impact of the basic investment in scientific infrastructure.
- Local Autonomy
The program should avoid adding extra layers of uniformity to what are fundamentally unique centers. The NCBC program has successfully established a distributed network of centers drawing on the expertise of some of the nation's leading researchers, drawn to the program for the opportunity to develop and apply their know-how to this ambitious effort. This rich environment, predictably, yields a diversity of approaches and organizational structures as each of the centers works to implement its particular vision of how to fulfill the overall mission of the NCBC program. Preserving the vitality of the effort depends on retaining this autonomy as each center strives to meet the individual objectives suited to its community. The program needs well-defined goals that each center must meet, but the overall program should facilitate the individual solutions of each center's leadership.
- Stability
One critical mission of the NCBC program is the creation of infrastructure. These large-scale efforts require time horizons that are incompatible with the current framework of many programs at NIH. It might take two or three iterations until a software package is sufficiently mature to be attractive to a larger community of scientists. Large software platforms require an engineering staff of 10-20 in addition to the biomedical scientists developing functionality aimed at solving a particular problem. Many of the NCBCs are working on several packages. This results in underfunded projects, which compromises the timeliness and performance of the resulting software.
It would be advisable to increase funding for the engineering and outreach activities of the centers and to provide the funding in a reliable way. The continuous stream of cuts from the original budget has made this discrepancy even more pronounced. In 2007/2008 we will receive only 77.7% of the money that was budgeted in the application for that year. Following the RFA guidelines, the original budget did not contain adjustments for inflation. Furthermore, in deviation from the way that most NIH programs are funded, the budget was frozen in total dollars, not direct dollars. Institutional overhead rates and fringe and benefit rates have increased for several of the NA-MIC participants during the last three years, resulting in further decreases in the amount of money available to actually do research.
Q2.7. What lessons have been learned from the NCBC initiative that can guide future NIH efforts in biomedical computing?
(Martha Shenton) COMMENT from Randy- I recommend that folks from Cores 1 & 2 also contribute here to give a more balanced response. All of Marty's points are well taken, but I wonder if there are other points also to be made here that will be valuable.
- Greater Start Up Time and Steeper Learning Curve Than Anticipated
Bringing together computer scientists, engineers, and biomedical researchers with diverse interests, training, and backgrounds, for the purpose of working on a set of biological problems, is no easy feat and, in retrospect, required a steeper learning curve than was anticipated. This steeper learning curve is understandable, since the main focus initially was on developing alliances among the cores in order to increase awareness about the kinds of tools needed for the specific imaging problems posed by the biomedical researchers who were driving the biological problems. The first year of the grant, as noted in our annual report, thus reflected a "core" emphasis, as an interdisciplinary team was brought together, many members for the first time. It was not until the second year of the grant that the "core" emphasis shifted to a focus on "themes", which cut across "core" boundaries. While this shift was viewed as part of a natural evolution, knowing this now, we can help guide future NIH efforts by suggesting that specific projects/clinical applications be highlighted in the first few months of the grant, based on meetings among core members, so as to facilitate a focus on clinical applications from the outset.
Such an early emphasis on clinical applications/problems would also facilitate an early focus on the development and application of computational tools, which could then be more closely aligned with specific clinical problems and applications. This would break down the artificial barriers that a "core" focus involves, which, while seemingly an inherent part of the initial stages, could be curtailed by highlighting early the need to focus on specific needed applications. In this way, the needed applications of the driving biological problems could form natural groupings that involve members from all cores, and work groups could be set up from the beginning that reflect a "theme"/"application" approach. This would also foster more communication between core members and would likely facilitate ongoing communication among computer scientists, engineers, and biomedical researchers.
- Developing Robust Software for Advanced Applications is Difficult
Creating industrial-strength software solutions to support scientific investigations is time consuming and requires skills not usually found in the academic environment. NA-MIC has been successful at bringing in commercial software development expertise to help accomplish the center's goals (GE, Kitware, Isomics), but these resources are routinely stretched well beyond the allocated budgets due to the many research directions of the center's scientists. When considering biomedical computing projects, the NIH must not attempt to shortchange the development process, or the resulting systems run a greater risk of being difficult to maintain, difficult to scale up, and incompatible across systems.
An excellent example of the level of effort needed for a successful, extensible, cross-platform product is the National Library of Medicine's Insight Toolkit (ITK). The organizational meeting for ITK was held in October 1999. Between October 1999 and October 2002, fifty developers from six prime contractors (GE, Kitware, Insightful, UNC, Utah, and UPenn) and four sub-contractors (BWH, UPenn, Pitt, and Columbia) contributed code to produce ITK Version 1.0. To date, over $13.5 million has been awarded by the NIH for the development, use, and expansion of ITK. That total includes 20 one-year contracts that were given to early adopters of ITK. When assessing the level of effort expended on ITK, it is important to consider that ITK's funding did not cover algorithm, graphical user interface, or visualization development; it only funded the integration of existing methods into a common library. Developing end applications, involving user interfaces and visualizations tailored for clinical users, requires significant additional effort.
- Algorithm Development Needs to be Interactive and Not Sequential
In reviewing the last three years of the driving biological problem, schizophrenia, it is evident that tool development involving multiple interactions among members of Core 1 (Computer Scientists), Core 2 (Engineers), and Core 3 (Driving Biological Problem) at all stages of development led to computational tools that were both more tailored to the specific applications needed by Core 3 members and better optimized for general use. This interactive mode of tool development is in contrast to tool development that proceeded more sequentially, where one or several members of Cores 1 and 3 met, and then Core 1 proceeded with its understanding of the problem and developed a tool with very little further input from Core 3 until the tool was delivered. The latter approach often resulted in delays in receiving the tool, as there was less communication between Core 1 and Core 3 members in these instances, and often the tool did not really meet the specific needs of the application without further work. In the future, and based on this experience, NIH initiatives should emphasize the importance of encouraging a more "interactive" approach to tool development and of discouraging what is termed here a more "sequential" approach. With a more interactive approach, progress can be more readily evaluated at each phase of tool development, and input and testing can be provided based on more communication among members of Cores 1, 2, and 3. A more "interactive" model is also far more responsive to the needs of the driving biological problem, and it keeps the focus on the clinical application.
Focusing on interactions across core members will also likely help overcome the steep learning curve inherent in early interactions across cores (see above).
- Do Not Limit Driving Biological Problems to 3 Years
Given the problems of (1) bringing new investigators together from diverse backgrounds and (2) a "core" focus that detracts from focusing on the clinical problems, another problem is (3) limiting the time of the driving biological problem to three years. Even if the timetable could be improved for getting investigators working together more quickly, and even if researchers across cores focused on specific clinical problems right from the start, limiting the driving biological problem to 3 years is not realistic. This is particularly the case given that it is only in the third year that the application of tools to clinical problems really begins to take shape. This is also the period when the driving biological problems are ready to reap the benefits of the new tools, and when members representing the driving biological problem are ready to provide further feedback to computer scientists and engineers on refining the tools to make them more suited to the task at hand, as well as more user friendly for wider use. To end the driving biological problems at a time when the fruits of this labor are just being reaped severely curtails the completion of the application of new tools to clinical problems. There is also less time to confirm and validate findings, so as to determine that the findings are not a reflection of a methodological confound introduced by the new tool.
SYLVAIN OTHER??
Q3: A list of publications and/or software tools produced by the Center. If this information is provided in your progress report or is available on your website, a link will be sufficient. We are especially interested in your assessment of the maturity of your software tools and the impact they are having on the scientific community.
(Will Schroeder, Allen Tannenbaum--THE PUBLICATION LIST NEEDS UPDATING. O'Donnell paper, for example was out there a while ago in AJNR--things missing from our group also)
A3:
- NA-MIC Publications are available here: http://www.na-mic.org/Wiki/index.php/Publications.
- NA-MIC Software Tools are available here: http://www.na-mic.org/Wiki/index.php/NA-MIC-Kit
The Center has created and extended a number of software tools to handle some of the key problems in medical image analysis and to deliver these computational technologies via a suite of applications, toolkits, and infrastructure facilities. A summary description of these tools follows:
- Slicer3 (application) - an application platform for deploying imaging technologies, newly architected with an execution model facilitating integration, workflow, and large-scale computing. While Slicer3 is the newest addition to the NAMIC Kit, it is built on pre-existing, mature toolkits, so the application is relatively mature and is already in use. Because Slicer3 supports plug-in modules, active development is proceeding to create and package various modules for dissemination to the NAMIC community.
- ITK (toolkit) - a mature system for image analysis, registration and segmentation (initially created in 1999). ITK is in use worldwide for medical imaging research and development.
- VTK (toolkit) - a mature system for visualization, graphics, volume rendering, and interaction (initially created in 1993). VTK is used worldwide for research, teaching and commercial development.
- DART (computational infrastructure) - a key component of the NAMIC quality control process, DART is used to coordinate and report the software testing process. It was created in the first year of NAMIC and is in constant use, and is therefore a mature system.
- CMake/CTest/CPack (computational infrastructure) - CMake and CTest are relatively mature systems used to manage the building and testing of software across diverse computer platforms. CMake is used worldwide by some of the world's largest open source systems such as KDE. CPack, a recent addition to the NAMIC kit, is used to simplify the packaging and dissemination of software across platforms. Thus in NAMIC we can easily deploy our software across Windows, Linux, Unix, and Mac platforms.
- Other tools (computational infrastructure) - Many other software tools are used to support the development of advanced imaging applications, and to assist with large scale computing, including
- Teem - image processing tools (mature)
- KWWidgets - Open source, cross platform GUI toolkit (mature, but development continues to support workflow).
- BatchMake - Supports large-scale computing, including grid computing, for performing large population studies and statistical analysis (under active development).
- GridWizard - an application scheduler aimed at allowing researchers to easily harness the power of large computational grids. It lets researchers run tens of thousands of commands simultaneously on multiple clusters of computers by typing a single command, without writing scripts. It can be used by itself and is currently being integrated with Slicer3 and into a web-based portal environment. (http://www.na-mic.org/Wiki/index.php/Slicer3:Grid_Interface)
These tools address key problems in imaging, including segmentation, registration, visualization, and shape analysis, or provide facilities supporting researchers and developers who wish to create advanced software applications. One of the key characteristics of NAMIC is that we treat the development of advanced medical image analysis software holistically; that is, the complete cycle of algorithm design, efficient implementation, quality control, and dissemination is needed to effectively address the challenges posed by the driving biological problems. Examples of how these tools are being used include the following:
(a) Segmentation: Here there are a variety of tools of varying degrees of maturity, ranging from the EM Segmentor (a widely distributed, mature algorithm included in Slicer3 as an application plug-in) to more recent work on DTI segmentation based on directional flows, which is Matlab and C++ based. Powerful, mature tools such as Bayesian segmentation have recently been included in Slicer (and have been available in ITK for some time now); these can be combined with very recent work on the semi-automated segmentation of the DLPFC done in collaboration with Core 3 researchers. Further, widely distributed tools previously developed by NAMIC researchers have been put into Slicer. For example, geometric segmentation methods (some of which were included in packages marketed by GE) were tailored for cortical segmentation, included in Slicer, and in fact improved with the inclusion of statistics-based approaches.
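The NA-MIC segmentation algorithms themselves are too extensive to reproduce here, but the following minimal sketch shows the ITK pipeline pattern (reader, segmentation filter, writer) on which such tools are built. The Otsu threshold filter is used here only as a generic stand-in for the EM, Bayesian, and geometric methods described above; file names and pixel types are assumptions.

```cpp
// Illustrative ITK segmentation pipeline: read a volume, apply a
// segmentation filter, write a label map. Any ITK segmentation filter
// can be slotted into the middle of the pipeline.
#include <iostream>
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkOtsuThresholdImageFilter.h"

int main(int argc, char *argv[])
{
  if (argc < 3)
    {
    std::cerr << "Usage: " << argv[0] << " inputImage outputLabelMap" << std::endl;
    return 1;
    }

  typedef itk::Image<short, 3>         InputImageType;
  typedef itk::Image<unsigned char, 3> LabelImageType;

  typedef itk::ImageFileReader<InputImageType> ReaderType;
  ReaderType::Pointer reader = ReaderType::New();
  reader->SetFileName(argv[1]);

  // Segmentation step: a simple intensity threshold chosen automatically.
  typedef itk::OtsuThresholdImageFilter<InputImageType, LabelImageType> FilterType;
  FilterType::Pointer segmenter = FilterType::New();
  segmenter->SetInput(reader->GetOutput());
  segmenter->SetInsideValue(1);
  segmenter->SetOutsideValue(0);

  typedef itk::ImageFileWriter<LabelImageType> WriterType;
  WriterType::Pointer writer = WriterType::New();
  writer->SetFileName(argv[2]);
  writer->SetInput(segmenter->GetOutput());

  // The pipeline executes lazily; Update() on the writer pulls data through.
  try
    {
    writer->Update();
    }
  catch (itk::ExceptionObject &err)
    {
    std::cerr << err << std::endl;
    return 1;
    }
  return 0;
}
```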
(b) Registration: Similar remarks can be made for registration, where we have a spectrum ranging from very mature methods to very recent ones that are still being tested. In particular, mature, widely distributed methodologies for rigid registration are now included in ITK, as are spline-based registration methodologies. These are well-tested methods which have been made accessible to the general imaging community. Newer methodologies, such as those based on optimal transport for elastic registration, are being added to ITK. NAMIC has also pushed for fast implementations of its algorithms to be usable on inexpensive, widely available platforms. Taking a cue from the game industry, some algorithms have been ported to GPUs (graphics cards), which are now being employed as computing devices. This has led to a speed-up of almost two orders of magnitude on some of the registration algorithms being tested.
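As an illustration of how the ITK registration framework exposes these methods to the general imaging community, the sketch below sets up a minimal intensity-based registration. For brevity it optimizes only a 3D translation with a mean-squares metric; the mature rigid and spline-based methods mentioned above plug different transform, metric, and optimizer components into the same framework. File names and parameter values are assumptions.

```cpp
// Minimal sketch of intensity-based registration with the ITK framework.
#include <iostream>
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageRegistrationMethod.h"
#include "itkTranslationTransform.h"
#include "itkMeanSquaresImageToImageMetric.h"
#include "itkLinearInterpolateImageFunction.h"
#include "itkRegularStepGradientDescentOptimizer.h"

int main(int argc, char *argv[])
{
  if (argc < 3) { std::cerr << "Usage: fixedImage movingImage" << std::endl; return 1; }

  typedef itk::Image<float, 3> ImageType;
  typedef itk::ImageFileReader<ImageType> ReaderType;
  ReaderType::Pointer fixedReader  = ReaderType::New();
  ReaderType::Pointer movingReader = ReaderType::New();
  fixedReader->SetFileName(argv[1]);
  movingReader->SetFileName(argv[2]);
  fixedReader->Update();
  movingReader->Update();

  // Interchangeable components: transform, metric, interpolator, optimizer.
  typedef itk::TranslationTransform<double, 3>                     TransformType;
  typedef itk::MeanSquaresImageToImageMetric<ImageType, ImageType> MetricType;
  typedef itk::LinearInterpolateImageFunction<ImageType, double>   InterpolatorType;
  typedef itk::RegularStepGradientDescentOptimizer                 OptimizerType;
  typedef itk::ImageRegistrationMethod<ImageType, ImageType>       RegistrationType;

  TransformType::Pointer    transform    = TransformType::New();
  MetricType::Pointer       metric       = MetricType::New();
  InterpolatorType::Pointer interpolator = InterpolatorType::New();
  OptimizerType::Pointer    optimizer    = OptimizerType::New();
  RegistrationType::Pointer registration = RegistrationType::New();

  optimizer->SetMaximumStepLength(4.0);
  optimizer->SetMinimumStepLength(0.01);
  optimizer->SetNumberOfIterations(200);

  registration->SetTransform(transform);
  registration->SetMetric(metric);
  registration->SetInterpolator(interpolator);
  registration->SetOptimizer(optimizer);
  registration->SetFixedImage(fixedReader->GetOutput());
  registration->SetMovingImage(movingReader->GetOutput());
  registration->SetFixedImageRegion(fixedReader->GetOutput()->GetBufferedRegion());

  // Start from the identity (zero) translation.
  RegistrationType::ParametersType initial(transform->GetNumberOfParameters());
  initial.Fill(0.0);
  registration->SetInitialTransformParameters(initial);

  registration->StartRegistration();  // runs the optimization

  std::cout << "Final translation: "
            << registration->GetLastTransformParameters() << std::endl;
  return 0;
}
```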
(c) Shape Analysis: Again a number of methodologies have been developed and implemented with varying levels of maturity. Shape methodologies based on spherical harmonics are quite mature, and are available in pipelines developed by NAMIC researchers and have been distributed to the general community. A newer spherical based wavelet shape analysis package has been put into ITK, which also drives a novel shape-based segmentation procedure. More globally based spherical harmonic ideas have been combined with the multi-resolution spherical wavelet approach as a statistical shape based package for schizophrenia. This general technique may be used for other purposes as well, and is presently being ported to some work being done on the prostate. Work has also been accomplished on particle based approaches to this important problem area with the code put into ITK. Many times we work with a Matlab/C++ initial version of our codes, then move to ITK, and finally to Slicer. However, even at the Matlab/C++ stage, algorithms have been distributed and used in a clinical setting (for example, rule-based brain segmentation approaches).
(d) Diffusion Weighted Image (DWI) Analysis: A number of tools relevant to diffusion tensor estimation, fiber tractography, and geometric and statistical analysis of fiber bundles have been contributed to the NAMIC toolkit. Some of these tools have already been integrated into a dedicated diffusion ITK package with a GUI as part of the NAMIC toolkit (e.g., FiberViewer from UNC) and into the Slicer platform (BWH tractography, clustering). The impact of NAMIC DWI analysis activities is best characterized by the most recent journal articles, published and in press. The application of the NAMIC FiberViewer tool (UNC) in large clinical studies at UNC and Duke is in print (Taylor et al., Gilmore et al., Cascio et al.). Clinical application of the Slicer DTI package by BWH/MIT is reported in O'Donnell et al., Kuroki et al., and Nakamura et al. The research by MGH is found in two journal publications by Tuch et al. Descriptions of the methodologies also appeared or will appear in peer-reviewed journals (Corouge et al., Fletcher et al., Corouge et al.). New methods in development (Finsler metric at Georgia Tech, volumetric PDE-based path analysis at Utah, stochastic tractography at BWH, and path of interest at MGH) are currently being tested on Core-3 DBP data, with the goal of recommending which type of solution is appropriate for specific clinical analysis questions.
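As a small illustration of the kind of scalar DTI measure used in the clinical studies cited above, the sketch below computes a fractional anisotropy (FA) map from a diffusion tensor volume using ITK. It assumes the tensors have already been estimated (for example by the ITK tensor reconstruction tools or the Slicer DTI package) and stored in a format ITK can read; file names and pixel types are assumptions.

```cpp
// Illustrative sketch: computing a fractional anisotropy (FA) map from a
// previously estimated diffusion tensor volume with ITK.
#include "itkImage.h"
#include "itkDiffusionTensor3D.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkImageRegionConstIterator.h"
#include "itkImageRegionIterator.h"

int main(int argc, char *argv[])
{
  typedef itk::DiffusionTensor3D<float>  TensorPixelType;
  typedef itk::Image<TensorPixelType, 3> TensorImageType;
  typedef itk::Image<float, 3>           ScalarImageType;

  // Read a tensor volume (e.g., a NRRD tensor file).
  typedef itk::ImageFileReader<TensorImageType> ReaderType;
  ReaderType::Pointer reader = ReaderType::New();
  reader->SetFileName(argv[1]);
  reader->Update();
  TensorImageType::Pointer tensors = reader->GetOutput();

  // Allocate the FA image with the same geometry as the tensor image.
  ScalarImageType::Pointer fa = ScalarImageType::New();
  fa->CopyInformation(tensors);
  fa->SetRegions(tensors->GetBufferedRegion());
  fa->Allocate();

  // FA is a per-voxel function of the tensor eigenvalues; ITK's
  // DiffusionTensor3D pixel type provides it directly.
  itk::ImageRegionConstIterator<TensorImageType> tit(tensors, tensors->GetBufferedRegion());
  itk::ImageRegionIterator<ScalarImageType>      fit(fa, fa->GetBufferedRegion());
  for (tit.GoToBegin(), fit.GoToBegin(); !tit.IsAtEnd(); ++tit, ++fit)
    {
    fit.Set(tit.Get().GetFractionalAnisotropy());
    }

  typedef itk::ImageFileWriter<ScalarImageType> WriterType;
  WriterType::Pointer writer = WriterType::New();
  writer->SetFileName(argv[2]);
  writer->SetInput(fa);
  writer->Update();
  return 0;
}
```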
(e) Visualization: Core 2 researchers now involved with NAMIC (e.g., the founders of Kitware) were at the forefront of developing VTK (and of course ITK). Thus here we are considering technologies that are at a commercial level of development and used at thousands of sites. Algorithms developed at NAMIC have driven new directions for these packages. Newer visualization methods, for example the conformal flattening procedure, have been ported to an ITK filter and are in the NAMIC Sandbox. Quasi-isometric methods for brain flattening from the MGH FreeSurfer package have become part of the NAMIC enterprise as well. These flattening procedures are very easy to use and may also be employed for registration. Code for controlling areal distortion during flattening has been incorporated, which gives area preservation with minimal distortion. The techniques may also be used for several other purposes, including automatic fly-throughs in endoscopy (incorporated into Slicer) and texture mapping for general visualization purposes.
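For readers unfamiliar with VTK, the minimal sketch below shows the rendering pipeline that underlies these visualization tools: a polygonal surface model is read, mapped to graphics primitives, and displayed in an interactive window. The file name is an assumption, and real NA-MIC applications drive this pipeline from within Slicer rather than from a standalone program.

```cpp
// Minimal VTK rendering pipeline: read a surface model, map it, display it.
#include "vtkPolyDataReader.h"
#include "vtkPolyDataMapper.h"
#include "vtkActor.h"
#include "vtkRenderer.h"
#include "vtkRenderWindow.h"
#include "vtkRenderWindowInteractor.h"

int main(int argc, char *argv[])
{
  // Source: a polygonal surface, e.g., a cortical or subcortical model.
  vtkPolyDataReader *reader = vtkPolyDataReader::New();
  reader->SetFileName(argc > 1 ? argv[1] : "model.vtk");

  // Mapper and actor turn the data into a renderable scene object.
  vtkPolyDataMapper *mapper = vtkPolyDataMapper::New();
  mapper->SetInputConnection(reader->GetOutputPort());

  vtkActor *actor = vtkActor::New();
  actor->SetMapper(mapper);

  // Renderer, window, and interactor display the scene.
  vtkRenderer *renderer = vtkRenderer::New();
  renderer->AddActor(actor);
  renderer->SetBackground(0.1, 0.1, 0.2);

  vtkRenderWindow *window = vtkRenderWindow::New();
  window->AddRenderer(renderer);
  window->SetSize(512, 512);

  vtkRenderWindowInteractor *interactor = vtkRenderWindowInteractor::New();
  interactor->SetRenderWindow(window);

  window->Render();
  interactor->Start();

  // Classic (pre-smart-pointer) VTK style requires explicit Delete() calls.
  interactor->Delete();
  window->Delete();
  renderer->Delete();
  actor->Delete();
  mapper->Delete();
  reader->Delete();
  return 0;
}
```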
Logistics
When completed, the information should be sent to:
Gwen Jacobs, PhD
Professor of Neuroscience
Asst. CIO and Director of Academic Computing
1 Lewis Hall
Montana State University
Bozeman, MT 59717
406-994-7334 - phone
406-994-7077 - FAX
gwen@cns.montana.edu