2007 Materials for NCBC Program Review
Contents
- 1 Materials requested for NCBC Program Review
- 2 Q1: A copy of two parts of your most recent progress report: the summary section and the highlights section.
- 3 Q2
- 3.1 Q2.1 To what extent does the vision and direction of the NCBC initiative promote biomedical computing?
- 3.2 Q2.2 In what ways has the NCBC initiative advanced biomedical computing?
- 3.3 Q2.3 Are the NCBCs interfacing appropriately? (recommended by RICC)
- 3.4 Q2.4. What new collaborations have been formed through the NCBC initiative?
- 3.5 Q2.5. What new training opportunities have the centers provided?
- 3.6 Q2.6. What changes could make the program more effective in the future?
- 3.7 Q2.7. What lessons have been learned from the NCBC initiative that can guide future NIH efforts in biomedical computing?
- 4 Q3: A list of publications and/or software tools produced by the Center. If this information is provided in your progress report or is available on your website, a link will be sufficient. We are especially interested in your assessment of the maturity of your software tools and the impact they are having on the scientific community.
- 5 Logistics
Materials requested for NCBC Program Review
These are due to Gwen Jacobs by Friday, June 08, 2007.
Q1: A copy of two parts of your most recent progress report: the summary section and the highlights section.
(Tina/some edits from Marty --Tina--you had schizophrenia listed as a NEW DBP, I put in VCFS--if this was in the annual report, it needs to be fixed. Also, under highlights--- I see what Core 1 was able to do because of data provided by core 3, but there are no highlights of what core 3 did with what Core 1 gave them --- don't you think this should be fixed? That is, there are NO HIGHLIGHTS listed for CORE 3 --)
Summary: http://www.na-mic.org/Wiki/index.php/2007_Annual_Scientific_Report#1._Introduction
The National Alliance for Medical Image Computing (NA-MIC) is now in its third year. This Center comprises a multi-institutional, interdisciplinary team of computer scientists, software engineers, and medical investigators who have come together to develop and apply computational tools for the analysis and visualization of medical imaging data. A further purpose of the Center is to provide infrastructure and environmental support for the development of computational algorithms and open source technologies, and to oversee the training and dissemination of these tools to the medical research community. The driving biological projects (DBPs) for the first three years of the Center were drawn from schizophrenia research, although the methods and tools developed are clearly applicable to many other diseases.
In the first year of this endeavor, our main focus was to develop alliances among the many cores to increase awareness of the kinds of tools needed for specific imaging applications. Our first annual report and all-hands meeting reflected this emphasis on cores, which was necessary to bring together members of an interdisciplinary team of scientists with such diverse expertise and interests. In the second year of the center our emphasis shifted from the integration of cores to the identification of themes that cut across cores and are driven by the requirements of the DBPs. We saw this shift as a natural evolution, given that the development and application of computational tools became more closely aligned with specific clinical applications. This change in emphasis was reflected in the Center's four main themes, which included Diffusion Tensor Analysis, Structural Analysis, Functional MRI Analysis, and the integration of newly developed tools into the NA-MIC Tool Kit. In the third year of the center, collaborative efforts have continued along each of these themes among computer scientists, clinical core counterparts, and engineering partners. We are thus quite pleased with the focus on themes, and we also note that our progress has not only continued but that more projects have come to fruition with respect to publications and presentations from NA-MIC investigators, which are listed on our publications page.
Below, in the next section (Section 2) we summarize our progress over the last year using the same four central themes to organize the progress report. These four themes include: Diffusion Image analysis (Section 2.1), Structural analysis (Section 2.2), Functional MRI analysis (Section 2.3), and the NA-MIC toolkit (Section 2.4). Section 3 highlights four important accomplishments of the third year: advanced algorithm development in Shape and DTI analysis, the newly architected open source application platform, Slicer 3, and our outreach and technology transfer efforts. Section 4 summarizes the impact and value of our work to the biocomputing community at three different levels: within the center, within the NIH-funded research community, and externally to a national and international community. The final section of this report, Section 5, provides a timeline of Center activities.
In addition, the end of the first three years of the center marks a transition from the first set of DBPs that were focused entirely on Schizophrenia to a new set that span a wider range of biological problems. The new DBPs include a focus on Systemic Lupus Erythematosus (MIND Institute, University of New Mexico), Velocardiofacial Syndrome (Harvard), and Autism (University of North Carolina, Chapel Hill), along with a direction that is new but synergistic for NA-MIC: Prostate Interventions (Johns Hopkins University). Funding for the second round of DBPs starts in the next cycle, but the PIs were able to attend the recent All-hands meeting and start developing plans for their future research in NA-MIC.
Finally, we note that Core 3.1 (Shenton and Saykin) is in the process of applying for a Collaborative R01 to expand current research with NA-MIC, which ends on July 31, 2007. Both Drs. Shenton and Saykin have worked for three years driving tool development for shape measures, DTI tools, and path analysis measures for fMRI as part of the driving biological project in NA-MIC. They now plan to expand this research in a Collaborative R01 by working closely with Drs. Westin, Miller, Pieper, and Wells to design, assess, implement, and apply tools that will enable the integration of MRI, DTI, and fMRI in individual subjects, as well as to develop an atlas of functional networks and circuits based on a DTI atlas (i.e., structural connectivity), which will be integrated with a network of functional connectivity identified from fMRI probes of attention, memory, emotion, and semantic processing. We mention this here because this will be, to our knowledge, the first DBP to apply for further funding to continue critical work begun with NA-MIC.
Highlights: http://www.na-mic.org/Wiki/index.php/2007_Annual_Scientific_Report#3._Highlights
The third year of the NA-MIC project saw continued development and dissemination of medical image analysis software. The current progress is clearly characterized by a significant increase in the application of Core-1 and Core-2 tools to image data provided by the Core-3 DBP groups. The reasons for this progress are two-fold: first, several new methods have moved out of the prototype stage and are ready to be applied to large collections of imaging datasets; second, new high-resolution imaging data from high-field scanners are better suited to the new tools than historical data, which often had very coarse slice resolution.
With the release of the first version of Slicer3, the transfer of this technology is accelerating. Because of NA-MIC's strong ties with several large open source communities, such as ITK, VTK, and CMake, NA-MIC continues to have a significant impact on the nation's broader biocomputing infrastructure. The following are just a few of the many highlights from the third year of the NAMIC effort.
- Advanced Algorithms
Core 1 continues to lead the biomedical community in DTI and shape analysis.
- NA-MIC published an open source framework for shape analysis, including providing access to the open source software repository. Shape analysis has become of increasing relevance to the neuroimaging community due to its potential to precisely locate morphological changes between healthy and pathological structures. The software has been downloaded many times since the first online publication in October 2006, and is now used by several prestigious image analysis groups.
- The spherical wavelet-based shape analysis package has been contributed to ITK, and in the next few months the multiscale segmentation work will be incorporated as well.
- The NA-MIC community has implemented a very fast method for the optimal transport approach to elastic image registration, which is currently being added to ITK.
- The NA-MIC toolkit includes a comprehensive set of modules for analysis of diffusion weighted images (DWI), including improved calculation of tensors, interpolation, nonlinear deformation and statistics on tensor fields, novel methods for tractography and for optimal path finding, and clustering of sets of streamlines (a minimal, illustrative sketch of a tensor-estimation pipeline appears after this list).
- A quantitative tractography package for user-guided geometric parametrization and statistical analysis of fiber bundles (FiberViewer) has been contributed to the NAMIC toolbox. This tool is used in several ongoing clinical DTI studies.
- The conformal flattening algorithm has been implemented as an ITK filter and is in the NA-MIC Sandbox in preparation for formal acceptance into the NA-MIC Kit.
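As a minimal illustration of how a tensor-estimation step of the kind listed above is typically assembled with ITK, the sketch below estimates diffusion tensors from a baseline volume and a set of diffusion-weighted volumes. It is a sketch under stated assumptions, not the NA-MIC module itself: the file names, b-value, and gradient direction are placeholders.

```cpp
// Illustrative sketch only (not the NA-MIC pipeline): estimating diffusion
// tensors from diffusion-weighted volumes with ITK. argv[1] is a baseline
// (b=0) volume, argv[2..] are gradient volumes; values are hypothetical.
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkDiffusionTensor3DReconstructionImageFilter.h"

int main(int argc, char* argv[])
{
  typedef unsigned short                    PixelType;
  typedef itk::Image<PixelType, 3>          DwiImageType;
  typedef itk::ImageFileReader<DwiImageType> ReaderType;
  typedef itk::DiffusionTensor3DReconstructionImageFilter<PixelType> TensorFilterType;

  TensorFilterType::Pointer tensorFilter = TensorFilterType::New();
  tensorFilter->SetBValue(1000.0);  // hypothetical acquisition b-value

  // Baseline (b=0) image.
  ReaderType::Pointer baselineReader = ReaderType::New();
  baselineReader->SetFileName(argv[1]);
  baselineReader->Update();
  tensorFilter->SetReferenceImage(baselineReader->GetOutput());

  // One diffusion-weighted image per gradient direction.
  for (int i = 2; i < argc; ++i)
  {
    ReaderType::Pointer reader = ReaderType::New();
    reader->SetFileName(argv[i]);
    reader->Update();

    // Placeholder direction; real directions come from the scan protocol.
    TensorFilterType::GradientDirectionType direction;
    direction[0] = 1.0; direction[1] = 0.0; direction[2] = 0.0;
    tensorFilter->AddGradientImage(direction, reader->GetOutput());
  }

  tensorFilter->Update();
  // The output is a volume of tensor pixels, the input to scalar maps (FA,
  // MD), interpolation, tractography, and fiber-bundle statistics.
  return 0;
}
```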
- Technology Deployment Platform: Slicer3
Core 2 in conjunction with Algorithms (Core 1) and DBP (Core 3) are creating new tools to accelerate the transition of technology to the biomedical imaging community.
- One of the year's major achievements was the release of the first viable version of the Slicer3 application, which evolved from concept to a full-featured application. The second beta version of Slicer3 was released in April 2007. The application provides a full range of functionality for loading, viewing, editing, and saving models, volumes, transforms, fiducials, and other common medical data types. Slicer3 also includes a powerful execution model that enables Core 1 developers (and others in the NA-MIC community) to easily deploy algorithms to Core 2 and other biocomputing clients.
- Slicer3's execution model supports plug-in modules. These modules can be run stand-alone or integrated into the Slicer3 framework. When integrated, the GUI for a module can be automatically generated from an associated XML file describing the module's input parameters. A variety of modules were created, ranging from simple image processing algorithms to complex, multi-step segmentation procedures. A minimal sketch of such a stand-alone module appears after this list.
- To stress test Slicer3's architecture and demonstrate its capabilities, the EM Segment module (http://wiki.na-mic.org/Wiki/index.php/Slicer3:EM) was created and added to Slicer's library of modules. EM Segment is an automatic segmentation algorithm for medical images and represents a collaborative effort between the NAMIC engineering, algorithms, and biological problem cores. The EM Segment module enables users to quickly configure the algorithm to a variety of imaging protocols as well as anatomical structures through a wizard-style, workflow interface. The workflow tools have been integrated into the NA-MIC Kit, and are now available to all other modules built on the Slicer3 framework.
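The sketch below illustrates the kind of stand-alone command-line module the execution model can host. It is an assumed, minimal example rather than an actual NA-MIC module: in the real framework the parameters would be declared in an accompanying XML description from which Slicer3 generates the GUI, whereas here they are simply read from argv so the example stays self-contained.

```cpp
// Toy stand-alone "module": read a volume, apply a binary threshold, write a
// label map. The parameter handling is deliberately simplistic; a real
// Slicer3 module would declare these parameters in its XML description.
#include <cstdlib>
#include <iostream>
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkBinaryThresholdImageFilter.h"

int main(int argc, char* argv[])
{
  if (argc < 5)
  {
    std::cerr << "Usage: " << argv[0]
              << " inputVolume outputLabelMap lowerThreshold upperThreshold\n";
    return EXIT_FAILURE;
  }

  typedef itk::Image<short, 3>          InputImageType;
  typedef itk::Image<unsigned char, 3>  LabelImageType;

  itk::ImageFileReader<InputImageType>::Pointer reader =
    itk::ImageFileReader<InputImageType>::New();
  reader->SetFileName(argv[1]);

  // The "algorithm" of this toy module: a simple binary threshold.
  typedef itk::BinaryThresholdImageFilter<InputImageType, LabelImageType> FilterType;
  FilterType::Pointer threshold = FilterType::New();
  threshold->SetInput(reader->GetOutput());
  threshold->SetLowerThreshold(static_cast<short>(atoi(argv[3])));
  threshold->SetUpperThreshold(static_cast<short>(atoi(argv[4])));
  threshold->SetInsideValue(1);
  threshold->SetOutsideValue(0);

  itk::ImageFileWriter<LabelImageType>::Pointer writer =
    itk::ImageFileWriter<LabelImageType>::New();
  writer->SetInput(threshold->GetOutput());
  writer->SetFileName(argv[2]);
  writer->Update();

  return EXIT_SUCCESS;
}
```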
- Outreach and Technology Transfer
Cores 4, 5, and 6 continue to provide support, training, and dissemination to the NA-MIC community and the broader biomedical computing community.
- NA-MIC continues to practice the best of collaborative science through its bi-annual Project Week events. These events, which gather key representatives from Cores 1-7 and external collaborators, are organized to gather experts from a variety of domains to address current research problems. This year's first Project Week was held in January and hosted by the University of Utah. It saw several significant accomplishments including the first beta release of the next generation Slicer3 computing platform. The second Project Week is scheduled for June in Boston, MA.
- Twelve NA-MIC-supported papers were published in high-quality peer reviewed conference proceedings (four papers in MICCAI alone). Another paper on the NAMIC software process was published in IEEE Software. All three DTI papers presented at MICCAI last year were NAMIC associated.
- Several workshops were held throughout the year at various institutions. These include the DTI workshop at UNC, the MICCAI Open Source Workshop, and the NA-MIC Training Workshop at the Harvard Center for Neurodegeneration and Repair. Training and dissemination will continue with a DTI workshop at the forthcoming Human Brain Mapping meeting and activities at MICCAI 2007, among others.
Q2
A brief statement - (one page per question, max) addressing each of the questions listed below. These are the questions that we have been asked to address in our report. Our goal in asking for this information is to be able to produce a report that reviews the program as a whole. Your view, from the vantage point of the center you direct, is critical to our work. In addition, your answers will provide us with more information that we can use in our discussion with program staff on June 11th. We know that some of this information can be found on your websites, so in those cases a link to the information would be most helpful.
Q2.1 To what extent does the vision and direction of the NCBC initiative promote biomedical computing?
(Eric/Polina)
(From Ross/edits Marty--THIS SECTION I THINK NEEDS MORE WORK AS I DON'T UNDERSTAND HOW IT ANSWERS THE QUESTION BEING ASKED) The fields of biological and medical imaging are exploding. Moreover, the combination of new acquisition and reconstruction techniques, new computing resources, and diverse biological and clinical applications has resulted in a massive proliferation of imaging data. These data have the potential to make an important impact on basic biological science, on the development of new drugs and medical technologies, and on direct clinical practice, in both diagnosis and treatment. Thus, biomedical image analysis is one of the most important applications of biomedical computing.
To realize this potential will require new computational tools for image analysis. These tools will rely on a diverse set of technical knowledge, including physics, systems, computer science, mathematics, and statistics. The development of these tools will also incorporate in-depth knowledge of clinical and biological applications in diverse areas such as neuroscience, psychiatry, oncology, cardiology, and biochemistry.
These computational tools must be scalable in several ways. First, they must be computationally scalable: tools for image analysis must be suitable for quantitative analyses on large sets of 3D data, and many current software packages for image processing are not appropriate for this. In order to serve computational needs across NIH, image analysis tools must also scale across application domains and address a variety of clinical and biomedical problems. In the spirit of the National Center, these tools must also scale across institutions and research groups; that is, they must be usable and sustainable outside the context of the Center itself.
I DON'T REALLY UNDERSTAND WHAT POINT IS BEING MADE IN THE PARAGRAPH BELOW??? (MARTY) The NAMIC NCBC addresses these issues through a variety of mechanisms. First, it is truly a national center, with researchers from seven (?) institutions across the US who represent expertise in the wide range of disciplines described above. The distributed nature of the project provides technical and clinical expertise, but it also enforces openness of the resource: the infrastructure must be accessible in order for the center to operate effectively. NAMIC also includes a set of industrial partners in the engineering core (Core 2), who provide a set of scalable tools for developing, maintaining, and distributing software. The software is universally manageable and maintainable, and it does not belong to any one research group --- it belongs to the community. The tools are also scalable across application domains. HUH??? While the initial DBPs focused on neuroscience, the new DBPs include oncology, and associated R01s include biomechanics/orthopedics. Users of the software are even more diverse, and even include fields outside of medicine and biology. The strategy of NAMIC is to make sure that the effort scales over time as well. The licensing agreement enforced on all of our development activities does not restrict use, and thus we anticipate (and promote) commercial use of the NAMIC software so that the impact of the project can be realized beyond the lifetime of the center.
Q2.2 In what ways has the NCBC initiative advanced biomedical computing?
(Tina)
In NA-MIC's third year, it is evident that NA-MIC is developing a culture, environment, and resources that foster collaborative research in medical image analysis and that draw together mathematicians, computer scientists, software engineers, and clinical researchers. These elements of NA-MIC shape how NA-MIC operates, make NA-MIC a fulcrum for NIH-funded research, and draw new collaborators from across the country and around the world to NA-MIC.
- Impact within the Center
- Within the center, the NA-MIC organization, NA-MIC processes, and the NA-MIC calendar have permeated the research. The organization is nimble, forming ad hoc distributed teams within and between cores to address specific biocomputing tasks. Information is shared freely on the NA-MIC Wiki, on the weekly Engineering telephone conferences, and in the NA-MIC Subversion source code repository. The software engineering tools of CMake, Dart 2 and CTest, CPack, and KWWidgets facilitate a cross-platform software environment for medical image analysis that can be easily built, tested, and distributed to end users. Core 2 has provided a platform, Slicer 3, that allows Core 1 to easily integrate new technology and deliver it in an end-user application to Core 3. Core 1 has developed a host of techniques for structural and diffusion analysis, which are under evaluation by Core 3. Major NA-MIC events, such as the annual All Hands Meeting, the Summer Project Week, the Spring Algorithms meeting, and the Engineering teleconferences, are avidly attended by NA-MIC researchers as opportunities to foster collaborations.
- Impact within NIH Funded Research
- Within NIH funded research, NA-MIC continues to forge relationships with other large NIH funded projects such as BIRN, caBIG, NAC, and IGT. Here, we are sharing the NA-MIC culture, engineering practices, and tools. BIRN hosts data for the NA-MIC researchers and NA-MIC hosts BIRN wikis. caBIG lists the 3D Slicer among the applications available on the National Cancer Imaging Archive. NAC and IGT use the NA-MIC infrastructure and are involved in the development of the 3D Slicer. BIRN recently held an event modeled after the NA-MIC Project Week. NA-MIC has become a resource on open source licensing to the medical image analysis community.
NA-MIC is also attracting NIH funded collaborations. Two grants have been funded under PAR-05-063 to collaborate with NA-MIC: Automated FE Mesh Development and Measuring Alcohol and Stress Interactions with Structural and Perfusion MRI. Five additional applications to collaborate with NA-MIC via the NCBC collaborative grant mechanism are under consideration. Additional grant applications submitted under other calls are planning to use and extend the NA-MIC tools.
- National and International Impact
- NA-MIC events and tools garner national and international interest. There were nearly 100 participants at the NA-MIC All Hands Meeting in January 2007, with many of these participants from outside of NA-MIC. Several researchers from outside the NA-MIC community have attended the Summer Project Weeks and the Winter Project Half-Weeks to gain access to the NA-MIC tools and people. These external researchers are contributing ideas and technology back into NA-MIC.
- Components of the NA-MIC kit are used globally. The software engineering tools of CMake, Dart 2 and CTest are used by many open source projects and commercial applications. For example, the K Desktop Environment (KDE) for Linux and Unix workstations uses CMake and Dart. KDE is one of the largest open source projects in the world. Many open source projects and commercial products are benefiting from the NA-MIC related contributions to ITK and VTK. Finally, Slicer 3 is being used as an image analysis platform in several fields outside of medical image analysis, in particular, biological image analysis, astronomy, and industrial inspection.
- NA-MIC co-sponsored the Workshop on Open Science at the Medical Image Computing and Computer-Assisted Intervention (MICCAI) 2006 conference. The proceedings of the workshop are published on the electronic Insight Journal, another NIH-funded activity.
- Over 50 NA-MIC related publications have been produced since the inception of the center.
Q2.3 Are the NCBCs interfacing appropriately? (recommended by RICC)
(Will Schroeder)
Q2.4. What new collaborations have been formed through the NCBC initiative?
(Jim Miller)
NA-MIC's structure and organization have facilitated many new collaborations. NA-MIC is a distributed center, bringing together mathematicians, computer scientists, software engineers, and clinicians from multiple sites. This distributed structure has produced two types of new collaborations within NA-MIC: new collaborations between cores and new collaborations within cores. For between-core collaborations, many of the algorithm and engineering core researchers had not previously collaborated with the researchers in either the first or second round of Driving Biological Projects (DBPs). Thus, the NCBC provided a unique opportunity for the algorithm and engineering core researchers to gain clinical insight and to adapt and tune their algorithms and tools to new clinical contexts. Conversely, the DBPs gained access to algorithms and tools that they had not previously utilized. Similarly, many of the algorithm core researchers and engineering core researchers had not previously collaborated. Thus, the NCBC exposed the researchers in the algorithm core to the tools and engineering practices of the engineering core, and exposed the researchers in the engineering core to the computational techniques and data structures utilized by the algorithms core. For within-core collaborations, many of the researchers within the algorithm core had not previously collaborated. Through NA-MIC, these researchers have been able to cooperate, and also amicably compete, to address the issues brought forth by the DBPs.
Below is a list of new collaborations within NA-MIC:
- Georgia Tech + UC Irvine – Rule based segmentation algorithm for DLPFC
- Georgia Tech + Kitware - Knowledge-based Bayesian classification and segmentation
- BWH + MIT + Kitware - Brain tissue classification and subparcellation of brain structures
- Dartmouth + UNC + BWH - Shape analysis of the hippocampus
- Utah + BWH - Automated shape model construction
- Dartmouth + Isomics - Neural substrates of apathy in schizophrenia
- <others to be filled in>
NA-MIC has also attracted researchers from the field who were not originally part of NA-MIC. Some of these new collaborations are formally organized using the NIH NCBC Collaborative R01 program. But other collaborations are being driven solely by the opportunity to share resources, techniques, capabilities, and ideas.
Below is a list of new collaborations between external researchers and NA-MIC:
- Mario Negri + GE Research – Integration of vmtk with Slicer 3
- <others to be filled in>
The collaborative nature of NA-MIC is exemplified by the attendance at the NA-MIC All Hands Meeting and the NA-MIC Summer Project Week. Researchers from within and external to NA-MIC come together at these two events to forge collaborations. At the 2007 NA-MIC All Hands Meeting alone, there were 96 attendees: 56 NA-MIC researchers, 32 NA-MIC collaborators from 13 institutions, and 8 members of the External Advisory Board and NIH. At the Project Half-Week run in conjunction with the All Hands Meeting, there were 38 projects: 16 initiated from the algorithm core, 10 specific to the engineering core, and 11 from external collaborators.
More detailed information on collaborations as well as Project Week events can be found at:
- http://www.na-mic.org/Wiki/index.php/NA-MIC_Collaborations
- http://www.na-mic.org/Wiki/index.php/Engineering:Programming_Events
Q2.5. What new training opportunities have the centers provided?
(Randy Gollub)
- The NCBC Program fosters a unique perspective: providing training opportunities and resources that are specifically targeted to a multi-disciplinary audience of basic and clinical biomedical scientists, computer scientists, and medical imaging scientists. Within NA-MIC, this perspective has given rise to a thriving training program that supports the biomedical research community within NAMIC, across the NIH community, and around the world. The strong demand for our training resources is evident from the large number of hits to our training web pages, from the rapid enrollment in all offered workshops, and from the positive feedback from participants. We believe, and our belief is supported by the documented backgrounds of our workshop attendees, that a key aspect of our training materials that makes them useful to the community is that they are learner-centered, goal-oriented, and targeted to bridge the gaps in technical knowledge and language that exist between basic and clinical biomedical scientists, computer scientists, and medical imaging scientists. For example, a tutorial that teaches how to use Slicer to register two images includes not only the necessary details of how to run the algorithm, but also the conceptual framework for the registration approach, the mathematical underpinnings of the algorithm, and a detailed anatomical approach for visually inspecting and refining the registration. This rich but simple approach provides a consistently educational experience for every new user of the NAMIC toolkit.
- NAMIC-supported new training opportunities are developed to maximize impact on the wider scientific community. The primary vehicle for this is "Slicer 101", our portfolio of Slicer training tutorials (http://www.na-mic.org/Wiki/index.php/Slicer:Workshops:User_Training_101). We have focused our efforts on making all our tutorial materials available via the NA-MIC Wiki as downloadable Powerpoint presentations and accompanying curated, anonymized datasets. The tutorials are all carefully tested on multiple computer platforms and by our team before being used in live Workshops (http://www.na-mic.org/Wiki/index.php/Training:Events_Timeline). Refinements are made based on the feedback of the audience and our experience during the teaching sessions. The final product of our work allows any new user, regardless of educational background, to not only use the NAMIC tools and algorithms, but to understand what they are doing and why. To date we have had over 7,880 hits to the Slicer 101 webpage.
- NAMIC-supported Workshops are another unique venue for multi-disciplinary training. In addition to all the points made regarding the content of the training materials, the 14 Workshops run by the NAMIC Training core over the past 3 years have each provided the opportunity for new connections to be made among basic and clinical biomedical scientists, computer scientists, and medical imaging scientists. All our Workshops provide opportunities for formal and informal discussions among attendees of diverse backgrounds and strengths. These hands-on, interactive workshops allow participants to translate concepts of medical image processing into skills through instructor-led training. The simplicity of our approach, and the exceptional quality of the NAMIC toolkit, ensures a very high success rate for knowledge and skill acquisition. We estimate that 370 people from 52 different universities and companies attended our Workshops between 2005 and 2006.
- We are currently focusing our efforts on reaching a wider community by delivering a more didactic Workshop in conjunction with the upcoming Organization for Human Brain Mapping meeting in Chicago next week. We held our enrollment to 50 so that we could offer the same hands-on interactive training experience to the attendees, and our registration filled within a few weeks of the offering being posted. Tentatively we anticipate that (stats on the diverse geography of registrants to come here).
Q2.6. What changes could make the program more effective in the future?
(Steve Pieper)
- National Visibility
The overall objectives of the NCBC program call for building computational infrastructure for the nation's biomedical research efforts -- this ambitious goal includes not only computational and biomedical science, areas in which the existing centers have extensive experience, but also nationwide "marketing" efforts, where the scientific community is less adept. Several centers, including NA-MIC, have developed novel approaches to this problem through their training and dissemination cores, but as the output of the centers grows along with the number of potential users in the community and the potential impact, these critical resources will be increasingly strained. There are several approaches the overall program could take to address this issue, including supplemental funding to host workshops or conferences, providing small travel grants for researchers or students to visit the centers, or actively encouraging a wider range of NIH-funded researchers to adopt the tools generated by the NCBCs. In particular, the collaboration PAR has been very effective as a mechanism for encouraging busy scientists to consider adopting the NCBC tools and is an excellent example of how the program can extend the impact of the basic investment in scientific infrastructure.
- Local Autonomy
The program should avoid adding extra layers of uniformity to what are fundamentally unique centers. The NCBC program has successfully established a distributed network of centers drawing on the expertise of some of the nation's leading researchers, drawn to the program by the opportunity to develop and apply their know-how to this ambitious effort. This rich environment, predictably, yields a diversity of approaches and organizational structures as each of the centers works to implement its particular vision of how to fulfill the overall mission of the NCBC program. Preserving the vitality of the effort depends on retaining this autonomy as each center strives to meet the individual objectives suited to its community. The program needs well-defined goals that each center must meet, but the overall program should facilitate the individual solutions of each center's leadership.
Q2.7. What lessons have been learned from the NCBC initiative that can guide future NIH efforts in biomedical computing?
(Martha Shenton) COMMENT from Randy- I recommend that folks from Cores 1 & 2 also contribute here to give a more balanced response. All of Marty's points are well taken, but I wonder if there are other points also to be made here that will be valuable.
- Greater Start Up Time and Steeper Learning Curve Than Anticipated
Bringing together computer scientists, engineers, and biomedical researchers, with diverse interests, training, and backgrounds, for the purpose of working on a set of biological problems is no easy feat and, in retrospect, required a steeper learning curve than was anticipated. This steeper learning curve is understandable, since the main focus initially was on developing alliances among the cores in order to increase awareness about the kinds of tools needed for the specific imaging problems posed by the biomedical researchers who were driving the biological problems. The first year of the grant, as noted in our annual report, thus reflected a "core" emphasis, as an interdisciplinary team was brought together, many members for the first time. It was not until the second year of the grant that the "core" emphasis shifted to a focus on "themes", which cut across "core" boundaries. While this shift was viewed as part of a natural evolution, knowing this now, we can help guide future NIH efforts by suggesting that specific projects/clinical applications be highlighted in the first few months of the grant, based on meetings among core members, so as to facilitate a focus on clinical applications from the outset.
Such an early emphasis on clinical applications/problems would also facilitate an early focus on the development and application of computational tools, which could then be more closely aligned with specific clinical problems and applications. This would break down the artificial barriers that a "core" focus involves; while such barriers seem an inherent part of the initial stages, they could be curtailed by highlighting early the need to focus on specific needed applications. In this way, the needed applications of the driving biological problems could form natural groupings that involve members from all cores, and work groups could be set up from the beginning that reflect a "theme"/"application" approach. This would also foster more communication between core members, which would in turn facilitate ongoing communication among computer scientists, engineers, and biomedical researchers.
- Algorithm Development Needs to be Interactive and Not Sequential
In reviewing the last three years of the driving biological problem, schizophrenia, it is evident that tool development involving multiple interactions among members of Core 1 (Computer Scientists), Core 2 (Engineers), and Core 3 (Driving Biological Problem), at all stages of development, led both to computational tools that were more tailored to the specific applications needed by Core 3 members and to tools that were better optimized for general use. This interactive mode of tool development is in contrast to tool development that proceeded more sequentially, where one or several members of Core 1 and Core 3 met, and then Core 1 proceeded with their understanding of the problem and developed a tool with very little further input from Core 3 until the tool was delivered. The latter approach often resulted in delays in receiving the tool, as there was less communication between Core 1 and Core 3 members in these instances, and often the tool did not really meet the specific needs of the application without further work. In the future, and based on this experience, NIH initiatives should emphasize the importance of encouraging a more "interactive" approach to tool development and of discouraging what is termed here a more "sequential" approach. With a more interactive approach, progress can be more readily evaluated at each phase of tool development, and input and testing can be provided based on more communication among members of Cores 1, 2, and 3. A more "interactive" model is also far more responsive to the needs of the driving biological problem, and keeps the focus on the clinical application.
Focusing on interactions across core members will also likely help flatten the steep learning curve inherent in early interactions across core members (see above).
- Do Not Limit Driving Biological Problems to 3 Years
Given the problems of (1) bringing new investigators together from diverse backgrounds and (2) a "core" focus that detracts from focusing on the clinical problems, a third problem is (3) limiting the time of the driving biological problem to three years. Even if the timetable for getting investigators working together could be improved, and even if researchers across cores focused on specific clinical problems right from the start, limiting the driving biological problem to 3 years is not realistic. This is particularly the case given that it is only in the third year that the application of tools to clinical problems really begins to take shape. This is also the period when the driving biological problems are ready to reap the benefits of the new tools, and when members representing the driving biological problem are ready to provide further feedback to computer scientists and engineers with respect to refining the tools so as to make them better suited to the task at hand, as well as making the new tools more user friendly for wider use. To end the driving biological problems at a time when the fruits of this labor are just being reaped severely curtails the completion of the application of new tools to clinical problems. There is also less time to confirm and validate findings, so as to determine that the findings are not a reflection of a methodological confound introduced by the new tool.
SYLVAIN OTHER??
Q3: A list of publications and/or software tools produced by the Center. If this information is provided in your progress report or is available on your website, a link will be sufficient. We are especially interested in your assessment of the maturity of your software tools and the impact they are having on the scientific community.
(Will Schroeder, Allen Tannenbaum--THE PUBLICATION LIST NEEDS UPDATING. O'Donnell paper, for example was out there a while ago in AJNR--things missing from our group also)
A3:
- NA-MIC Publications are available here: http://www.na-mic.org/Wiki/index.php/Publications.
- NA-MIC Software Tools are available here: http://www.na-mic.org/Wiki/index.php/NA-MIC-Kit
The Center has created and extended a number of software tools to handle some of the key problems in medical imaging analysis and to deliver these computational technologies via a suite of applications, toolkits, and infrastructure facilities. A summary description of these tools includes:
- Slicer3 (application) - an application platform for deploying imaging technologies, newly architected with an execution model facilitating integration, workflow, and large-scale computing. While Slicer3 is the newest addition to the NAMIC Kit, it is built on pre-existing, mature toolkits, so the application is relatively mature and is already in use. Because Slicer3 supports plug-in modules, active development is proceeding to create and package various modules for dissemination to the NAMIC community.
- ITK (toolkit) - a mature system for image analysis, registration and segmentation (initially created in 1999). ITK is in use worldwide for medical imaging research and development.
- VTK (toolkit) - a mature system for visualization, graphics, volume rendering, and interaction (initially created in 1993). VTK is used worldwide for research, teaching and commercial development.
- DART (computational infrastructure) - a key component of the NAMIC quality control process, DART is used to coordinate and report the software testing process. It was created in the first year of NAMIC and has been in constant use since, and is therefore a mature system.
- CMake/CTest/CPack (computational infrastructure) - CMake and CTest are relatively mature systems used to manage the building and testing of software across diverse computer platforms. CMake is used worldwide by some of the world's largest open source systems such as KDE. CPack, a recent addition to the NAMIC kit, is used to simplify the packaging and dissemination of software across platforms. Thus in NAMIC we can easily deploy our software across Windows, Linux, Unix, and Mac platforms.
- Other tools (computational infrastructure) - Many other software tools are used to support the development of advanced imaging applications, and to assist with large scale computing, including
- Teem - image processing tools (mature)
- KWWidgets - Open source, cross platform GUI toolkit (mature, but development continues to support workflow).
- BatchMake - Support large-scale computing, including grid computing, for performing large population studies and statistical analysis (under active development).
These tools address key problems in imaging, including segmentation, registration, visualization, and shape analysis, or provide facilities supporting researchers and developers who wish to create advanced software applications. One of the key characteristics of NAMIC is that we treat the development of advanced medical image analysis software holistically; that is, the complete cycle of algorithm design, efficient implementation, quality control, and dissemination is needed to effectively address the challenges provided by the driving biological problems. Examples of how these tools are being used include the following:
(a) Segmentation: Here there are a variety of tools of varying degrees of maturity, ranging from the EM Segmenter (a widely distributed, mature algorithm included in Slicer3 as an application plug-in) to more recent work on DTI segmentation based on directional flows, which is Matlab and C++ based. Powerful, mature tools such as Bayesian segmentation have recently been included in Slicer (and have been available in ITK for some time now), and these can be combined with very recent work on the semi-automated segmentation of the DLPFC done in collaboration with Core 3 researchers. Further, tools previously developed by NAMIC researchers that have had a wide distribution have been put into Slicer. For example, geometry-based segmentation methods (some of which were included in packages marketed by GE) were tailored for cortical segmentation, included in Slicer, and in fact improved with the inclusion of statistically based approaches.
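For readers unfamiliar with how such segmentation tools are composed, the following is a minimal, generic ITK sketch (edge-preserving smoothing followed by seeded region growing). It is illustrative only: it is not the EM Segmenter, the Bayesian classifier, or the DLPFC tool mentioned above, and the file names, seed location, and parameters are hypothetical.

```cpp
// Generic ITK segmentation pipeline sketch: smooth, then grow a region from
// a seed using local intensity statistics, and write the resulting label map.
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkCurvatureFlowImageFilter.h"
#include "itkConfidenceConnectedImageFilter.h"

int main(int argc, char* argv[])
{
  typedef itk::Image<float, 3>         InternalImageType;
  typedef itk::Image<unsigned char, 3> LabelImageType;

  itk::ImageFileReader<InternalImageType>::Pointer reader =
    itk::ImageFileReader<InternalImageType>::New();
  reader->SetFileName(argv[1]);

  // Edge-preserving smoothing to stabilize the region-growing statistics.
  typedef itk::CurvatureFlowImageFilter<InternalImageType, InternalImageType> SmoothType;
  SmoothType::Pointer smooth = SmoothType::New();
  smooth->SetInput(reader->GetOutput());
  smooth->SetNumberOfIterations(5);
  smooth->SetTimeStep(0.0625);

  // Seeded region growing driven by local intensity statistics.
  typedef itk::ConfidenceConnectedImageFilter<InternalImageType, LabelImageType> SegmentType;
  SegmentType::Pointer segment = SegmentType::New();
  segment->SetInput(smooth->GetOutput());
  segment->SetMultiplier(2.5);
  segment->SetNumberOfIterations(3);
  segment->SetReplaceValue(1);

  InternalImageType::IndexType seed;           // hypothetical seed voxel,
  seed[0] = 128; seed[1] = 128; seed[2] = 60;  // inside the target structure
  segment->SetSeed(seed);

  itk::ImageFileWriter<LabelImageType>::Pointer writer =
    itk::ImageFileWriter<LabelImageType>::New();
  writer->SetInput(segment->GetOutput());
  writer->SetFileName(argv[2]);
  writer->Update();
  return 0;
}
```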
(b) Registration: Similar remarks can be made for registration, where we have a spectrum ranging from very mature methods to very recent ones that are still being tested. In particular, mature, widely distributed methodologies for rigid registration are now included in ITK, as are spline-based registration methodologies. These are well-tested methods that have been made accessible to the general imaging community. Newer methodologies, such as those based on optimal transport for elastic registration, are being included in ITK. NAMIC has also pushed for fast implementations of its algorithms that run on inexpensive, widely available platforms. Taking a cue from the game industry, some algorithms have been ported to GPUs (graphics cards), which are now being employed as computing devices. This has led to a speed-up of almost two orders of magnitude on some of the registration algorithms being tested.
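As an illustration of the mature ITK registration machinery referred to above, the sketch below wires a metric, an optimizer, a transform, and an interpolator into itk::ImageRegistrationMethod. It is a sketch, not tuned NAMIC settings: a translation-only transform is used for brevity (a rigid versor transform plugs in the same way), and the file names and optimizer parameters are illustrative.

```cpp
// Minimal ITK intensity-based registration sketch (translation-only).
#include <iostream>
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageRegistrationMethod.h"
#include "itkTranslationTransform.h"
#include "itkMeanSquaresImageToImageMetric.h"
#include "itkLinearInterpolateImageFunction.h"
#include "itkRegularStepGradientDescentOptimizer.h"

int main(int argc, char* argv[])
{
  typedef itk::Image<float, 3> ImageType;

  itk::ImageFileReader<ImageType>::Pointer fixedReader  = itk::ImageFileReader<ImageType>::New();
  itk::ImageFileReader<ImageType>::Pointer movingReader = itk::ImageFileReader<ImageType>::New();
  fixedReader->SetFileName(argv[1]);
  movingReader->SetFileName(argv[2]);
  fixedReader->Update();
  movingReader->Update();

  typedef itk::TranslationTransform<double, 3>                     TransformType;
  typedef itk::MeanSquaresImageToImageMetric<ImageType, ImageType> MetricType;
  typedef itk::LinearInterpolateImageFunction<ImageType, double>   InterpolatorType;
  typedef itk::RegularStepGradientDescentOptimizer                 OptimizerType;
  typedef itk::ImageRegistrationMethod<ImageType, ImageType>       RegistrationType;

  TransformType::Pointer    transform    = TransformType::New();
  MetricType::Pointer       metric       = MetricType::New();
  InterpolatorType::Pointer interpolator = InterpolatorType::New();
  OptimizerType::Pointer    optimizer    = OptimizerType::New();
  RegistrationType::Pointer registration = RegistrationType::New();

  optimizer->SetMaximumStepLength(4.0);   // illustrative optimizer settings
  optimizer->SetMinimumStepLength(0.01);
  optimizer->SetNumberOfIterations(200);

  registration->SetMetric(metric);
  registration->SetOptimizer(optimizer);
  registration->SetInterpolator(interpolator);
  registration->SetTransform(transform);
  registration->SetFixedImage(fixedReader->GetOutput());
  registration->SetMovingImage(movingReader->GetOutput());
  registration->SetFixedImageRegion(fixedReader->GetOutput()->GetBufferedRegion());

  TransformType::ParametersType initial(transform->GetNumberOfParameters());
  initial.Fill(0.0);                       // start from the identity translation
  registration->SetInitialTransformParameters(initial);

  registration->Update();                  // run the optimization
  std::cout << "Final parameters: "
            << registration->GetLastTransformParameters() << std::endl;
  return 0;
}
```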
(c) Shape Analysis: Again, a number of methodologies have been developed and implemented with varying levels of maturity. Shape methodologies based on spherical harmonics are quite mature, are available in pipelines developed by NAMIC researchers, and have been distributed to the general community. A newer spherical wavelet-based shape analysis package has been put into ITK, which also drives a novel shape-based segmentation procedure. More globally based spherical harmonic ideas have been combined with the multi-resolution spherical wavelet approach into a statistical shape-based package for schizophrenia. This general technique may be used for other purposes as well, and is presently being ported to work being done on the prostate. Work has also been accomplished on particle-based approaches to this important problem area, with the code put into ITK. Often we work with an initial Matlab/C++ version of our codes, then move to ITK, and finally to Slicer. However, even at the Matlab/C++ stage, algorithms have been distributed and used in a clinical setting (for example, rule-based brain segmentation approaches).
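A recurring first step across these shape representations is converting a binary segmentation of a structure into a surface mesh. The ITK sketch below illustrates that preprocessing step only; it is not the SPHARM, wavelet, or particle packages themselves, and the file name and label value are hypothetical.

```cpp
// Convert a binary label map (e.g. a hippocampus segmentation) into a
// triangle mesh, the typical input to shape-analysis pipelines.
#include <iostream>
#include "itkImage.h"
#include "itkMesh.h"
#include "itkImageFileReader.h"
#include "itkBinaryMask3DMeshSource.h"

int main(int argc, char* argv[])
{
  typedef itk::Image<unsigned char, 3> LabelImageType;
  typedef itk::Mesh<double, 3>         MeshType;
  typedef itk::BinaryMask3DMeshSource<LabelImageType, MeshType> MeshSourceType;

  itk::ImageFileReader<LabelImageType>::Pointer reader =
    itk::ImageFileReader<LabelImageType>::New();
  reader->SetFileName(argv[1]);   // hypothetical label map file

  MeshSourceType::Pointer meshSource = MeshSourceType::New();
  meshSource->SetInput(reader->GetOutput());
  meshSource->SetObjectValue(1);  // label value of the structure of interest
  meshSource->Update();

  std::cout << "Surface has " << meshSource->GetOutput()->GetNumberOfPoints()
            << " points and " << meshSource->GetOutput()->GetNumberOfCells()
            << " cells." << std::endl;
  return 0;
}
```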
(d) Diffusion Weighted Image (DWI) Analysis: A number of tools relevant to diffusion tensor estimation, fiber tractography, and geometric and statistical analysis of fiber bundles have been contributed to the NAMIC toolkit. Some of these tools have already been integrated into a diffusion-dedicated ITK package with a GUI as part of the NAMIC toolkit (e.g., FiberViewer, UNC) and into the Slicer platform (BWH tractography, clustering). The impact of NAMIC DWI analysis activities is best characterized by the most recent journal articles and journal articles in print. The application of the NAMIC FiberViewer tool (UNC) in large clinical studies at UNC and Duke is in print (Taylor et al., Gilmore et al., Cascio et al.). Clinical application of the Slicer DTI package by BWH/MIT is reported in O'Donnell et al., Kuroki et al., and Nakamura et al. The research by MGH is found in two journal publications by Tuch et al. The description of the methodologies has also appeared or will appear in peer-reviewed journals (Corouge et al., Fletcher et al., Corouge et al.). New methods in development (Finsler metric, GT; volumetric PDE-based path analysis, Utah; stochastic tractography, BWH; path of interest, MGH) are currently being tested on Core-3 DBP data, with the goal of recommending which type of solution is appropriate for specific clinical analysis questions.
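As a concrete illustration of the scalar-map stage of such DWI pipelines, the sketch below computes a fractional anisotropy (FA) volume from a previously estimated tensor field using ITK. It is not the FiberViewer or Slicer DTI implementation; the file names are hypothetical, and it assumes the tensor volume is stored in a format whose ITK reader supports tensor pixels (e.g., NRRD).

```cpp
// Compute an FA map from a diffusion tensor volume (illustrative only).
#include "itkImage.h"
#include "itkDiffusionTensor3D.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkTensorFractionalAnisotropyImageFilter.h"

int main(int argc, char* argv[])
{
  typedef itk::DiffusionTensor3D<float>  TensorPixelType;
  typedef itk::Image<TensorPixelType, 3> TensorImageType;
  typedef itk::Image<float, 3>           FAImageType;

  // Tensor volume, e.g. produced by a reconstruction step like the one
  // sketched in the Highlights section above.
  itk::ImageFileReader<TensorImageType>::Pointer reader =
    itk::ImageFileReader<TensorImageType>::New();
  reader->SetFileName(argv[1]);

  typedef itk::TensorFractionalAnisotropyImageFilter<TensorImageType, FAImageType> FAFilterType;
  FAFilterType::Pointer faFilter = FAFilterType::New();
  faFilter->SetInput(reader->GetOutput());

  itk::ImageFileWriter<FAImageType>::Pointer writer =
    itk::ImageFileWriter<FAImageType>::New();
  writer->SetInput(faFilter->GetOutput());
  writer->SetFileName(argv[2]);
  writer->Update();
  return 0;
}
```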
(e) Visualization: Core 2 researchers now involved with NAMIC (e.g., the founders of Kitware) were at the forefront of developing VTK (and of course ITK). Thus here we are considering technologies that are at a commercial level of development and used at thousands of sites. Algorithms developed at NAMIC have driven new directions for these packages. Newer visualization methods, for example the conformal flattening procedure, have been ported to ITK filters and are in the NAMIC Sandbox. Quasi-isometric methods for brain flattening from the MGH FreeSurfer package have become part of the NAMIC enterprise as well. These flattening procedures are very easy to use, and may also be employed for registration. Code for controlling areal distortion during flattening has been incorporated, giving area preservation with minimal distortion. The techniques may also be used for several other purposes, including automatic fly-throughs in endoscopy (incorporated into Slicer) and texture mapping for general visualization purposes.
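The sketch below shows the kind of VTK rendering building blocks that underlie these visualization capabilities: extracting an iso-surface from a segmented volume with marching cubes and displaying it interactively. It is an illustrative, generic example (the input file name and iso-value are hypothetical), not a specific NAMIC or Slicer module.

```cpp
// Minimal VTK surface-rendering sketch: label map -> iso-surface -> display.
#include "vtkSmartPointer.h"
#include "vtkStructuredPointsReader.h"
#include "vtkMarchingCubes.h"
#include "vtkPolyDataMapper.h"
#include "vtkActor.h"
#include "vtkRenderer.h"
#include "vtkRenderWindow.h"
#include "vtkRenderWindowInteractor.h"

int main(int argc, char* argv[])
{
  // Read a label map stored as a legacy .vtk structured-points file
  // (hypothetical input; other VTK or ITK readers plug in the same way).
  vtkSmartPointer<vtkStructuredPointsReader> reader =
    vtkSmartPointer<vtkStructuredPointsReader>::New();
  reader->SetFileName(argv[1]);

  // Extract the surface of the labeled structure with marching cubes.
  vtkSmartPointer<vtkMarchingCubes> surface = vtkSmartPointer<vtkMarchingCubes>::New();
  surface->SetInputConnection(reader->GetOutputPort());
  surface->SetValue(0, 0.5);   // iso-value between background (0) and label (1)

  vtkSmartPointer<vtkPolyDataMapper> mapper = vtkSmartPointer<vtkPolyDataMapper>::New();
  mapper->SetInputConnection(surface->GetOutputPort());
  mapper->ScalarVisibilityOff();

  vtkSmartPointer<vtkActor> actor = vtkSmartPointer<vtkActor>::New();
  actor->SetMapper(mapper);

  vtkSmartPointer<vtkRenderer> renderer = vtkSmartPointer<vtkRenderer>::New();
  renderer->AddActor(actor);

  vtkSmartPointer<vtkRenderWindow> window = vtkSmartPointer<vtkRenderWindow>::New();
  window->AddRenderer(renderer);

  vtkSmartPointer<vtkRenderWindowInteractor> interactor =
    vtkSmartPointer<vtkRenderWindowInteractor>::New();
  interactor->SetRenderWindow(window);

  window->Render();
  interactor->Start();
  return 0;
}
```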
Logistics
When completed, the information should be sent to:
Gwen Jacobs, PhD
Professor of Neuroscience
Asst. CIO and Director of Academic Computing
1 Lewis Hall
Montana State University
Bozeman, MT 59717
406-994-7334 - phone
406-994-7077 - FAX
gwen@cns.montana.edu