Dissemination:Jan13

  1. We followed approximately the following agenda:
    1. Thursday:
    2. 9-10am: Demo and discussion of the DTI fiber generation and viewing tool with graduate student Clement Vachet.
    3. 10-11am: Demo and discussion of DTI fiber analysis with postdoctoral fellow Isabelle Corouge, who will be attending the SLC AHM.
    4. 11am-noon: Continued the discussion with Guido about DTI fiber generation and how these tools can be used within NAMIC.
    5. 1:30-4:30pm: Discussion of ITK activity at UNC with Stephen Aylward and Julien Jomier, neither of whom is directly involved with NAMIC but both of whom could make a significant impact on the engineering side of the project. Julien has agreed to attend the SLC AHM and participate in the engineering infrastructure discussions.
    6. 4:30-5pm: Demo and discussion of neonate image analysis with graduate student Sampat Vetsa.
    7. 5-5:30pm: Wrap-up for the day with Guido and Martin.
    8. Friday:
    9. 10-10:45am: Demo and discussion of shape analysis of the corpus callosum in autism with graduate student Ipek Oguz. The plan is to adapt these types of shape analysis techniques to 3D brain structures for schizophrenia data for NAMIC in year 1.
    10. 11am-noon: Demo and discussion with Guido of the dataflow management tools used for processing large volumes of clinical data. Also a demo of the visualization/segmentation tool Snap.
    11. 1-1:30pm: Demo and discussion of comparison and validation of different representations for shape analysis with graduate student Christine Xu.
    12. 1:30-2:30pm: Wrap-up of the visit with Guido and Martin.
  2. Most of the UNC software tools are written using ITK (with Qt or Tcl/Tk for the user interface), and we discussed at length what it would take to make the underlying modules easily available from within Slicer. There was agreement that a focus of NAMIC software development would be ensuring compatibility with Slicer, both in its current version and as the Slicer architecture is modified over the course of NAMIC. The question came up again of what the best mechanism is for integrating tools back into Slicer. Should we have a centralized Slicer expert who takes tools written to a “Slicer compatible” spec and integrates them back, or do we need an engineer at each site doing this work? A worthy topic of discussion at the AHM.
  3. UNC has a mature set of software tools, in particular for DTI image visualization and analysis. Stephen Aylward’s group, one of the key players in the development of ITK, has developed a library, SpatialObjects (part of itkUNC at this point, not the main distribution), that could be useful for the representation and visualization of DTI data in Slicer and other applications (see the SpatialObjects sketch at the end of this page). During the visit, Julien took the initiative to download Slicer and start looking at what it would take to use SpatialObjects from within it.
  4. The need for a DTI working group was reinforced at the meeting. There is clearly a good understanding among the various NAMIC groups of what is needed for DTI infrastructure in the “NAMIC toolkit”, and the sooner the group gets together to define the specs for this engineering infrastructure, the more likely everyone is to actually use it. Martin took a shot at this discussion on the Wiki, and we will continue in that spirit over the next few weeks leading up to the SLC AHM.
  5. There is an impressive process in place for segmenting neonatal MRI data. It takes about 20 minutes per scan, applies nearly every known registration and segmentation algorithm along the way, and produces demonstrably good segmentations. The data in this example is not related to NAMIC, but the dataflow processing is a good example of what NAMIC will need as we start processing large volumes of data for our DBPs. It would be worthwhile to use this dataflow application as an example as we refine the specifications for the NAMIC LONI pipeline.
  6. DTI Protocol Question: The DTI data currently being processed by the UNC group comes from a clinical protocol with roughly a 12-minute acquisition time for a sequence of 7 images. A question came up about how this protocol compares to the Q-ball protocol being used at MGH, which perhaps Dave Tuch could address.
  7. UNC’s approach, much like that of Utah (and perhaps that of the larger ITK community), has been to build individual applications for each task based on ITK classes, and then to use their AVS-like dataflow tool to tie together the inputs and outputs of the various modules (a sketch of such a stand-alone application appears at the end of this page). This is a different approach from that of Slicer, which has a “plug-and-play” architecture in which different applications are integrated as modules into the basic Slicer visualization application. Discussing these two approaches also brought to the foreground an interesting point about the NAMIC toolkit: currently it contains two applications, Slicer and the LONI pipeline. Presumably, Slicer will not need a pipeline tool, since its model is for individual applications to be integrated into it as independent modules, while stand-alone apps like the ones UNC and Utah are used to writing need a dataflow management tool to string together various one-function applications. This might be a good topic for discussion at the SLC AHM.
  8. The UNC group is enthusiastic about a week-long summer retreat for programmers.
  9. We also talked about continuing smaller working meetings at each site on a rotating basis after the February SLC AHM. For example, a smaller shape analysis meeting could be held at GATech or UNC in late spring (after MICCAI submissions), where Core 1 researchers could share their latest algorithmic developments.
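
SpatialObjects sketch (referenced in item 3). This is a minimal, hedged illustration of the SpatialObject idea rather than of the itkUNC library itself: the class and method names follow a recent ITK release (where SpatialObjects later became part of the main distribution), so they may differ from the version demonstrated during the visit. It builds a spherical ellipse object, rasterizes it into a small binary volume, and writes the result; the image size, radius, and output filename are arbitrary choices for illustration.

<pre>
// Minimal SpatialObject example: define an ellipse, rasterize it into an
// image, and write it out. Names follow a modern ITK release; the itkUNC
// SpatialObjects library described above may use older method names
// (e.g. SetRadius() instead of SetRadiusInObjectSpace()).
#include "itkImage.h"
#include "itkEllipseSpatialObject.h"
#include "itkSpatialObjectToImageFilter.h"
#include "itkImageFileWriter.h"

int main()
{
  constexpr unsigned int Dimension = 3;
  using ImageType = itk::Image<unsigned char, Dimension>;
  using EllipseType = itk::EllipseSpatialObject<Dimension>;
  using RasterizerType = itk::SpatialObjectToImageFilter<EllipseType, ImageType>;

  // A sphere of radius 10 (in object space), centered at the origin.
  auto ellipse = EllipseType::New();
  ellipse->SetRadiusInObjectSpace(10.0);
  ellipse->Update();

  // Rasterize into a 50^3 binary volume; shift the image origin so the
  // sphere ends up in the middle of the image.
  auto rasterizer = RasterizerType::New();
  rasterizer->SetInput(ellipse);
  ImageType::SizeType size;
  size.Fill(50);
  rasterizer->SetSize(size);
  ImageType::PointType origin;
  origin.Fill(-25.0);
  rasterizer->SetOrigin(origin);
  rasterizer->SetInsideValue(255);
  rasterizer->SetOutsideValue(0);

  auto writer = itk::ImageFileWriter<ImageType>::New();
  writer->SetInput(rasterizer->GetOutput());
  writer->SetFileName("ellipse.mha"); // arbitrary output filename
  writer->Update();

  return 0;
}
</pre>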
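Stand-alone application sketch (referenced in item 7). A hedged example of the “one task, one application” style described there, written against a recent ITK release: a single ITK filter wrapped as a command-line program whose file-based input and output can be strung together by a dataflow manager or invoked from a larger application. The choice of filter (median smoothing), the pixel type, and the fixed neighborhood radius are illustrative assumptions, not details of any UNC tool.

<pre>
// One-function, stand-alone ITK application: read an image, apply a single
// filter, write the result. A dataflow tool can treat this program as one
// node, feeding the output file of one node to the input of the next.
#include <cstdlib>
#include <iostream>

#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkMedianImageFilter.h"

int main(int argc, char * argv[])
{
  if (argc < 3)
  {
    std::cerr << "Usage: " << argv[0] << " inputImage outputImage" << std::endl;
    return EXIT_FAILURE;
  }

  using ImageType = itk::Image<short, 3>;

  auto reader = itk::ImageFileReader<ImageType>::New();
  reader->SetFileName(argv[1]);

  // The actual processing step; any ITK pipeline could sit here.
  auto median = itk::MedianImageFilter<ImageType, ImageType>::New();
  median->SetInput(reader->GetOutput());
  ImageType::SizeType radius;
  radius.Fill(1); // 3x3x3 neighborhood
  median->SetRadius(radius);

  auto writer = itk::ImageFileWriter<ImageType>::New();
  writer->SetInput(median->GetOutput());
  writer->SetFileName(argv[2]);

  try
  {
    writer->Update(); // pulls the whole read-filter-write pipeline
  }
  catch (const itk::ExceptionObject & err)
  {
    std::cerr << "Error: " << err << std::endl;
    return EXIT_FAILURE;
  }
  return EXIT_SUCCESS;
}
</pre>

In this style, chaining several such programs is just a matter of wiring filenames between them, which is the role the AVS-like dataflow tool (or the LONI pipeline) plays.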