CTSC Ellen Grant, CHB

From NAMIC Wiki

Revision as of 13:44, 27 July 2009


Back to CTSC Imaging Informatics Initiative


Mission

Use-Case Goals

The use-case can be broken into three distinct steps: Data Management, Query Formulation, and Data Processing.

  • Step 1: Data Management
    • Describe and upload retrospective datasets (roughly 1 terabyte) onto the CHB XNAT instance and confirm appropriate organization and naming scheme via web GUI.
  • Step 2: Query Formulation
    • Making specific queries using XNAT web services
    • Downloading data that conforms to the specific naming convention and directory structure, using XNAT web services
    • Generating and uploading specific processing results using XNAT web services
  • Step 3: Data Processing
    • Implement & execute the script-driven tractography workflow using web services
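Steps 2 and 3 above rely on scripted calls to XNAT web services. A minimal sketch of query formulation in Python follows; the host name and project ID are placeholders (not the actual CHB instance), and the `protocol` filter parameter is an assumption about how this deployment exposes protocol names:

```python
import json

XNAT_BASE = "https://xnat.example.org"  # placeholder, not the actual CHB instance


def experiments_query_url(project, protocol=None):
    """Build an XNAT REST URL listing MR sessions in a project.

    xsiType and format are standard XNAT REST parameters; the `protocol`
    filter is an assumed column name on this particular deployment.
    """
    url = (f"{XNAT_BASE}/data/projects/{project}/experiments"
           f"?xsiType=xnat:mrSessionData&format=json")
    if protocol:
        url += f"&protocol={protocol}"
    return url


def mrids_from_response(body):
    """Pull session labels (MRIDs) out of an XNAT JSON result set."""
    rows = json.loads(body)["ResultSet"]["Result"]
    return [row["label"] for row in rows]
```

A driver script would GET the URL (with authentication) and feed the response body to `mrids_from_response` to obtain the MRID list that downstream download steps consume.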

Outcome Metrics

Step 1: Data Management

  • Visual confirmation (via web GUI) that all data is present, organized and named appropriately
  • other?

Step 2: Query Formulation

  • Successful tests showing that responses to XNAT queries for all MRIDs, given a protocol name, match the results returned by the currently-used search on the local filesystem.
  • Query/Response should be efficient
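The first metric above reduces to a set comparison between the two result lists; a sketch (the function name is illustrative):

```python
def queries_agree(xnat_mrids, local_mrids):
    """Acceptance check for Step 2: the MRIDs returned by the XNAT query
    must match, as a set, the MRIDs found by the existing local search.
    Order is ignored since the two searches need not sort alike."""
    return set(xnat_mrids) == set(local_mrids)
```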

Step 3: Data Processing

  • Pipeline executes correctly
  • Pipeline execution not substantially longer than when all data is housed locally
  • other?

Overall

  • Local disk space saved?
  • Data management more efficient?
  • Data management errors reduced?
  • Barriers to sharing data lowered?
  • Processing time reduced?
  • User experience improved?

Fundamental Requirements

  • System must be accessible 24/7
  • System must be redundant (no data loss)

Participants

  • sites involved: MGH NMR center, MGH Radiology, CHB Radiology
  • number of users: ~10
  • PI: Ellen Grant
  • staff: Rudolph Pienaar
  • clinicians
  • IT staff

Data

Retrospective data consists of ~1787 studies, ~1TB total. Data consists of

  • MR data, DICOM format
  • Subsequent processing generates ".trk" files
  • ASCII text files (".txt")
  • files that contain protocol information

Workflows

Current Data Management Process

(schematic to come...)

DICOM raw images are produced at the radiology PACS at MGH and are manually pushed to the PACS hosted on KAOS, which resides at the MGH NMR Center. A set of Perl scripts then renames and re-organizes the images into a file structure in which all images for a study are saved into a directory named for that study.
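The reorganization performed by the Perl scripts amounts to a path mapping like the following Python sketch; it assumes the study name is already known for each incoming file, and the `dry_run` flag keeps the sketch from touching the filesystem:

```python
import shutil
from pathlib import Path


def study_path(root, study_name, dicom_file):
    """Target location for one DICOM file: all images for a study live in
    a single directory named for that study."""
    return Path(root) / study_name / Path(dicom_file).name


def file_into_study(root, study_name, dicom_file, dry_run=True):
    """Copy one incoming DICOM file into its study directory."""
    dest = study_path(root, study_name, dicom_file)
    if not dry_run:
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(dicom_file, dest)
    return dest
```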

DICOM images are currently viewed with Osiris on Macs.

Target Data Management Process (Step 1)

Step 1: (schematic to come) Develop an Image Management System for BDC (IMS4BDC) with which at least the following can be done:

  • Move images from MGH (KAOS) to a BDC machine at Children's
  • Import legacy data into IMS4BDC from existing file structure and CDs
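The legacy import can go through XNAT's REST file endpoint. A sketch of the URL construction (host, project, and all identifiers are placeholders; `?inbody=true` tells XNAT the file travels in the request body):

```python
XNAT_BASE = "https://xnat.example.org"  # placeholder, not the actual CHB instance


def import_url(project, subject, session, scan, filename):
    """REST endpoint for PUTting one legacy DICOM file into XNAT.

    The path hierarchy (projects/subjects/experiments/scans/resources)
    follows XNAT's standard REST layout.
    """
    return (f"{XNAT_BASE}/data/projects/{project}/subjects/{subject}"
            f"/experiments/{session}/scans/{scan}"
            f"/resources/DICOM/files/{filename}?inbody=true")
```

A file can then be pushed with, for example, `curl -u user:pass -X PUT --data-binary @IM0001.dcm <url>`, tried first against a test project before the bulk import.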

Target Query Formulation (Step 2)

  • Develop Query capabilities using scripted client calls to XNAT web services:
  "Show all subjectIDs scanned with protocol_name = ProtocolName"
  "Show all diffusion studies where patient age is < 6"
  • Scripting capabilities: Scripts need to query and download data into appropriate directory structure, and support appropriate naming scheme to be compatible with existing processing workflow.
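The download half of the scripting requirement can be sketched as a mapping from an XNAT file listing onto the directory layout the processing workflow expects. In this sketch the host is a placeholder, and the listing is passed in as parsed JSON so the path logic stands alone from the network call:

```python
from pathlib import Path

XNAT_BASE = "https://xnat.example.org"  # placeholder, not the actual CHB instance


def files_url(project, mrid):
    """XNAT REST URL listing every file resource of one session
    ("ALL" spans all scans in the session)."""
    return (f"{XNAT_BASE}/data/projects/{project}/experiments/{mrid}"
            f"/scans/ALL/files?format=json")


def plan_download(listing, root, mrid):
    """Map an XNAT /files listing to (remote URL, local path) pairs.

    Local paths are laid out as <root>/<mrid>/<file name> so the existing
    processing scripts find the data where they expect it; adjust if the
    workflow's naming scheme nests deeper.
    """
    plan = []
    for row in listing["ResultSet"]["Result"]:
        plan.append((XNAT_BASE + row["URI"], Path(root) / mrid / row["Name"]))
    return plan
```

A driver would GET `files_url(...)`, feed the parsed body to `plan_download`, then fetch each remote URL into its local path before launching the tractography pipeline.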

Target Processing Workflow (Step 3)

Step 3: (schematic to come)

Other Information

Rudolph has worked with the XNAT support group at Harvard.

  • 7/25/09 - Rudolph and Wendy are beginning experiments to upload representative data and metadata to CHB's XNAT instance.