Project Week 25: Next Generation GPU Volume Rendering
Key Investigators
- Simon Drouin (Montreal Neurological Institute, Canada)
- Steve Pieper (Isomics Inc., USA)
- Andras Lasso (Queen's University, Canada)
- Ole Vegard Solberg (SINTEF, Norway)
- Alvaro Sanchez (Kitware, USA) - remote attendance
- Sankhesh Jhaveri (Kitware, USA) - remote attendance
Project Description

Objective
- Develop a specification for the next generation of GPU volume processing and rendering in VTK and Slicer. The specification should support:

Approach and Plan
- Compare the architectures of different existing projects where parts of the required functionality have been implemented.
- Determine a sensible way to integrate all of those contributions into VTK.

Progress and Next Steps
- TODO
Illustrations
3D Image Filters in WebGL2
Multivolume rendering and nonlinear transforms in WebGL2
Background and References
Some notes about sharing GLSL code between desktop OpenGL and WebGL
Recent features in VTK Volume Rendering (in master)
- Volume peeling - translucent geometry with volumes in GPU ray cast mapper
- Render to texture (see the sketch after this list)
- 2D lookup tables (value and gradient magnitude)
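"Render to texture" in the list above means the GPU ray cast mapper can deliver its color and depth buffers as images instead of drawing straight to the framebuffer. A minimal sketch in VTK Python, assuming the render-to-image entry points on vtkGPUVolumeRayCastMapper (SetRenderToImage, GetColorImage, GetDepthImage); the wavelet source and transfer-function values are placeholders, not part of the original page:

import vtk

# Synthetic volume so the sketch is self-contained (stand-in for real data).
source = vtk.vtkRTAnalyticSource()

mapper = vtk.vtkGPUVolumeRayCastMapper()
mapper.SetInputConnection(source.GetOutputPort())
mapper.SetRenderToImage(1)  # render into color/depth images instead of the framebuffer

color = vtk.vtkColorTransferFunction()
color.AddRGBPoint(0.0, 0.0, 0.0, 0.0)
color.AddRGBPoint(300.0, 1.0, 1.0, 1.0)

opacity = vtk.vtkPiecewiseFunction()
opacity.AddPoint(0.0, 0.0)
opacity.AddPoint(300.0, 0.2)

prop = vtk.vtkVolumeProperty()
prop.SetColor(color)
prop.SetScalarOpacity(opacity)
prop.ShadeOn()
prop.SetInterpolationTypeToLinear()

volume = vtk.vtkVolume()
volume.SetMapper(mapper)
volume.SetProperty(prop)

renderer = vtk.vtkRenderer()
renderer.AddVolume(volume)
window = vtk.vtkRenderWindow()
window.SetOffScreenRendering(1)  # no interactive window needed for this sketch
window.AddRenderer(renderer)
window.Render()

# After rendering, the mapper exposes its buffers as vtkImageData.
color_image = vtk.vtkImageData()
depth_image = vtk.vtkImageData()
mapper.GetColorImage(color_image)
mapper.GetDepthImage(depth_image)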
Work in progress
- Overlapping volumes - multiple inputs to the mapper
- VTK charts to work with 2D transfer functions (see the sketch after this list)
- Slicer to migrate to the latest VTK version once the CMake hierarchy is sorted out
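The 2D lookup tables listed under recent features index color and opacity by scalar value and gradient magnitude. A sketch in VTK Python of how such a table could be built and attached to a volume property; the entry-point names used here (SetTransferFunctionModeTo2D, SetTransferFunction2D) and the color/opacity ramps are assumptions about that new API, not a confirmed interface:

import vtk

# Build a small value x gradient-magnitude table as a 4-component float image.
size = 64
table = vtk.vtkImageData()
table.SetDimensions(size, size, 1)
table.AllocateScalars(vtk.VTK_FLOAT, 4)
for j in range(size):        # gradient-magnitude axis
    for i in range(size):    # scalar-value axis
        # Placeholder mapping: color ramp over value, opacity ramp over gradient magnitude.
        table.SetScalarComponentFromFloat(i, j, 0, 0, i / (size - 1.0))        # R
        table.SetScalarComponentFromFloat(i, j, 0, 1, 0.5)                     # G
        table.SetScalarComponentFromFloat(i, j, 0, 2, 1.0 - i / (size - 1.0))  # B
        table.SetScalarComponentFromFloat(i, j, 0, 3, j / (size - 1.0))        # A

prop = vtk.vtkVolumeProperty()
prop.SetTransferFunctionModeTo2D()  # assumed name of the 1D/2D mode switch
prop.SetTransferFunction2D(table)   # assumed name of the 2D table setter

The charts item above is about editing a table like this interactively; once edited, it would reach the mapper through the same volume property used in the render-to-texture sketch.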
Questions:
- Ray casting vs. view-aligned plane-based algorithms
  - depth of focus, shadows, diffuse lighting...
- How to integrate multiple features
- Nonlinear transformation (see the resampling sketch after this list)
- Custom shaders
  - dynamic shader generation in Python
- Multiple components (see the independent-components sketch after this list)
  - RGBA
  - independent components?
- 2D lookup tables
- Volumes that live on the GPU
  - sharing across contexts
  - use as input textures and render targets
  - tiling?
  - streaming?
- Large volumes?
- Mesa backend?
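The prop transform seen by the GPU ray cast mapper is linear (vtkProp3D::SetUserTransform takes a vtkLinearTransform), which is presumably why nonlinear transformation appears as a question. As a CPU-side workaround sketch, the volume can be resampled through a nonlinear transform before it reaches the mapper; the thin-plate-spline landmarks and the wavelet source below are placeholders:

import vtk

# Nonlinear warp defined by a few landmark pairs (placeholder values).
source_pts = vtk.vtkPoints()
target_pts = vtk.vtkPoints()
for s, t in [((0, 0, 0), (0, 0, 0)),
             ((10, 0, 0), (12, 1, 0)),
             ((0, 10, 0), (0, 11, 2)),
             ((0, 0, 10), (1, 0, 9))]:
    source_pts.InsertNextPoint(*s)
    target_pts.InsertNextPoint(*t)

warp = vtk.vtkThinPlateSplineTransform()
warp.SetSourceLandmarks(source_pts)
warp.SetTargetLandmarks(target_pts)
warp.SetBasisToR()  # radial basis appropriate for 3D

# vtkImageReslice maps output voxels back to input voxels, so pass the inverse
# transform to warp the volume forward, then render the resampled result.
volume_source = vtk.vtkRTAnalyticSource()  # stand-in for the real volume
reslice = vtk.vtkImageReslice()
reslice.SetInputConnection(volume_source.GetOutputPort())
reslice.SetResliceTransform(warp.GetInverse())
reslice.SetInterpolationModeToLinear()
reslice.AutoCropOutputOn()

mapper = vtk.vtkGPUVolumeRayCastMapper()
mapper.SetInputConnection(reslice.GetOutputPort())

Applying the warp inside the ray casting shader, as in the WebGL2 illustration above, would avoid this extra resampling pass; that is the capability the specification would need to describe.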
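The multiple-components question already has partial coverage in the existing API: with independent components enabled, each component of a multi-component volume gets its own color, opacity, and weight, while IndependentComponentsOff covers the direct RGBA case in the list. A sketch of the independent-components path using only vtkVolumeProperty's per-component setters; the synthetic sources and transfer-function values are placeholders:

import vtk

# Two synthetic scalars merged into one two-component volume (placeholders).
src1 = vtk.vtkRTAnalyticSource()
src2 = vtk.vtkRTAnalyticSource()
src2.SetCenter(5.0, 5.0, 5.0)  # offset so the two components differ

append = vtk.vtkImageAppendComponents()
append.AddInputConnection(src1.GetOutputPort())
append.AddInputConnection(src2.GetOutputPort())

mapper = vtk.vtkGPUVolumeRayCastMapper()
mapper.SetInputConnection(append.GetOutputPort())

prop = vtk.vtkVolumeProperty()
prop.IndependentComponentsOn()  # treat each component as its own scalar field

for comp, (r, g, b) in enumerate([(1.0, 0.2, 0.2), (0.2, 0.4, 1.0)]):
    color = vtk.vtkColorTransferFunction()
    color.AddRGBPoint(0.0, 0.0, 0.0, 0.0)
    color.AddRGBPoint(300.0, r, g, b)
    opacity = vtk.vtkPiecewiseFunction()
    opacity.AddPoint(0.0, 0.0)
    opacity.AddPoint(300.0, 0.3)
    prop.SetColor(comp, color)
    prop.SetScalarOpacity(comp, opacity)
    prop.SetComponentWeight(comp, 0.5)  # equal blend of the two components

volume = vtk.vtkVolume()
volume.SetMapper(mapper)
volume.SetProperty(prop)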