Project Week 25/Improving Depth Perception in Interventional Augmented Reality Visualization/Sonification



Back to Projects List

Key Investigators

Project Description

Objective

  • Discuss the state of the art of "visual" AR for interventional radiology/surgery
  • Find and prototype appropriate auditory display methods for efficient depth perception feedback (a sketch of one possible distance-to-pitch mapping follows the Progress and Next Steps list below)
  • Discover the optimal mix between auditory and visual feedback methods for depth perception
Approach and Plan

  • Discussion about possibilities to support depth perception
  • Implementation of new depth cues
Progress and Next Steps

  • Enabled kinetic depth cues for the Needle Navigation Software: direct control of scene camera movement via head position tracking (see the camera-coupling sketch after this list)
  • The sound connection could not be implemented during this week (parts of our own code base had to be reworked first)
  • Started to implement head tracking; different hardware is needed for this
  • Next steps will focus on head tracking and its fusion with sonification
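
The kinetic depth cue above couples the tracked head position to the virtual camera so that head movements produce motion parallax in the rendered scene. Below is a minimal sketch of that coupling, assuming a tracker that reports head positions in millimetres and a renderer camera defined by a position and a focal point; the class and parameter names are illustrative, not the actual Needle Navigation Software API.

 import numpy as np

 class HeadCoupledCamera:
     """Kinetic depth cue: map head motion onto virtual camera motion.

     The offset of the tracked head from a calibration pose is applied to
     the camera position while the focal point stays on the target, so the
     rendered scene shows motion parallax when the operator moves.
     """

     def __init__(self, camera_home_mm, focal_point_mm, head_home_mm, gain=1.0):
         self.camera_home = np.asarray(camera_home_mm, dtype=float)
         self.focal_point = np.asarray(focal_point_mm, dtype=float)
         self.head_home = np.asarray(head_home_mm, dtype=float)
         self.gain = gain  # how strongly head motion moves the camera

     def camera_pose(self, head_pos_mm):
         """Return (camera_position, focal_point) for the current head position."""
         offset = np.asarray(head_pos_mm, dtype=float) - self.head_home
         return self.camera_home + self.gain * offset, self.focal_point

In each tracking frame the returned pose would be written to the renderer's camera; the gain controls how strong the parallax effect feels relative to the real head motion.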
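
For the auditory display side, one frequently prototyped mapping converts the remaining needle-tip-to-target distance into a sound parameter such as pitch. The sketch below shows such a distance-to-frequency mapping with an exponential profile; the distance range and frequency limits are placeholder values, not the mapping evaluated in this project.

 import numpy as np

 def depth_to_frequency(distance_mm, d_max_mm=50.0, f_min_hz=220.0, f_max_hz=880.0):
     """Map remaining tip-to-target distance to a tone frequency.

     Far from the target -> low pitch, at the target -> high pitch.
     The distance is clamped to [0, d_max_mm] and mapped exponentially,
     which corresponds roughly to how pitch is perceived.
     """
     d = float(np.clip(distance_mm, 0.0, d_max_mm))
     t = 1.0 - d / d_max_mm              # 0 when far away, 1 at the target
     return f_min_hz * (f_max_hz / f_min_hz) ** t

The returned frequency would drive a continuously playing tone (or the repetition rate of clicks), updated whenever the tracked needle pose changes.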

Background and References

  • A Survey of Auditory Display in Image-Guided Interventions
  • Improving spatial perception of vascular models using supporting anchors and illustrative visualization