Oculus Rift interface?


It would be great to be able to visualize 3D datasets (medical scans, confocal immunofluorescence images, etc.) via the Oculus Rift (particularly if the Touch controllers could be used to rotate the image, zoom in and out, etc.). Is anyone aware of any work being done on that?

It is possible I could interest some undergrad engineers in working on this as a design project, but I don’t have the necessary experience to lead such an effort.




@skalarproduktraum and @kephale are working on a new 3D visualization framework for ImageJ, consisting of the scenery renderer and the SciView plugin.


So they might be able to comment further.

The projects mentioned are quite advanced, and I doubt this would be suitable as an undergraduate project even as a "design project" (also because projects of this kind require good maintenance over years to succeed), but the authors will know best where some help is needed.


@ungrin We have Oculus Rift, Vive, and even CAVE support in Scenery and SciView already.

What we do not have is full integration of ImageJ features with these interfaces, and the more hands the better. If you would like to talk in more detail, then @skalarproduktraum and I would be happy to chat about more specifics and what is doable within the scope of an undergrad design project.



Hey @ungrin,

I agree with @kephale and @imagejan (after working on this for quite a bit now :D) that writing a whole new visualiser with VR support might be a little much for an undergrad project. Getting a basic version working is easy, but the devil is really in the details.

That being said, at the moment I have a very capable intern who has integrated new ambient occlusion and shadowing algorithms into scenery, and along these lines I could imagine feasible undergrad design projects for your students, such as:

  • integration of an editor for multi-D transfer functions, which could then be adjusted with the Vive or Oculus controllers
  • better interfacing with ImageJ, as @kephale suggested, e.g. to have all the ImageJ commands (e.g. for acting on volumetric data) available in a “floating” VR menu
  • integration of non-photorealistic rendering, which can be very useful if you want to emphasize outlines, etc. in renderings
  • (your idea here :D)
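For readers unfamiliar with the first bullet: a transfer function maps voxel values to color and opacity, and is what an editor like the one proposed would manipulate. Here is a minimal 1-D sketch in plain Python (illustrative only; the class and its names are made up here, and scenery/SciView's actual transfer-function API will differ):

```python
# Minimal sketch of a 1-D transfer function: piecewise-linear
# interpolation between (intensity, RGBA) control points.
# Hypothetical standalone example -- not scenery's real API.

from bisect import bisect_right

class TransferFunction1D:
    def __init__(self, points):
        # points: list of (intensity in [0,1], (r, g, b, a)) tuples
        self.points = sorted(points)

    def sample(self, x):
        """Return the interpolated RGBA for a normalized intensity x."""
        keys = [p[0] for p in self.points]
        if x <= keys[0]:
            return self.points[0][1]
        if x >= keys[-1]:
            return self.points[-1][1]
        i = bisect_right(keys, x)
        (x0, c0), (x1, c1) = self.points[i - 1], self.points[i]
        t = (x - x0) / (x1 - x0)
        return tuple(a + t * (b - a) for a, b in zip(c0, c1))

# A typical "ramp": transparent black at low intensity, opaque white at high.
tf = TransferFunction1D([(0.0, (0, 0, 0, 0)), (1.0, (1, 1, 1, 1))])
print(tf.sample(0.5))  # (0.5, 0.5, 0.5, 0.5)
```

An editor exposes the control points as draggable handles; the VR-controller integration in the bullet above would let a user grab and move those handles in-headset instead of on a 2D panel.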

The only requirements for the student would be to be comfortable programming and to accept a steep learning curve. Everything graphics-related usually comes with a fair amount of mathematics (mostly linear algebra, though).

I hope this was helpful; we're of course open to more discussion about topics, and so on :+1:




Thanks for the replies. That sounds about right; they are unlikely to be much help with longer-term maintenance. I also get students interested in thesis projects at times, but as my background is tissue engineering, I don't think I could provide the requisite expertise to supervise such a student. The nice thing about the design teams is that they get support from faculty in their own departments, so my role is to publicize an interesting project and point them in a useful direction.

The integration of ImageJ commands might be a good fit: design and implement an IJ menu system usable within a VR environment. I would probably put them in touch with you directly, so that what they come up with fits your views on the overall architecture, if that's OK.


PS: Not sure about topic tracking, so replying to myself to tag @imagejan, @skalarproduktraum, and @kephale.


Please do! Looking forward to hearing from you and your students.



Yes, please do! We also have a Gitter chat at gitter.im/scenerygraphics :slight_smile:

Sidenote: I got a ping from the thread, but didn’t have a chance to reply until now. You always get a notification here if you reply in a thread.