Stereoscopic paradigm development

Hi guys!

I am developing a stereoscopic paradigm with these glasses, used with appropriate hardware and a screen:

http://www.nvidia.com/object/product-geforce-3d-vision2-wireless-glasses-kit-us.html

My question is whether OpenSesame is capable of this. I did some research on which software I should use, and I fell in love with OpenSesame, so I would like to stick with it.

My goal is to use a 3D engine with stereoscopic capabilities to let participants navigate through some scenes. If this turns out to be a problem, I will use very simple images instead.

My questions are:

1) Is OpenSesame capable of this kind of thing?
2) Which library or module do you recommend for it?
3) If it is a bad idea, what tool would you use instead? It is an EEG paradigm, so I need timing that is as accurate as possible, but I know this is experimental and requires a lot of hardware resources.

Thank you kindly for your advice!

Comments

  • Hi Michael,
    It is possible. I have programmed some Continuous Flash Suppression experiments in the past that made use of binocular disparity, and which we ran using an Oculus Rift. You can find an example of one of these experiments attached to this post, but I think it is too simplistic for what you need. Letting a participant navigate through scenes in a 3D world is going to be quite some work, and there are no out-of-the-box tools in OpenSesame that help you set this up easily. OpenSesame comes packaged with PyOpenGL, which you can use to program your own 3D world, but it is going to get quite technical and time-consuming if you want to do so, and there's not much we can do to help.
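
    To give an impression of what the PyOpenGL route involves, below is a minimal sketch of a quad-buffered stereo loop using pygame and PyOpenGL (both come packaged with OpenSesame). Treat it purely as an illustration under assumptions: it needs a GPU and driver that expose a stereo (GL_STEREO) context, which a consumer 3D Vision setup may not provide, and the window size and disparity value are placeholders.

        import pygame
        from OpenGL.GL import *          # glDrawBuffer, glClear, glBegin, ...
        from OpenGL.GLU import gluPerspective

        def draw_square(x_offset):
            # One white square; the horizontal offset between the two eyes'
            # renderings is what creates the binocular disparity.
            glLoadIdentity()
            glTranslatef(x_offset, 0.0, -5.0)
            glBegin(GL_QUADS)
            glColor3f(1.0, 1.0, 1.0)
            glVertex3f(-0.5, -0.5, 0.0)
            glVertex3f( 0.5, -0.5, 0.0)
            glVertex3f( 0.5,  0.5, 0.0)
            glVertex3f(-0.5,  0.5, 0.0)
            glEnd()

        pygame.init()
        # Ask for a stereo (quad-buffered) framebuffer *before* opening the window.
        pygame.display.gl_set_attribute(pygame.GL_STEREO, 1)
        pygame.display.set_mode((800, 600), pygame.OPENGL | pygame.DOUBLEBUF)

        glMatrixMode(GL_PROJECTION)
        glLoadIdentity()
        gluPerspective(45.0, 800.0 / 600.0, 0.1, 100.0)
        glMatrixMode(GL_MODELVIEW)

        disparity = 0.05                 # placeholder; tune for your screen and viewing distance
        clock = pygame.time.Clock()
        running = True
        while running:
            for event in pygame.event.get():
                if event.type == pygame.QUIT:
                    running = False
            glClearColor(0.0, 0.0, 0.0, 1.0)
            glDrawBuffer(GL_BACK_LEFT)   # left-eye image into the left back buffer
            glClear(GL_COLOR_BUFFER_BIT)
            draw_square(-disparity)
            glDrawBuffer(GL_BACK_RIGHT)  # right-eye image into the right back buffer
            glClear(GL_COLOR_BUFFER_BIT)
            draw_square(+disparity)
            pygame.display.flip()        # swaps both eye buffers at once
            clock.tick(60)
        pygame.quit()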

  • @Daniel Thank you a lot. What do you mean by an example attached to this post? I can't seem to find any...

  • Sorry, either something went wrong, or I forgot to attach it. The experiment crashes at a later stage, because it was built with an older version of OpenSesame and is apparently not forward compatible, but it should still give the right impression of how to approach this problem. I do still think this example is too simplistic, since you are talking about building a whole 3D world. You may want to check out http://rifty-business.blogspot.com/2013/09/a-complete-cross-platform-oculus-rift.html or https://developer3.oculus.com/documentation/pcsdk/0.5/concepts/dg-render/, but be warned, it is going to get quite technical!

  • Thank you, Daniel, I'll do my best. If possible, I'll post my solution here.

  • Hi Michael,
    If it's navigation through a 3D environment you're after, you might want to consider the Unity 3D engine. It has built-in support for stereoscopic rendering, and it is relatively simple to create 3D environments with it.
    Having said that, it is not designed as experiment software by any means, and adapting it to your needs will probably require some coding in either C# or JavaScript.
    As for EEG, it can be used in conjunction with Unity, and this has been done before. I've attached a thesis describing an experiment that involves navigation through a 3D environment using an Oculus Rift DK2. The scripts (including a script for an event exchanger) are included in the appendices; a rough Python sketch of sending a trigger to such a device is added below.
    Hope this helps!
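
    Whichever engine you end up with, the EEG side mostly boils down to writing a marker byte to the trigger device at stimulus onset. Purely as an illustration of that idea, and not code from the thesis: the sketch below assumes a serial-based trigger device (for example an event exchanger on a COM port) and the pyserial package; the port name and marker codes are placeholders to adapt to your own hardware.

        import time
        import serial                      # pyserial; assumed to be available

        port = serial.Serial('COM3', baudrate=115200, timeout=1)   # hypothetical port name

        def send_marker(code):
            # Write a single marker byte; the EEG recording software logs it
            # as an event so trials can be time-locked during analysis.
            port.write(bytes([code]))

        send_marker(1)       # e.g. stimulus onset
        time.sleep(0.01)     # give the amplifier time to register the marker
        send_marker(0)       # reset the marker line, if your device requires it
        port.close()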
