OpenSesame to Unity

Hi, I am converting an existing experiment in OpenSesame that I want to run on a set of VR glasses which run on Unity. I know Unity runs on C#, but there are third-party add-ons for using Python with Unity.

My question is twofold:

  1. Is there a way to use OpenSesame through Unity?
  2. Is there a way to get a pure Python script of an OpenSesame experiment?

Thanks,

Jordan.

Comments

  • Hi @Jordan ,

    I'm assuming that you would like OpenSesame to control the flow of the experiment, and then present stimuli in a virtual environment controlled by Unity. Is that correct? That would be very sweet. I have never worked with Unity myself, though, so I don't know how difficult this is.

    Is there a way to use OpenSesame through Unity?

    Let's take a step back first. What would you like to do, and how would you do that programmatically (leaving OpenSesame aside for a moment)? For example, if you want to present stimuli, then how does that work in Unity? Can you use Python to dynamically present some shape in the virtual world? And if so, what would the corresponding Python script look like?

    Is there a way to get a pure Python script of an OpenSesame experiment?

    No, experiments are not compiled to Python scripts.

    — Sebastiaan

  • edited February 2023

    Hey @sebastiaan , sorry for the delay in response.


    Essentially, my lab is looking for ways to use OpenSesame with a VR headset that can track eye movements (which is really important to us, as you can probably understand). Unfortunately, the only one we were able to find is based on C#.

    Now we are talking with them about possibly circumventing the whole move to Unity: the headset's displays would act as a basic PC screen, onto which we would project our experiments from something like (hopefully) OpenSesame.

    If we have to work through the Unity framework, I know there is something called the Out-of-Process API, which can be used to integrate third-party packages (https://docs.unity3d.com/Packages/com.unity.scripting.python@2.0/manual/outOfProcessAPI.html), but admittedly it's hard for me to completely wrap my head around it.
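    To give a concrete flavor of what such a bridge involves: an out-of-process setup typically means a separate Python process exchanging serialized commands with Unity. The sketch below is NOT the actual Unity out-of-process API; it only illustrates, under assumed message names, the kind of length-prefixed JSON framing such inter-process bridges commonly use.

```python
import json
import struct


def frame_command(command, **params):
    """Serialize a command to a length-prefixed JSON message.

    Illustrative only: the command names and framing format are
    assumptions, not the real Unity out-of-process protocol.
    """
    payload = json.dumps({"command": command, "params": params}).encode("utf-8")
    # 4-byte big-endian length prefix, then the JSON payload.
    return struct.pack(">I", len(payload)) + payload


def parse_command(message):
    """Inverse of frame_command: strip the length prefix and decode."""
    (length,) = struct.unpack(">I", message[:4])
    payload = message[4:4 + length]
    return json.loads(payload.decode("utf-8"))
```

    In a real setup, the experiment-control side would send such messages over a socket or pipe, and a small C# listener inside Unity would act on them.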

    Any help or light you can shed on this matter would be extremely helpful.

    Thank you again!

    Jordan.


    (edit: added the last 3 sentences)

  • Hi @Jordan ,

    It sounds like what you want to do is well within the realm of possibility. But it also sounds like it will take technical know-how and an understanding of the various technologies. So I would make sure that there is someone on the team who has this!

    As far as I can tell there are three options, each with their own advantages and disadvantages:

    • Easy but limited. It sounds like OpenSesame can simply present stimuli on the headset as though it's a regular monitor. When doing so, I assume that the left side of the screen goes to the left eye and the right side to the right eye. It will then be up to you to create sets of stimuli that actually work together as 3D stimuli, for example by rendering the left and right viewpoints in software like Blender.
    • Somewhat tricky but more flexible. It would also be possible to have OpenSesame automatically generate such split-half displays by extending the Canvas capabilities in a new backend. This would offer more flexibility but also require a fair bit of programming.
    • Hard but ideal. Ideally, OpenSesame would communicate with Unity (probably indeed through the out-of-process API) and take control of the stimuli. This would be ideal in terms of VR, because it would mean that you can take full advantage of all the functionality that Unity offers.
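    For the first two options, the core task is mapping each stimulus position to a left-eye and a right-eye position on a split-half display. A minimal sketch of that mapping (the coordinate convention, and the way disparity is applied, are assumptions; the actual conventions depend on the headset and backend):

```python
def stereo_positions(x, screen_width, disparity=0):
    """Map a stimulus x-offset (relative to each eye's half-screen
    center) to absolute x-coordinates on a side-by-side display.

    `disparity` shifts the two images apart horizontally, which is
    the basic cue for perceived depth. Illustrative sketch only.
    """
    half = screen_width / 2
    left_x = half / 2 + x - disparity / 2          # center of left half
    right_x = half + half / 2 + x + disparity / 2  # center of right half
    return left_x, right_x
```

    A split-half backend along the lines of the second option would essentially apply such a mapping to every Canvas drawing operation, drawing each element twice.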

    These suggestions are purely based on how I know that these things generally work, because I haven't worked with Unity myself yet. But I nevertheless hope this gives you some idea of the options.

    — Sebastiaan
