
[solved] opengl backend / opensesame for android

edited June 2013 in OpenSesame
First of all, thanks for this nice piece of software, OpenSesame...

I'm really new to OpenSesame and I'd like to be able to use the OpenGL back-end. When using OpenGL mode, I noticed that the first screen was not displayed and appeared blank (black screen). Although placing a small blank screen before my instruction slide solves this issue, I wonder whether something can be done to improve this.

I have a second and more important issue. I also use videos in my experiments, so I use the media_player, which requires the legacy back-end and does not support the OpenGL one. Would it be possible to add OpenGL support to the media player?

Best regards,
Olivier.

P.S.: I have some programming skills, so I can (and would like to) give a hand, but I'm afraid Python is not a language I know.

Comments

  • edited 9:01PM

    Hi Obrousse,

    Thank you for your interest and welcome to the forum!

    Regarding the first display not being shown with the OpenGL back-end. Could you provide some details on the system that you are using? And do you observe the same behavior with all experiments? It's not a big issue, I suppose, but it should be addressed if it happens on a lot of systems.

    Regarding the media_player and the OpenGL back-end. I don't think they will be compatible in the (very) near future, although Daniel (who maintains the media_player) is considering a different implementation that may support other back-ends as well. For now, you could look into using the PsychoPy back-end (also OpenGL-based), in which case you could use the PsychoPy video routines (I haven't tested those myself, though): http://osdoc.cogsci.nl/back-ends/psycho

    Alternatively, you could stick to the legacy back-end. The only downside is slightly less temporal precision (still in the order of milliseconds, though), but for videos I think it should be fine. Or is there a specific reason that you want to use the opengl back-end?

    Finally, regarding your last point, if you have some code contributions (or documentation or whatever) to share, that would be very helpful! You can find some more information here: http://osdoc.cogsci.nl/developer-information/how-to-contribute-code

    Kindest regards,
    Sebastiaan

  • edited 9:01PM

    Thank you Sebastiaan,

    I'll be happy to participate and contribute as soon as I have something significant done.

    Concerning my system: I run the 64-bit version of the Fedora 15 Linux distribution. Display is handled by a Quadro NVS 295, and I have an additional GPU reserved for GPGPU computing (a Tesla C1060) that can't be used for display.

    Some additional information:
    X.Org X Server 1.10.4
    X Protocol Version 11, Revision 0
    Kernel: 2.6.32-131.2.1.el6.x86_64

    Python: 2.7.1

    If you need more do not hesitate to ask me.

  • edited 9:01PM

    Actually, the OpenGL back-end is important for me, as I need high and precise refresh rates for the experiments I do. Small videos are shown at refresh rates up to 160 Hz (on CRT monitors), so I really need the precision of this back-end.

    I'll try the PsychoPy back-end, but the warning you gave about it made me hesitate.

  • edited 9:01PM

    I also have to be honest and tell you why I'm trying OpenSesame. I'm looking for a psychological experiment platform that I can port to Android-powered tablets. I found OpenSesame quite easy to use, and people in my lab may appreciate that, as they may have to use it for experiments on this kind of hardware.

    So if I manage to port the runtime part to Android (maybe possible with your unexpected help ^^), I'll contribute to OpenSesame in this way, if you agree with that.

    Regards,
    Olivier.

  • edited 9:01PM
    Actually, the OpenGL back-end is important for me, as I need high and precise refresh rates for the experiments I do. Small videos are shown at refresh rates up to 160 Hz (on CRT monitors), so I really need the precision of this back-end.

    Ah, I see. Perhaps in that case you could consider not using a real video mechanism at all, but simply pre-loading all frames as different canvas objects and showing them one by one. If the videos are not too long that should be doable (memory-wise) and I think that this will give you more control over the process of video playback. Would this be an option?
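    As an illustration, a rough, back-end-agnostic sketch of this "preload, then present on a fixed schedule" idea might look like the following. The `frame_source` and `present` callables are placeholders: in OpenSesame they would be replaced by code that builds canvas objects and shows them, respectively.

```python
import time

def preload_frames(frame_source, n_frames):
    # Build every frame up front, so playback touches no decoder or disk.
    return [frame_source(i) for i in range(n_frames)]

def play(frames, fps, present, clock=time.monotonic, sleep=time.sleep):
    # Present each preloaded frame on a fixed 1/fps schedule. Sleeping
    # until an absolute target time avoids accumulating drift across frames.
    interval = 1.0 / fps
    start = clock()
    for i, frame in enumerate(frames):
        present(frame)
        remaining = start + (i + 1) * interval - clock()
        if remaining > 0:
            sleep(remaining)
```

    At 160 Hz the frame budget is only 6.25 ms, so the absolute-deadline scheduling (rather than a fixed sleep per frame) matters.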

    I'll try the PsychoPy back-end, but the warning you gave about it made me hesitate.

    What warning do you mean? It's certainly not my intention to scare people away from the PsychoPy back-end!

    So if I manage to port the runtime part to Android (maybe possible with your unexpected help ^^), I'll contribute to OpenSesame in this way, if you agree with that.

    Yes! I was actually thinking about this myself. Although I haven't done any actual work on this, I found the PyGame subset for Android. It might well be possible to create an 'Android' canvas back-end using this. I'll certainly answer any questions you might have.

    Porting the GUI to Android is probably not feasible at this point. The GUI uses Qt4, and as far as I know Qt is not available for Android.

  • edited 9:01PM

    Actually, Qt is being ported; the project is called "Necessitas" and is supported by a library auto-loader (called Ministro) that automatically downloads required libraries. I'm quite sure we can use that project, which at first sight is working well.

    The problem may be porting some of the Python libs that are used by the OpenSesame runtime.

    I'm trying to find out how I can create an egg project containing only the things required for the OpenSesame runtime, because it looks like an easy way to port Python-based libs to Android (the Py4A project on Google Code, which helps port Python libraries, proposes doing it this way).
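    For what it's worth, a minimal setup.py for such a runtime-only egg might look like the sketch below. The package names are assumptions based on this thread (libopensesame being the runtime, libqtopensesame the Qt GUI that gets excluded); the actual opensesamerun layout would need checking.

```python
# setup.py -- minimal sketch for an egg containing only the runtime.
from setuptools import setup, find_packages

setup(
    name="opensesamerun",
    version="0.1",
    # Only the runtime package; the Qt GUI (libqtopensesame) is left out.
    packages=find_packages(include=["libopensesame", "libopensesame.*"]),
)
```

    The egg itself would then be built with `python setup.py bdist_egg`.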

    Regards,
    Olivier.

  • edited 9:01PM
    Actually, Qt is being ported; the project is called "Necessitas" and is supported by a library auto-loader (called Ministro) that automatically downloads required libraries. I'm quite sure we can use that project, which at first sight is working well.

    I didn't know about that. Looks promising!

    The problem may be porting some of the Python libs that are used by the OpenSesame runtime.

    Yes, some libraries will no doubt be missing. But for a basic canvas the PyGame subset for Android may be enough.

    I'm trying to find out how I can create an egg project containing only the things required for the OpenSesame runtime.

    That would be useful in its own right. Please let me know if you manage to do that!

  • edited 9:01PM

    I have a way to integrate pure Python scripts into an APK package. That may be a start for distributing opensesamerun to Android.

    I'm only missing the complete dependency tree of opensesamerun and, of course, the Android back-end, which may take quite a long time to set up (I hope not). Do you know a way to get all dependencies of opensesamerun? (Sorry for this newbie question.)

  • edited March 2013

    I'm guessing that the easiest way is not to use the opensesamerun app per se, but to simply create a native Android GUI (or maybe even just a simple Python script to begin with) that creates and runs an OpenSesame experiment. That way, there are really only very few dependencies.

    Basically, all you need is Python and PyGame. In the code you will see loads of import statements, but almost all of them refer to core modules that are included with Python (sys, os, shlex, etc.). In order to use sound (synth and sampler), you will also need Numpy, but the video (canvas) should be fine without.
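    One non-authoritative way to enumerate those imports yourself is the standard library's modulefinder, which statically traces every module a script pulls in (it misses anything imported dynamically):

```python
from modulefinder import ModuleFinder

def dependency_names(script_path):
    # Statically follow all imports of the script, transitively.
    finder = ModuleFinder()
    finder.run_script(script_path)
    # finder.modules maps module names to Module objects; finder.badmodules
    # would additionally list imports that could not be resolved.
    return sorted(finder.modules)
```

    Running this on the opensesamerun entry script would list both the stdlib modules (sys, os, shlex, ...) and the third-party ones (pygame, numpy) that actually get imported.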

    The legacy back-end is built around PyGame. Perhaps it will run without modification with the PyGame subset for Android. The only thing that worries me is the "subset" part. This must indicate that some stuff is not implemented, but I have no idea whether that will be problematic for OpenSesame.

    It might be useful to know that the GUI is separated from the runtime (although I'm guessing you already knew that). If you don't want the GUI, you can essentially do away with the "libqtopensesame" folder. And running an experiment from script is as simple as:

    from libopensesame.experiment import experiment

    # Load the experiment from file and give it a title
    exp = experiment("Experiment title", "an_experiment.opensesame.tar.gz")
    exp.set_subject(0)         # set the subject number
    exp.fullscreen = True
    exp.logfile = "A logfile"
    exp.run()
    

    Hope this helps!
