Memory error

edited February 2016 in OpenSesame

Hi,

I am currently running an old version of OpenSesame (2.8.4/2.9) using the psychopy backend. The experiment is relatively long (around 3000 trials) and uses simple geometric shapes and colors (circles, lines, no pictures or other preloaded files). However, towards the end of the experiment I am getting memory errors and the experiment crashes.
My question is thus: what options do I have to reduce the memory load?

I currently draw the canvases in the prepare phase and only present them in the run phase, but I redraw them every single time. Would deleting the canvas after presenting it in the run phase perhaps solve the problem? Or, put differently, does the program allocate a new memory location for each canvas I draw on every trial, or does it overwrite the existing one? What other options are there that might significantly reduce the memory load (other than further simplifying the code, avoiding loops, and not storing a lot of variables)?
I thought I'd ask before resorting to trial and error (since the experiment takes that long)...
Any help is appreciated.

Cheers,
Michel

PS: For certain reasons (it does not really matter why), I cannot switch to the current version of OpenSesame either.

Comments

  • edited 12:21AM

    Hi Michel,

    If your canvas has the same name in every trial, it's just a single object and should therefore not increase the memory load. The cause of your memory problem depends on the design of your experiment. What kind of data do you log, and how do you create your stimuli?

    Cheers,

    Josh

  • edited February 2016

    Hm... my experiment is actually very simple.

    I only log integers, strings, a single list of 10 tuples (size 2), and a dictionary with 4-5 lists (none larger than size 10). I cannot imagine that this is the issue.
    I create my stimuli largely using built-in functions (canvas.lines, canvas.circle, etc.). There is no other excessive use of functions.
    The fact that the program is rather simple makes me think it may have something to do with overwriting (instead of deleting or canvas.clear()-ing my canvas) rather than with my stimuli.

    I'd love to switch to pygaze for more stability, but since it is an EEG experiment I'd prefer to stick with the psychopy backend.

  • edited 12:21AM

    Hi Michel,

    With overwriting, do you mean that you keep on adding shapes to the same canvas without clearing previous shapes between trials? If so, then I think that may be the problem. Is it possible to add a simple canvas.clear() at the end of your trial?

    Cheers

  • edited February 2016

    I create the canvas in the prepare phase (self.offline) under a certain reference (exp.canvasA) on Trial A. I then fill that canvas with stimuli (line and circle elements) and present it in the run phase. On the next Trial B I do exactly the same; that is, the canvas from Trial A is not deleted or cleared, and I create a new canvas on Trial B under the same reference (exp.canvasA) that I used on Trial A. So I wondered whether, in this case, the canvas from Trial A occupies a separate memory location from the canvas created on Trial B, and I merely reuse the same variable name (i.e. the reference exp.canvasA) for a different memory location, or whether I actually overwrite the existing one (i.e. reuse the same memory location). I suspect the former, which would explain why memory eventually runs out.

    I could indeed create the canvas (self.offline) once outside the trial loop, clear it at the end of each trial, and fill it again on the next. Or I could create a new canvas on each trial, as I do right now, but delete the existing one after presenting it. I suspect both approaches would solve the problem, if a separate memory location is indeed created for each canvas instead of the existing one being overwritten.
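    Whether rebinding a name frees the old object can be checked in plain Python with weakref, independently of OpenSesame (a minimal sketch; the Canvas class below is a hypothetical stand-in, not the real openexp canvas):

```python
import gc
import weakref

class Canvas:
    """Hypothetical stand-in for a canvas object."""
    pass

canvasA = Canvas()            # the "Trial A" canvas
probe = weakref.ref(canvasA)  # watch the old object without keeping it alive

canvasA = Canvas()            # "Trial B": rebind the same name
gc.collect()                  # sweep any reference cycles as well

# If nothing else references the Trial A canvas, it has been freed:
print(probe() is None)        # True
```

    So in plain Python, rebinding releases the old canvas, unless something else (a reference cycle, or the backend holding its own handle) keeps it alive; lingering references of that kind are exactly what explicit garbage collection cleans up.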

  • Just to update this post:
    I fixed the issue by clearing the canvas and by creating as few canvases/variables as possible, outside the loop. Since then I have not had this issue anymore.
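    The create-once-and-clear pattern can be sketched as follows (a hypothetical stand-in Canvas class, just to illustrate the pattern; in OpenSesame you would use the real canvas from the openexp backend):

```python
class Canvas:
    """Hypothetical stand-in for an OpenSesame canvas."""
    def __init__(self):
        self.elements = []

    def circle(self, x, y, r):
        self.elements.append(('circle', x, y, r))

    def line(self, x1, y1, x2, y2):
        self.elements.append(('line', x1, y1, x2, y2))

    def clear(self):
        self.elements = []   # drop all shapes drawn so far

    def show(self):
        pass                 # the real canvas would flip the display here

cv = Canvas()                # created ONCE, outside the trial loop
for trial in range(3000):
    cv.clear()               # reuse the same object on every trial
    cv.circle(0, 0, 50)
    cv.line(-100, 0, 100, 0)
    cv.show()

print(len(cv.elements))      # 2: only the current trial's shapes remain
```

    Only a single canvas object ever exists this way, so memory use stays flat across trials.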

  • Hi Michel. This is a known problem with the OpenSesame + psychopy combination (ask Berno or Jarik about it). Apparently it can be solved by doing some explicit garbage collection once in a while:

    import gc     # Python's garbage-collection interface
    ... 
    gc.collect()  # force a full collection of unreachable objects
    

    Note that this operation can take up to a few seconds, so it is better not to do it during time-critical phases (but rather before the breaks, etc.). Hope this helps. And even though you said you didn't want to upgrade, I think this problem is fixed from v3.1 on.
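    One way to follow that advice is to switch off automatic collection during the trials and collect explicitly only at scheduled breaks (a sketch; the trial loop and break schedule below are placeholders):

```python
import gc
import time

gc.disable()                      # no automatic collection during trials
try:
    for trial in range(10):       # stand-in for the real trial loop
        pass                      # present stimuli, collect responses, ...
        if trial % 5 == 4:        # e.g. at a scheduled break
            t0 = time.time()
            freed = gc.collect()  # explicit (potentially slow) collection
            print('collected %d objects in %.3f s' % (freed, time.time() - t0))
finally:
    gc.enable()                   # restore normal behaviour afterwards
```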

  • And even though you said you didn't want to upgrade, I think this problem is fixed from v3.1 on.

    That's right, there is now an experimental variable called disable_garbage_collection which, when set to 'yes' (the default), causes OpenSesame to disable the automatic garbage collector during the experiment and to explicitly collect garbage at the end of every run phase of a sequence.

    There's much bigger issues in the world, I know. But I first have to take care of the world I know.
    cogsci.nl/smathot
