[open] OpenSesame on a Raspberry Pi
Dear Dr. Mathot (beste Sebastiaan), and other forum readers,
This is a quick 'proof of principle' post... OpenSesame seems to work fine on the Raspberry Pi running the Pi-specific Debian distro 'Raspbian'. There is a little fighting to be done with dependencies, mostly having to do with the installation of PyQt and QScintilla, but nothing that takes too much effort.
One limitation: though the Pi has a video card that supports hardware acceleration with OpenGL ES, this is not currently supported in the X Windows environment. The RPi forums give me the impression that it is only a matter of time before this becomes available... but in the meantime, graphics in X Windows rely solely on the CPU. So in general the RPi X Windows environment is clunky and slow. There are workarounds that access the GPU directly, bypassing X Windows, and these are used to create media players that will happily render 1080p video with no problems. But these workarounds are unlikely to play nice with OpenSesame or Python. And even when hardware acceleration arrives, it is likely to be OpenGL ES... so the OpenGL-reliant backends available for OpenSesame probably won't work 'out of the box'.
But the legacy backend works fine, and experiments that work with this backend seem to run just fine on the Pi. I don't know whether timing on the Pi will be any worse than is normal with the legacy backend... so far I haven't done any 'real' tests. But when I do 40 minutes of the Theeuwes (1992) additional-singleton paradigm demo in OpenSesame on the Pi, the results look almost exactly the same as when I complete the task in E-Prime on an entry-level Dell PC. This is a laughably bad benchmark, but I find it reassuring.
It's a very, very cheap option for a stimulus machine. Even figuring in the cost of a DVI or HDMI monitor, you can have a functioning stimulus system for about 100 euro. Which I like a lot.
all the best,
clayton
Comments
Hi Clayton,
Good to see you on the forum! This sounds pretty good.
And even that would only be necessary if you need the full GUI on the raspberry pi. The runtime environment doesn't require PyQt4 (although you need to use the command line to start the experiment in that case).
Yes, you'd probably have to create a new OpenGL ES based back-end.
It should be the same, in the sense that the timestamp of display presentation can be off by one screen refresh. Preparing the stimuli on a slow CPU might take a lot longer, of course, but (for a properly built experiment) that only lengthens the intertrial interval.
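The "off by one screen refresh" point above can be sketched with a toy model (my own illustration, assuming a 60 Hz display; the function name and numbers are not part of OpenSesame):

```python
import math

# Toy model of vsync: a flip requested at time t is shown at the next
# refresh boundary, so the reported onset is quantized to the refresh
# period and can differ from the requested time by up to one refresh.

REFRESH_HZ = 60.0
PERIOD_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

def actual_onset_ms(requested_ms, period_ms=PERIOD_MS):
    """First refresh boundary at or after the requested time."""
    return math.ceil(requested_ms / period_ms) * period_ms

# A flip requested 5 ms into a refresh cycle waits for the next boundary,
# so it appears ~11.67 ms later than requested (always < one period):
delay = actual_onset_ms(5.0) - 5.0
```

Stimulus preparation on a slow CPU just pushes the *requested* time later; as long as preparation happens between trials, only the intertrial interval grows.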
Are you thinking of actually using Raspberry Pis in the lab?
Thanks for looking into this! Maybe I should pick one up as well.
Check out SigmundAI.eu for our OpenSesame AI assistant!
Hi Sebastiaan,
Post your address at the lab in Paris and I'll send one to you. Consider it a combination of congratulatory gift for your cum laude degree and bribe to test/develop for this platform.
I am thinking about using the RPi in a small behavioural lab (i.e. three or four desks in an unused room). This would be for use by a couple of students who are finding it tough to schedule enough time in the existing communal labs. I'm working on the IT department here to give me a few smaller DVI monitors left over from their last upgrade cycle... so the total outlay should be just the cost of the computers, at around 25 euro apiece.
I spent a bit more time with the RPi yesterday evening... OpenGL applications will run using software emulation of the GPU (through Mesa). It's horrendously slow; glxgears reports about 13 fps. It could be that an OpenGL backend running through Mesa will still provide better timing than the legacy backend, so long as the stimuli are static.
I also overclocked the CPU. I haven't played with this too much... apparently you can overclock from the factory-set 700 MHz to 1 GHz. I upped it to 900 MHz and haven't yet noticed any problems. There is a noticeable improvement in X Windows.
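For reference, overclocking on Raspbian is done through the firmware settings file `/boot/config.txt` (a minimal sketch; exact safe values depend on your board and firmware revision):

```ini
# /boot/config.txt -- Raspberry Pi firmware settings
arm_freq=900        # CPU clock in MHz (factory default is 700)
```

A reboot is needed for the change to take effect.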
ciao, clayton
This is great stuff! I'm thinking about using Pis for educational purposes (a teeny-tiny PC for every student to program on; it resolves all the troubles one might have with university IT departments, and it provides students with a very cool hands-on experience of dealing with computers). Might I ask where you ordered yours?
Thanks in advance!
Edwin
Hi Edwin,
I ordered through RS Components. They were apparently very slow to deliver last year, but my order arrived in Italy within a week.
I noticed that I made a mistake in my last post... the price per unit is 25 pounds, not euro. Another few pounds for shipping... one unit, delivered, cost me 35 euro. That's for the 512 MB Model B revision 2, which was released late in 2012.
I would have bought one even if I didn't see a research application... it's a really fun toy.
clayton
Cheers!
The educational purpose is a bit of an excuse to get my hands on one anyway. I thought I'd buy one first to see what the possibilities are, and then check whether there is some money somewhere to start a course. Your experience with the legacy backend is therefore very welcome!
Now we're on the subject: do you know if there is a Python package that supports OpenGL ES?
Thanks again,
Edwin
Hi Edwin, nothing has been widely adopted yet, but there are a couple of projects. One is from Riverbank (the PyQt developers), which probably means it will be further developed and supported:
http://pypi.python.org/pypi/pogles
Are you already mulling over the idea of a new OpenSesame backend? If I understand correctly, OpenGL ES is supported on a lot of tablets and phones, so this would be useful outside the strict confines of the RPi. Running an experiment on your phone might be odd, but OpenGL ES support could lead to OpenSesame running on cheap Linux tablets that don't support desktop OpenGL.
ciao, c.
Well, if you have one to spare, I'll gladly accept! Thank you. I'll drop you an email with my "Paris" address ;-)
Yes, that might very well be. The most important thing is that the display timestamps really correspond to the time at which the refresh cycle starts. This is more or less orthogonal to the question of whether the graphics are fast. I have some equipment here which I can use to test this.
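One way to eyeball this from logged data: if flips are truly vsync-locked, the intervals between successive flip timestamps should cluster near whole multiples of the refresh period. A small sketch with made-up timestamps (this is my own helper, not an OpenSesame API):

```python
# Sanity check: are inter-flip intervals close to whole multiples of
# the refresh period? Hardware photodiode tests are still needed to
# confirm that timestamps match the physical display update.

def vsync_locked(timestamps_ms, period_ms=1000.0 / 60, tol_ms=2.0):
    """True if every inter-flip interval is within tol_ms of an
    integer number of refresh periods."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return all(
        abs(iv - round(iv / period_ms) * period_ms) <= tol_ms
        for iv in intervals
    )

# Clean 60 Hz flips (one per refresh) pass; a free-running clock fails:
good = [0.0, 16.7, 33.3, 50.0, 66.7]
bad = [0.0, 10.0, 20.0]
```

Note that this only checks software-side consistency; it cannot distinguish a timestamp taken at the start of the refresh cycle from one taken a fixed offset later.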
That looks pretty good, and it's backed by a solid name. If programming in OpenGL is any indication, creating an OpenGL ES backend might not be trivial, though. But at any rate, we can think about that later, and perhaps ask some help from Daniel or Florian (Expyriment), who have some experience in this department.
I'm apparently not the only one interested in this:
http://akiraoconnor.org/2013/02/01/using-the-raspberry-pi-to-run-cognitive-psychology-experiments/
Ah, indeed. Great to see that they mention OpenSesame!
Thanks Clayton, I just received it today!
Well, a few days and a few benchmarks later: http://www.cogsci.nl/blog/miscellaneous/216-running-psychological-experiments-on-a-raspberry-pi-with-opensesame