Has anyone conducted a reading study using PyGaze (ideally with an SMI Hi-Speed eyetracker)?
We're completely new to OpenSesame and would really appreciate some expert advice...
Thanks for any tips,
Perhaps the best first step would be to get acquainted with OpenSesame through an easy tutorial: http://osdoc.cogsci.nl/tutorials/attentional-blink/
Here, make sure you understand the mechanics of the inline_script item; this is where you write all the code essential for your experiment. All the relevant PyGaze functions are found here: http://osdoc.cogsci.nl/devices/pygaze/#function-overview
Right above this list of functions is an example that you can try to run using an inline_script. The idea of this example is to track the eye position and place a dot onscreen at the gaze position each time the spacebar is pressed. Once you've got this working, we can discuss your experiment in more detail!
thanks for the helpful tips so far. I think in general we are getting a good handle on OpenSesame, but what I'm a bit worried about is that we might spend a lot of time getting to know the program and still in the end find that it is not really suited for a reading study with our eyetracker. Any thoughts on this (ideally reassuring ones :-) i.e. that anyone has worked with reading measures and/or with an SMI Hi-Speed before)?
For instance, one question that has come up so far is that we haven't yet found whether we can present text instead of images as stimuli - our items consist of 34 sentences. One possible workaround would of course be to create images of our sentences, but I'm not sure whether this will make the analysis of reading measures (mainly fixation times on individual words) problematic?
Well, reading is my field of research, and I have worked intensively with an eye tracker (an EyeLink) and OpenSesame over the last year, so I can reassure you that OpenSesame is well suited to this kind of study, and also that it's not difficult to learn.
Yes, of course you can present texts as stimuli. I suggest you browse this page for a bit: http://osdoc.cogsci.nl/
All of OpenSesame's functionality is documented here. Especially important are the canvas functions (under the "Python Inline Script" category), including one that generates text.
Further, there is a text_size() function, through which you can calculate coordinates, for instance to determine the onset of a fixation on your target word, or for using Rayner's boundary technique (which I'm sure you're familiar with?).
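To make the coordinate idea concrete, here's a minimal sketch of computing word-level interest areas for a sentence drawn as text. It's hedged: in a real OpenSesame script you'd get pixel widths from the canvas's text measurement, whereas here a hypothetical fixed character width stands in for that, so the numbers are purely illustrative.

```python
# Sketch: word-level areas of interest for a sentence drawn as text.
# CHAR_W is a hypothetical fixed per-character pixel width (as in a
# monospaced font); in practice you'd measure widths with text_size().

CHAR_W = 11  # assumed pixel width per character

def word_regions(sentence, x_start=100):
    """Return a list of (word, x_left, x_right) interest areas."""
    regions = []
    x = x_start
    for word in sentence.split(" "):
        w = len(word) * CHAR_W
        regions.append((word, x, x + w))
        x += w + CHAR_W  # advance past the word plus the trailing space
    return regions

def word_at(regions, gaze_x):
    """Map a horizontal gaze coordinate to the word it falls on (or None)."""
    for word, left, right in regions:
        if left <= gaze_x < right:
            return word
    return None

regions = word_regions("the cat sat on the mat")
# The left edge of the target word can double as the invisible boundary
# for Rayner's boundary technique:
boundary_x = regions[5][1]  # left edge of "mat"
```

With regions like these, classifying each fixation sample by word (for first-fixation or gaze durations) is just a lookup with `word_at(regions, fix_x)`.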
In sum, I haven't had difficulties in getting a sentence reading experiment to run; I'm sure you guys will manage to do the same - and in the meantime I'll be here in case you come across problems.
great, thanks, that's the reassurance I needed to keep going :-)
We'll be in touch as soon as we have a basic experiment running - thanks for your help already!
Okay, one quick maybe trivial question:
So far we've used our tracker (SMI iViewX Hi-Speed) in a two-PC setup: a display PC presents the stimuli to the participant (using either E-Prime or SMI Experiment Center), and an experimenter PC drives the experiment, calibration, etc. Is this possible with PyGaze too, or would you recommend using a one-PC setup?
It looks like a 2 PC setup should work: In pygaze_init, you can enter the IP address of the computer that is connected to the SMI.
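For reference, the connection settings boil down to three values. Below is a hedged sketch of what they might look like in a scripted (non-OpenSesame) PyGaze setup; the exact constant names can differ by PyGaze version, the IP address is a placeholder for your own network, and the ports shown are only the commonly used iViewX defaults, so check your iViewX configuration.

```python
# Sketch of SMI connection settings for a two-PC setup.
# All values below are assumptions/placeholders, not verified defaults
# for any particular installation.

TRACKERTYPE = "smi"        # use the SMI iViewX backend
SMI_IP = "192.168.1.2"     # placeholder: address of the PC running iViewX
SMI_SEND_PORT = 4444       # port iViewX listens on (a common default)
SMI_RECEIVE_PORT = 5555    # port gaze data is sent back on (a common default)
```

In the OpenSesame GUI these correspond to the SMI IP address, send-port, and receive-port fields of the pygaze_init item.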
I feel I need to add a disclaimer though: SMI support is part of PyGaze, but it is not an eye-tracker that we often (or at all, in my case) use. So I would first make sure that you can get the connection with the SMI to work in the context of a dummy experiment. And, if not, report back so that we can help you.
Once you've got the connection with the SMI, building the experiment won't be too difficult, as @Josh already said.
great, thanks. We'll try that step by step, whenever the lab is free.
Don't worry about the disclaimer; your documentation made it fairly clear that SMI (in particular I think the Hi-Speed column rather than the remote RED version) is not the most supported eyetracker. That's why my first question was so tentative, asking whether anyone else has experience with it. But it sounds doable and we've got several people here interested in getting the setup working now, so I hope we'll get it running relatively soon.
Hi Josh, Sebastiaan (and anyone else)
while my students are busily creating the basic experiment, I've been trying to get the connection with the eyetracker working. So far, they are more successful than me...
I don't quite understand Sebastiaan's suggestion of "get the connection with the SMI to work in the context of a dummy experiment" - isn't the idea of a dummy experiment that it doesn't require a connection with the SMI? This seems to be where I am stuck at the moment: I added the inline that Josh suggested at the start of the current experiment (see overview.jpg here: http://img.cogsci.nl/?q=564a1555cbcb8). If I do this as a dummy experiment (http://img.cogsci.nl/?q=564a16210b35f) it runs successfully, but not much happens - which I assume is obvious because the inline relies on the info from the tracker. If I set the experiment to SMI (http://img.cogsci.nl/?q=564a164fb8bb4), it fails with the error message visible on the image.
Any ideas what I may be doing wrong? Happy to provide any further information required.
I think what Sebastiaan meant by a dummy experiment is just trying to get the connection with the eyetracker going and creating something simple to 'prove' it. As was done in this example: http://osdoc.cogsci.nl/devices/pygaze/#example
The dummy setting in the pygaze_init item is, as you said, for testing your experiment without the eye tracker (which can of course also be useful). In that case it's more useful to set it to 'advanced dummy' rather than 'simple dummy', though: with the advanced dummy setting your mouse acts as the eye position. It's no surprise that nothing happened in your experiment with the simple dummy setting.
thanks for the quick response and for clarifying about the dummy version. This morning's update: the "simple dummy" works (unsurprisingly) for the example you posted, and the "advanced dummy" does as well (except that I could only move the mouse across the right side of the screen; maybe some setting is wrong somewhere?).
I've also made some progress on getting the example to work with the tracker set to "SMI" in pygaze_init. The forum post "[solved] problem loading iViewXAPI.dll" had two important suggestions which solved the previous problem, but now we get a new error message that I can't find any info about on the forum: "failed to bind sockets" (see http://img.cogsci.nl/?q=564b097ee10d5). Any ideas?
First to confirm what Josh said: With 'dummy experiment' I indeed meant a simple experiment to see if you can get the connection with the SMI working—not a full-fledged experiment with PyGaze in dummy mode (which, in a sense, is the inverse).
Regarding your error message: This indicates that pygaze cannot connect to the SMI. The most likely reason for this is that the settings are incorrect. Are you sure that you have correctly configured the SMI IP Address, SMI send-port number, and SMI receive-port number?
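One way to narrow down a "failed to bind sockets" error, independently of PyGaze: a socket can only be *bound* to an address that belongs to the local machine, so if the receive side is told to bind to the remote iViewX PC's IP (or the port is already taken by another program), the bind fails. The quick stdlib check below is a diagnostic sketch under that assumption; it doesn't use PyGaze at all.

```python
# Diagnostic sketch for "failed to bind sockets": check whether this
# machine can bind a UDP socket to a given (host, port). Binding only
# works on addresses that belong to the local machine, so trying to bind
# to the remote iViewX PC's IP will always fail, as will a port that is
# already in use by another program.

import socket

def can_bind(host, port):
    """Return True if a UDP socket can be bound to (host, port) locally."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.bind((host, port))
        return True
    except OSError:
        return False
    finally:
        s.close()

# Port 0 asks the OS for any free port, so this should succeed locally:
print(can_bind("127.0.0.1", 0))
# By contrast, can_bind("<iViewX PC's IP>", 5555) will fail unless that
# address happens to be one of this machine's own interfaces.
```

If `can_bind` succeeds on the configured receive port with the display PC's own address but the experiment still fails, the problem is more likely on the iViewX side (firewall, or iViewX not accepting the remote connection).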
Brief update in case anyone else has this problem in the future: We've spent several days now fiddling with this, sadly unsuccessfully.
Sebastiaan's comment about the correct settings for IP address and ports made a lot of sense. I had originally transferred the settings that work for us with the SMI Experiment Center two-PC setup. Since those didn't seem to work, we tried a whole lot of other settings. Judging from the error messages, I think the problem really is the two-PC setup I mentioned at the beginning: if I enter the IP address of the PC that runs the eyetracking software iViewX, I get the error message about failure to bind sockets (see above). If I try a one-PC setup (entering the IP address of the display PC), the message is that it can't find an iViewX installation, which makes sense, as iViewX isn't installed on that PC.
Today I tried to install iViewX on the display PC, but that doesn't seem to be included in the licence we originally bought with the tracker. I'm going to see if SMI is willing to extend our licence to the display PC, but if not (or if that requires extra funding which I don't have), I don't really see any further options for using this setup. I'd be really happy if anyone does come up with a solution now or in future, and will post if we ever make any progress.