PyGaze and OpenSesame support for the new Tobii Pro SDK

Edwin Posts: 638

It's finally here, people! Straight from the Tobii developers themselves. For more info, see this post: http://www.pygaze.org/2017/09/pygaze-support-for-tobii-pro-sdk/

Thanked by: sebastiaan

Comments

  • sebastiaan Posts: 2,813

    That's great!

    Just to clarify: this functionality is not included in the latest version of OpenSesame (3.1.9), but it can be added afterwards by updating PyGaze.

    Once I'm back in the office I'll take a closer look.

    There's much bigger issues in the world, I know. But I first have to take care of the world I know.
    cogsci.nl/smathot

  • neon Posts: 61

    That's super-good news! I look forward to using this soon.
    Thanks Tobii, thanks @edwin and thanks @sebastiaan !

    Neon

  • twitch202 Posts: 1

    Hello guys :)
    Quick question: how do you update PyGaze to the newest version? I have tried "import pip; pip.main(['install', 'python-pygaze', '--upgrade'])", but it returns a message saying it is already up to date.

    Would you just replace the old files within the OpenSesame folder with the new ones from pygaze.zip?
    Would greatly appreciate your response!

  • sebastiaan Posts: 2,813

    The version of PyGaze on PyPI hasn't been updated yet, but indeed: manually replacing the current PyGaze package with the contents of pygaze.zip (linked to in Edwin's post) should do the trick as well.
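
    In case it helps, here's a minimal sketch (not an official procedure) for finding out which folder to replace, assuming PyGaze is importable from the Python environment that OpenSesame uses:

        import os
        import pygaze

        # Print the folder of the installed pygaze package; this is the folder
        # whose contents you would replace (after backing it up) with the
        # contents of pygaze.zip.
        print(os.path.dirname(pygaze.__file__))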

    There's much bigger issues in the world, I know. But I first have to take care of the world I know.
    cogsci.nl/smathot

  • brusil Posts: 13

    Hi, @edwin and @sebastiaan,

    I would like to, once again, thank you both for OpenSesame and PyGaze! Great tools!

    I'm new to both these tools and to eye-tracking, so please forgive any rookie mistakes on my part.

    Since I'm working with a Tobii eye-tracker, I am very much interested in taking advantage of the new Tobii SDK support.

    I have been able to use the new Tobii SDK in both PyGaze (alone) and OpenSesame.
    To use the new Tobii SDK support in OpenSesame, I manually updated (replaced) the PyGaze package to match the latest changes in PyGaze's GitHub repository, installed both the old and the new Tobii SDKs, and updated (replaced) the OpenSesame plugins for PyGaze (so that both Tobii and Tobii-legacy appear in the pygaze_init item).

    While both the legacy and the new option run in OpenSesame, I'm getting (what to me looks like) some unexpected behavior with the new one:

    1. The "track status box", at the beginning of the calibration process, is much less responsive in OpenSesame than in PyGaze and sometimes the text disappears and it freezes. As far as I could tell this does not happen in PyGaze.

    2. The text and the calibration points cannot be seen if the background color is set to white. In PyGaze, the calibration screen colors appear to be independent of what is set in the BGC and FGC variables, so they remain visible.

    3. If the option "Automatically log all variables" in a pygaze_log item is checked, there is a very noticeable lag (of seconds) in the experiment. (At first I was convinced that the lag happened on keyboard responses, but this was due to the succeeding pygaze_log item.) With Tobii-legacy there is no noticeable lag. Even for small messages, logging events seem to take many times longer with the new Tobii SDK module. Looking at the time between a logging message marking the start of a stimulus display and the subsequent sketchpad (with the "Automatically log all variables" option unchecked) in the OpenSesame log, the difference is about 2.7 ms using Tobii-legacy and 44 ms using the new Tobii module.

    4. (This is the one that worries me the most.) I have not been able to align the eye-tracking data with the experimental events. I have created pygaze_log events immediately before and after the stimulus display and, while the difference between the timestamps of these events matches the display time of the stimulus (~3000 ms), the slice of data that falls between these timestamps is missing more than one second of data.
      Even if I ignore the events and look at the blocks of data between the big time jumps that signal the transition from one trial to the next, the "continuous" blocks of data are not long enough to contain the whole trial (from pygaze_start_recording to pygaze_stop_recording). My impression is that there is actually data missing.
      It is also strange that, looking at the times in the OpenSesame log, the difference between time_pygaze_start_recording and the subsequent item (a sketchpad with a fixation dot) is over 1000 ms, as if the eye-tracker were taking over one second to start recording, while with Tobii-legacy this is about 6 ms.
      I have tried this both in PyGaze and in OpenSesame with (apparently) the same result.
      The eye-tracking data logs are substantially different between the old and the new Tobii SDKs. I've been able to align the data with the events in the old one, but not in the new one.
      This might turn out to be just my inexperience, but I did spend a very long time trying to make sense of this data, without success.
      The following link contains a very simple experiment in OpenSesame, the log files generated using the old and the new Tobii SDKs, and Excel files illustrating what I describe here.
      https://www.dropbox.com/sh/rlg5jnftkdxujd7/AAAUzfWOx9zh5gAJdRUZpqlXa?dl=0

    Note: I'm using a Tobii Pro X2-60 eye-tracker on Windows 10, and the backend is PsychoPy.

    Sorry for the very long message!
    This all needs to be verified by someone who actually understands a little about this...
    Any help will be much appreciated! Thanks!

    Cheers,
    Bruno

    Thanked by: neon

  • neon Posts: 61

    Hi @brusil,

    I'm in a similar position in that I'm using an X2-60 with OpenSesame and the PyGaze plugin; here at work I'm running Windows 8.1 rather than 10, but other than that, it's pretty similar.

    I've run the latest version of PyGaze (by updating manually from the aforementioned pygaze.zip) and so far have noticed results similar to yours for some things, namely:

    1. Yes, the track-status box seems a little jerky and erratic, but I've not found it unusable, nor have I noticed it freeze.

    2. The text and the calibration points cannot be seen if the background color is set to white - yes, this is a bit of a nuisance for me because I tend to use white on black by default. @edwin - do you think this is something that might easily/likely be addressed in the future?

    3. I don't usually log all variables in pygaze so haven't noticed this behaviour.

    4. I haven't had a decent look at what's come out yet (just to see if something has) as I just wanted a quick check that the new plugin was working at a basic level. I did consider the best way to record in a previous post, though, and concluded that I'm better off recording data for the whole trial block (including fixations, mask screens, etc.) rather than repeatedly using pygaze_start_recording and pygaze_stop_recording. I'll record an identifier in the log file so I can process the results later (roughly as in the sketch below).
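
    For what it's worth, by "record an identifier" I mean something like the following inline_script sketch. It assumes a pygaze_init item has already run (so the eyetracker object is available in inline scripts) and that trial_id and condition are hypothetical loop-table variables, used here only for illustration:

        # Rough sketch for an inline_script placed just before the trial;
        # 'eyetracker' is provided by pygaze_init, and 'trial_id'/'condition'
        # are hypothetical loop variables used only for illustration.
        eyetracker.log('start_trial %s %s' % (var.trial_id, var.condition))

        # ...and in another inline_script just after the trial:
        eyetracker.log('stop_trial %s' % var.trial_id)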

    If you've made any more progress, do let me know. I'll also have a more in-depth look at what's happening toward the end of the week.

    Best wishes to all,

    Neon

    Thanked by: brusil

  • brusil Posts: 13

    Hi @neon,

    I'm very glad to know that we are using the same eye-tracker model and that you'll look into this situation. Thanks!

    I haven't made any progress regarding the use of the new Tobii Pro SDK; I simply decided to use the old one.

    I'm also recording the whole trial block. In order to recognise when a stimulus is being presented I use a pygaze_log item immediately before and after the sketchpad with the stimulus, with the messages "start_stim" and "stop_stim". I then use the timestamps of these messages in the eye-tracking data file to get the slices of data corresponding to the stimulus display.
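
    In case it's useful to anyone, the slicing step looks roughly like the sketch below. This is only illustrative: the column names ('time', 'msg') and the file name are assumptions, not the actual PyGaze/Tobii output format, so they need to be adjusted to the real file:

        # Rough sketch: keep the samples recorded between the 'start_stim' and
        # 'stop_stim' messages. Column names and file name are assumptions.
        import pandas as pd

        data = pd.read_csv('subject-0.tsv', sep='\t')
        t_start = data.loc[data['msg'] == 'start_stim', 'time'].iloc[0]
        t_stop = data.loc[data['msg'] == 'stop_stim', 'time'].iloc[0]
        stim = data[(data['time'] >= t_start) & (data['time'] <= t_stop)]
        print(len(stim), 'samples during the stimulus display')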

    I have created a new discussion regarding these issues.

    I am now working on parsing and analysing the (test) data.
    I've looked into PyGaze Analyser but it does not have a reader for the Tobii output and, as far as I could tell, it is mainly directed at producing raw/fixation/scanpath/heatmap plots, one participant at a time, and, as the source code warns, it employs a "very crude fixation and blink detection".
    I am working on a psychology experiment measuring attention bias between two classes of stimuli, so I require good saccade/fixation detection, and I'm more interested in attention measures over groups of participants, such as time to first fixation, first fixation duration, overall fixation duration, etc.
    Apart from the actual event filters, these measures do not seem too hard to implement, but I was hoping to find ready-to-use solutions both to parse the output and to filter the events...
    I'm formatting the data to meet eyetrackingR input requirements but I'm not sure that it will meet my needs either.
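
    For what it's worth, the measures themselves are simple to compute once fixations have been detected, whichever detector is used. A toy sketch, assuming fixations come as (start_time, end_time, x, y) tuples in chronological order and the AOI is a (left, top, right, bottom) rectangle in pixels (both formats are assumptions for illustration, not any particular library's API):

        # Toy sketch of AOI-based attention measures; the fixation tuple format
        # and the AOI rectangle are assumptions, not a specific library's API.
        def aoi_measures(fixations, aoi, stim_onset):
            left, top, right, bottom = aoi
            # Keep only fixations whose position falls inside the AOI
            # (fixations are assumed to be chronologically ordered).
            hits = [f for f in fixations
                    if left <= f[2] <= right and top <= f[3] <= bottom]
            if not hits:
                return None  # no fixation ever landed in the AOI
            first = hits[0]
            return {
                'time_to_first_fixation': first[0] - stim_onset,
                'first_fixation_duration': first[1] - first[0],
                'total_fixation_duration': sum(f[1] - f[0] for f in hits),
            }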

    If you don't mind me asking, what software do you use to parse and analyse your data, @neon? Thank you!

    Best regards,

    Bruno

  • Edwin Posts: 638

    Hi all,

    Thank you for giving the new code a spin, and huge thanks for your feedback!

    Unfortunately, I'm afraid I won't be able to provide much help, as I don't have access to a Tobii tracker. I do have a feeling that the people at Tobii would greatly appreciate your feedback, and they might also be able to help you. My advice would be to give them a shout via GitHub: either through this issue, or by pinging them in a new issue (open a new issue on the PyGaze repository, and then '@'-mention the Tobii developers from the linked issue).

    They're really nice people, and were quite enthusiastic about this project, so I have good hopes they'll be responsive :)

    Cheers,
    Edwin

    Thanked by: brusil, neon