edited November 2017 in PyGaze

Hi,

I typically use the old PyLink or the most recent PyGaze plugin for my eye-tracking experiments, which have always worked fine. Now, for my upcoming fMRI project, I will need to use the Cambridge Research Systems LiveTrack AV. Judging from the dropdown menu in the PyGaze plugin, controlling this particular device is not yet implemented in OpenSesame via a plugin, is that correct? Are there any plans to implement this any time soon?
Related to that, has anyone used OpenSesame to control the Cambridge Research Systems LiveTrack AV eye tracker? Any prior experience using this system with Python (rather than the more traditional MATLAB + Psychtoolbox approach) would also be highly appreciated!

Thanks,
michif

PS: Is there any way to put this post in multiple categories (e.g. OpenSesame and Pygaze)?

Comments

  • Hi Michif!
    Did you in the meantime have a chance to try the CRS system with OpenSesame?
    I have a similar question and would be very interested to know whether it works... (http://forum.cogsci.nl/index.php?p=/discussion/3849/mangoldvision-eye-tracker-and-crs-livetrack-fm)

    Chiara

  • Hi Michif,

    We currently don't have plans in that direction, because none of us have access to a Cambridge Research Systems Livetrack system. I'd be happy to guide you (or anyone who is interested) through the process of writing PyGaze support for a new eye tracker, if you're interested in programming it yourself.

    Cheers,
    Edwin
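    For anyone considering Edwin's offer: a new PyGaze backend is essentially a class that implements the tracker interface methods. Below is a minimal, hypothetical skeleton for a LiveTrack backend. The class and method names are based on the kind of interface PyGaze's `BaseEyeTracker` defines; check the current PyGaze source for the authoritative list, and note that all device communication here is stubbed out.

    ```python
    class LiveTrackTracker:
        """Hypothetical sketch of a custom PyGaze-style backend for the
        CRS LiveTrack AV. All hardware interaction is stubbed; a real
        implementation would talk to the device's HID interface."""

        def __init__(self, display):
            # `display` would be the PyGaze Display instance.
            self.display = display
            self.recording = False
            self._log = []

        def connected(self):
            # A real backend would query the device; this stub always
            # reports a connection.
            return True

        def calibrate(self):
            # Device-specific calibration routine would go here.
            return True

        def start_recording(self):
            self.recording = True

        def stop_recording(self):
            self.recording = False

        def sample(self):
            # A real backend would return the latest (x, y) gaze
            # position read from the device; this stub returns a
            # fixed dummy position.
            return (0, 0)

        def log(self, msg):
            # Record a message alongside the gaze data stream.
            self._log.append(msg)
    ```

    This is only a sketch of the shape of the work, not a working driver; the hard part is filling in the stubs with actual LiveTrack communication.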

  • As it turns out, the CRS system is not very accessible for those who do not use their viewer software or MATLAB.
    We figured out one option to at least get the timings aligned: you can connect your stimulus PC via a parallel-port/coaxial connection to the coaxial input on the eye tracker, and then manually send binary trigger events from Python. Alternatively, you may find that the MRI scanner is already hooked up to the coaxial input of the eye tracker; in that case, the scanner triggers are synced with the eye tracker. As long as you log the timings of the MRI pulses in OpenSesame, it should be no problem to track what happened when.
    I haven't found a way to make OpenSesame experiments gaze-contingent with this eye tracker. From what I understand, that would require programming your own API to interact with the HID interface, which is quite some work.
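    The logging half of that approach (recording MRI pulse times on the stimulus PC so eye-tracker data can be aligned offline) can be sketched in plain Python. This is an illustration, not OpenSesame's own API: the `PulseLogger` class and its method names are made up for this example, and in a real experiment `on_pulse()` would be called from whatever handler receives the scanner trigger (often a simulated keypress).

    ```python
    import time

    class PulseLogger:
        """Illustrative helper: record MRI scanner pulse times relative
        to experiment start, so eye-tracker events synced to the same
        pulses can be aligned offline."""

        def __init__(self):
            # Use a monotonic clock so timestamps are unaffected by
            # system clock adjustments during the run.
            self.t0 = time.monotonic()
            self.pulses = []

        def on_pulse(self):
            # Call this from your scanner-trigger handler.
            self.pulses.append(time.monotonic() - self.t0)
    ```

    With every pulse timestamped on the stimulus PC, and the same pulses arriving at the eye tracker's coaxial input, the two recordings share a common set of alignment points.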
