Custom made eye-tracker in OpenSesame

edited July 2016 in PyGaze

Hello there,

I would like to use OpenSesame to control a simultaneous EEG and eye-tracking paradigm. Our lab bought a custom-made eye-tracking device that came with proprietary software. Unfortunately, that software gives us no way to use the two systems together: there is no way to sync EEG and eye-tracker events.

Is there any way OpenSesame could be used for my paradigm?

Thank you for your time.

Comments

  • Hi Michael,

    Couple of questions to get the full picture:

    1) What eye tracker did your lab buy?

    2a) What software does it use?

    2b) Are there Python bindings for the software mentioned at 2a?

    2c) If the answer to 2b was "No", then is there an alternative option? For example, there could be a generic API, or a DLL with functions to communicate with the tracker.

    3) How much programming experience do you have in general, and how much in Python?

    4a) What kind of EEG equipment are we talking about?

    4b) What kind of acquisition software do you use with your EEG system?

    4c) How do you communicate with the EEG acquisition software?

    5) What do you need from the linked gaze and EEG? Ideally, we would like the answers to the following questions:

    • Would you like to co-register events to both log files?
    • Do you need to log eye events (e.g. fixations, blinks, and saccades) to the EEG log?
    • Do you need the experiment to be gaze contingent?
    • Do you need the EEG to be streamable, e.g. for biofeedback? Or are you fine with just sending triggers to the EEG log file?

    Cheers,

    Edwin

  • Hi Edwin, thank you for your reply.

    1) A unique, fully working prototype of a custom-made eye tracker built at our technical faculty.
    2a) Some proprietary VB.NET software. (I did once manage to get video out of the tracker, I believe with PyGaze or something similar, but I haven't had time to try it again since.)
    3) I develop EEG paradigms in E-Prime, Presentation, and OpenSesame, and I also code a toolbox for EEGLAB. With Python: a little. With Python for eye tracking: none.
    4a) Biosemi ActiveTwo with 64 active channels
    4b) ActiView (based on LabVIEW)
    4c) Parallel port
    5) It would be great to have co-registered events in the EEG (from both the stimuli and the tracker). It would also be very useful to send a trigger to the EEG whenever a participant keeps their eyes closed for too long, so I know which trials to remove from the analysis because they weren't watching. Eye events, for sure. The tracker doesn't need to be streamable; I'm fine with sending triggers via the LPT port, the same way the EEG does, so all the info ends up in the recording.

    Cheers,

    Michael

  • Thanks for clearing some of that up.

    1) Check. Sounds like a cool project!

    2) You guess you got video from it, but you're not sure what package you used to do it? I don't quite follow. Also, getting the video isn't very helpful if the device can already do eye tracking. More importantly: How do you access the VB.NET software? Is it a server that's running in the background, or is it nothing more than an API that you can use in your own applications in VB? In other words: How do you communicate with the tracker?

    3) Python might be worth investing some time in, as this project will likely require scripting a new Python library.

    4) That's great! There's support for parallel port communications through plug-ins and/or inline scripting. See here for more info.

    5) Ah, that's good, as it reduces complexity. For every event in the experiment, you'd need to send a simultaneous trigger to the data files for gaze and EEG. That's simple enough, and only requires you to know how to record a message in the eye tracker's log file.
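    A minimal sketch of that co-registration step, in Python. Everything here is illustrative: the two Dummy classes stand in for a real parallel-port driver and for the tracker's message log, which you would swap in inside an OpenSesame inline_script; `send_trigger` and the message format are assumed names, not part of any real API.

```python
import time

class DummyParallelPort:
    """Stand-in for a real parallel-port driver; records what was sent."""
    def __init__(self):
        self.sent = []
    def set_data(self, value):
        self.sent.append(value)

class DummyTrackerLog:
    """Stand-in for the eye tracker's message log."""
    def __init__(self):
        self.messages = []
    def log(self, message):
        self.messages.append(message)

def send_trigger(port, tracker_log, code, pulse_s=0.01):
    """Write `code` to the EEG port and the tracker log, then reset the port."""
    timestamp = time.time()
    port.set_data(code)                       # EEG trigger line goes high
    tracker_log.log("TRIGGER %d %f" % (code, timestamp))
    time.sleep(pulse_s)                       # hold the pulse briefly
    port.set_data(0)                          # reset for the next trigger

port = DummyParallelPort()
log = DummyTrackerLog()
send_trigger(port, log, 64)
print(port.sent)                    # [64, 0]
print(log.messages[0].split()[:2])  # ['TRIGGER', '64']
```

    The point of the shared `timestamp` is that both data files end up with the same event code at (nearly) the same moment, which is all the offline co-registration needs.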

    For registering eye events, you'll also need to monitor the gaze data online, and send a trigger to the EEG file whenever you find something. That's more tricky, as it would require you to stream and process data from the tracker (unless it already has a built-in event detection that you could use).
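    As a rough illustration of that online monitoring, here is a Python sketch of a background thread that watches for "eyes closed too long" (the check Michael described). The sample source is simulated, and the polling interface, callback, and threshold are all assumptions; with a real tracker, `get_sample` would call into the tracker's API and the callback would send an EEG trigger.

```python
import threading
import time

def monitor_blinks(get_sample, on_long_blink, threshold_s=0.5,
                   poll_s=0.01, stop_event=None):
    """Poll gaze samples; call on_long_blink() once per over-threshold blink."""
    blink_start = None
    fired = False
    while stop_event is None or not stop_event.is_set():
        x, y = get_sample()
        if x is None:                      # missing gaze -> eyes closed/lost
            if blink_start is None:
                blink_start = time.time()
            elif not fired and time.time() - blink_start >= threshold_s:
                on_long_blink()            # e.g. send an EEG trigger here
                fired = True
        else:
            blink_start = None
            fired = False
        time.sleep(poll_s)

# Simulated session: gaze data, then 0.4 s of "eyes closed", then data again.
t0 = time.time()
def fake_sample():
    elapsed = time.time() - t0
    return (None, None) if 0.2 <= elapsed < 0.6 else (512.0, 384.0)

events = []
stop = threading.Event()
worker = threading.Thread(
    target=monitor_blinks,
    args=(fake_sample, lambda: events.append("LONG_BLINK")),
    kwargs={"threshold_s": 0.2, "stop_event": stop})
worker.start()
time.sleep(0.8)
stop.set()
worker.join()
print(events)  # ['LONG_BLINK']
```

    Detecting fixations or saccades online works the same way, just with a different test inside the loop (see the paper linked above for actual detection algorithms).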

    To answer your initial question: It does seem technically possible to use this hardware, provided you can actually interface with the eye tracker through Python. This is likely to involve quite a bit of work, depending on the complexity of the tracker's software. I imagine you'd have to build an application in Visual Basic that handles data processing. In Python you could use a socket connection to communicate with your VB application, for example to let it know that it should start/stop recording data, or to get the newest sample. Now that the data is available in Python, you should set up a separate Thread or Process to stream the data, and analyse whether an event is happening (for examples of online event detection algorithms, see this paper).

    Sounds like a fun, but challenging project. Good luck!

    Edwin

  • Edwin, thanks for your reply. Ad 2) The tracker uses USB 3.0 to communicate with a VB.NET app written by the tracker's authors. Ad 3) I hope to hack it another way :smile:, ad 4) I know :smile:, ad 5) Exactly.

    Is it possible for OpenSesame to communicate with another application? I know that E-Prime can open any app installed on your computer and send it a keyboard shortcut. It occurred to me that it would be much easier to open the app in the background by code, send F5 (or whatever) to start the eye-tracker recording, and send a trigger to the EEG to start recording. Then syncing the two wouldn't take so much effort. Am I right?

  • edited July 2016

    You can open a connection with any server using the socket module (native Python; you can import it in an inline_script). From a programming perspective, that's the cleanest way to do this. See here for an explanation on how to operate it.
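    A self-contained sketch of that socket approach: a small Python thread plays the role of the VB.NET bridge, answering line-based commands, and the client side is roughly what an inline_script would run. The command names (`SAMPLE`, `QUIT`) and the comma-separated reply format are assumptions, not part of any real tracker protocol.

```python
import socket
import threading

def fake_vb_server(sock):
    """Stands in for the VB.NET bridge: answers line-based commands."""
    conn, _ = sock.accept()
    with conn:
        f = conn.makefile("rw")
        for line in f:
            cmd = line.strip()
            if cmd == "SAMPLE":
                f.write("512.0,384.0\n")
            elif cmd == "QUIT":
                f.write("BYE\n")
                f.flush()
                break
            f.flush()

server_sock = socket.socket()
server_sock.bind(("127.0.0.1", 0))        # port 0 -> OS picks a free port
server_sock.listen(1)
port = server_sock.getsockname()[1]
server = threading.Thread(target=fake_vb_server, args=(server_sock,))
server.start()

# Client side: this is roughly what the experiment script would do.
client = socket.create_connection(("127.0.0.1", port))
cf = client.makefile("rw")
cf.write("SAMPLE\n"); cf.flush()
x, y = map(float, cf.readline().strip().split(","))
cf.write("QUIT\n"); cf.flush()
reply = cf.readline().strip()
client.close()
server.join()
server_sock.close()
print(x, y, reply)  # 512.0 384.0 BYE
```

    The same client code works unchanged whether the server is this Python stub or a real VB.NET application listening on a TCP port, which is why the socket route is the cleanest.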

    As for the hack you describe, this thread on Stack Overflow describes different ways of simulating key presses from Python. Opening an application is possible by simply placing a system call. See the second answer on this thread on Stack Overflow.
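    The system-call half of that hack can be sketched like this. The demo launches a trivial Python child process instead of the (hypothetical) tracker executable, so it runs anywhere; sending an actual keystroke afterwards would need a platform-specific tool or library (e.g. one of those from the Stack Overflow thread above), which is not shown here.

```python
import subprocess
import sys

def launch_app(command):
    """Start an external application without blocking the experiment."""
    return subprocess.Popen(command)

# Cross-platform demo: launch a child process and check it ran cleanly.
proc = launch_app([sys.executable, "-c", "print('recording started')"])
proc.wait()
print(proc.returncode)  # 0
```

    Because `Popen` returns immediately, the experiment can carry on while the recording application keeps running in the background.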
