SMI - is it possible to tell apart whether the tracker lost the eyes (blinking etc.) or the participant is looking off screen?
I am tracking saccades to the left or right as responses in my data, and I have found that blinking can trigger an erroneous saccade (the tracker thinks an eyetracker.sample() violated either the left or the right boundary).
I am hoping for a way to identify these responses in real time in my experiment. I currently have a method that partly works, but not completely. I have noticed that when the tracker loses contact with the eyes, eyetracker.sample() returns (0, 0). So at the moment I check for 10 ms after each response whether eyetracker.sample() ever returns (0, 0), and if so, whether it keeps returning (0, 0) for at least 100 ms, in which case I conclude the participant blinked and the blink erroneously triggered a response. This does identify blinks, but there's a problem.
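For concreteness, the confirmation check described above could be sketched roughly like this. The (0, 0) sentinel and the 100 ms window come from the post; the helper itself and its name are hypothetical, and `get_sample` stands in for whatever returns eyetracker.sample():

```python
import time

def is_blink(get_sample, invalid=(0, 0), confirm_ms=100, poll_ms=2):
    """Return True if the tracker keeps reporting the invalid position
    for at least confirm_ms milliseconds (a probable blink)."""
    start = time.time()
    while (time.time() - start) * 1000.0 < confirm_ms:
        if get_sample() != invalid:
            return False  # a valid gaze position resumed: not a blink
        time.sleep(poll_ms / 1000.0)
    return True
```

As the rest of the post explains, the weakness of this approach is that (0, 0) does not uniquely identify a blink.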
The problem with the above is that (0, 0) is not only returned when the tracker loses the eyes; it is also returned when participants look off screen. As my task requires participants to look left or right, they sometimes end up too far off screen afterwards, which also makes eyetracker.sample() return (0, 0), and so I erroneously flag a blink.
So I am wondering: is there any hope of telling blinks apart from looking off screen with PyGaze? iView does seem to know the difference: it reports a gaze position of (0, 0) when I look off screen but still shows that my eyes are tracked, whereas they disappear when I close them. If I could check whether the tracker completely lost the eyes for 100 ms, rather than checking whether the sample was (0, 0), I think I'd be sorted.
Hi guys, I found a solution that worked for me! I looked at pupil diameter rather than gaze position: when it is 0, the participant blinked. I have played around with it a bit and it seems very good at detecting blinks.
eyetracker.pupil_size didn't work for me, and for some reason I wasn't able to define a new method to get pupil diameter for the eyetracker instance from an inline script, so I modified the eyetracker.sample function (in libsmi.py) to pick it up:
The inline script goes like so right after my trial procedure:
Awesome! Thanks for sharing!