Lost samples with EyeTribe during video viewing
Dear OpenSesame developers,
I have been successfully using an EyeTribe eye-tracker to record gaze data with OpenSesame while viewing a video file. Thank you very much for the great plugins you have developed! They are amazing!
What I am trying to do now is optimize my experiment, because I am losing samples: I sometimes get up to 100 ms between two consecutive samples instead of the expected 33 ms at 30 Hz or 16 ms at 60 Hz (it happens at both sampling rates).
So what do you think: would the media_player_vlc plug-in or OpenCV be more efficient for avoiding lost samples?
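For the OpenCV option, I was thinking of first checking on its own whether frame decoding can keep up with the tracker, with a little standalone script like the sketch below (untested, and 'movie.avi' is just a placeholder for my file):

```python
# Standalone timing check (outside OpenSesame): can OpenCV decode frames
# fast enough to keep up with the tracker? 'movie.avi' is a placeholder name.
import time
import cv2

cap = cv2.VideoCapture('movie.avi')
decode_times = []
while True:
    t0 = time.time()
    ret, frame = cap.read()  # grab and decode the next frame
    if not ret:
        break
    decode_times.append((time.time() - t0) * 1000.0)
cap.release()

if decode_times:
    print('frames: %d, mean decode: %.1f ms, max decode: %.1f ms'
          % (len(decode_times), sum(decode_times) / len(decode_times), max(decode_times)))
```

If the maximum decode time stays well below 16 ms, I suppose decoding itself is not the bottleneck?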
Another solution could be to show my movie as a sequence of images (it consists of 2160 frames); roughly what I have in mind is sketched below, after my questions. Does that seem like a good idea?
Or maybe I am missing something, like a "Prepare" phase or something along those lines?
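To make the image-sequence idea concrete, here is roughly what I imagine for an inline_script. It is only a sketch and I have not tested it: I am assuming the OpenSesame 3 workspace functions canvas(), clock and pool, frames named frame_0001.png to frame_2160.png in the file pool, a 30 fps frame rate, and the PyGaze tracker being available as exp.pygaze_eyetracker.

```python
# Sketch: prepare all frame canvases first, then flip through them while
# polling the tracker, so no drawing work happens during playback.
n_frames = 2160
frame_dur = 1000.0 / 30  # ms per frame at 30 fps (assumed frame rate)

# "Prepare" part: build one canvas per frame up front
frames = []
for i in range(1, n_frames + 1):
    cnvs = canvas()
    cnvs.image(pool['frame_%04d.png' % i])  # hypothetical file names
    frames.append(cnvs)

# "Run" part: show each frame on schedule and keep sampling gaze in between
t0 = clock.time()
for i, cnvs in enumerate(frames, start=1):
    cnvs.show()
    while clock.time() - t0 < i * frame_dur:
        x, y = exp.pygaze_eyetracker.sample()  # poll current gaze position
```

I realise that preparing 2160 canvases up front might use a lot of memory, so maybe that part is a bad idea in itself?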
Thank you in advance for your thoughts!
C.
PS: I updated my preferences to "Notify me when people comment on my discussions", so I will know when you answer me (and it will stop me from posting, forgetting about the thread, and never replying after you helped me! Sorry for not getting back to you after you answered my previous post!)