
Drift Correction of the EyeTribe in Java

edited November 2016 in PyGaze

Hi,

I am using the EyeTribe for a project at the moment and have chosen Java as my main language (I didn't know about PyGaze when I started). I was wondering if there is a way to do drift correction in it? Or is there a way that I can use PyGaze's drift-correction procedure in the middle of my experiment when required?
I would be grateful if someone could point me in the right direction.

Thanks

Comments

  • Hi Paddy,

    The EyeTribe definitely supports Java. There's even a special Java SDK.

    However, even though we'd love to help, I'm not sure you will find much help on this forum, because we are mostly Python/OpenSesame/R/JASP centered. In general, Java is not used all that much by scientists (though outside of science it's huge, of course).

    Cheers,
    Sebastiaan

    PS. I changed the title of the discussion so that it's more descriptive.

    There's much bigger issues in the world, I know. But I first have to take care of the world I know.
    cogsci.nl/smathot

  • Yes, I understand that. I should probably have phrased my question differently: could someone explain how the code for drift correction works, so I can implement my own version in Java?

  • Ah right. Well, the drift correction for the EyeTribe is really more of a drift check: we check whether gaze is within a certain distance of a drift target, typically a fixation dot at the display center. If so, the experiment continues; if not, the user gets a chance to try again or to recalibrate. That's all.

    If you want true drift correction, you would do the same as above, but also store the gaze error: how much gaze deviated from the drift target. You then subtract this gaze error from all subsequent (x, y) samples.

    Is that helpful?

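    The check-then-correct logic described above could be sketched in plain Java roughly like this (a hypothetical, self-contained example; the class and method names are mine, not part of the EyeTribe SDK):

    ```java
    import java.awt.geom.Point2D;

    // Hypothetical sketch: a drift check that, on success, stores the gaze
    // error so it can be subtracted from all later samples (drift correction).
    public class DriftCorrector {

        private double offsetX = 0.0;
        private double offsetY = 0.0;

        // Drift check: is gaze within maxError pixels of the drift target?
        // On success, remember the gaze error for later correction.
        public boolean check(Point2D.Double target, Point2D.Double gaze,
                double maxError) {
            double errX = gaze.x - target.x;
            double errY = gaze.y - target.y;
            if (Math.hypot(errX, errY) > maxError) {
                return false; // too far off: let the user retry or recalibrate
            }
            offsetX = errX;
            offsetY = errY;
            return true;
        }

        // Drift correction: subtract the stored error from a raw sample.
        public Point2D.Double correct(Point2D.Double raw) {
            return new Point2D.Double(raw.x - offsetX, raw.y - offsetY);
        }

        public static void main(String[] args) {
            DriftCorrector dc = new DriftCorrector();
            // Target at display center (400, 400); tracker reports (300, 350).
            boolean ok = dc.check(new Point2D.Double(400, 400),
                    new Point2D.Double(300, 350), 150.0);
            // Stored error is (-100, -50), so correction adds (100, 50).
            Point2D.Double p = dc.correct(new Point2D.Double(310, 360));
            System.out.println(ok + " " + p.x + "," + p.y); // true 410.0,410.0
        }
    }
    ```

    Whether a failed check should trigger a retry or a full recalibration is up to your experiment logic, as described above.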

  • Yeah, that makes sense. I was thinking of implementing something along those lines.
    Just to confirm the gaze-error part: the gaze error would basically be the offset between the actual point and the reported gaze coordinate.
    So to implement the drift correction, I would do something like this: if I have a point on the screen at (400, 400) but the EyeTribe reports the coordinate as (300, 350), then I should add (100, 50) to all subsequent points. Correct? Sorry, just making sure, as accuracy is very important for the project I am working on.

  • edited December 2016

    So to implement the drift correction, I would do something like this: if I have a point on the screen at (400, 400) but the EyeTribe reports the coordinate as (300, 350), then I should add (100, 50) to all subsequent points. Correct?

    Yes, that's correct.

    I should say that opinions are divided about whether drift correction is useful in these kinds of set-ups. I think that drift correction was originally designed by SR Research to deal with their helmet-based eye trackers (the EyeLink I and II). These helmets tended to slip, leading to a systematic gaze drift in one direction. However, remote eye trackers (like the EyeTribe or EyeLink 1000) don't have such systematic drift, but rather a more-or-less random drift due to the head swinging back and forth a bit. For this reason, the EyeLink 1000 (by default) doesn't do drift correction, just a drift check.

    Doing correction probably doesn't hurt. But it may not actually be very useful if it's just correcting for random, non-systematic noise.

