
Weird pygaze calibration reports (constant distance value & absurdly high acc values)


My PyGaze calibration reports always turn out quite weird. For one thing, the screen distance is ALWAYS 57 cm, no matter which eye-tracking device I'm using. I've tried to find out why by inspecting the relevant files, but to no avail. The other issue is the accuracy, which is always around 700 px horizontally and 500 px vertically, no matter how well the device was calibrated. In the end, the calibration report doesn't matter all that much in my experiment specifically, since I've created a custom validation by which I judge the need for a recalibration. However, I would still like to find out why the results end up like this, since it would be more convenient to report the PyGaze calibration results in an article than to have to explain my custom validation :)

Can you give me a tip where to look for potential culprits? I think the coordinate settings may be to blame, but that's just a hunch.

Thanks in advance!


  • Dear @Edwin

I noticed the same problem as @ChrIm pointed out. For example, this is a calibration report on the Tobii T60 (screenshot omitted):

    While I'm pretty sure that I was following the targets closely, the accuracy report seems to show otherwise. It is interesting that those values were so close to the center of the screen. I was wondering if the reports did not show the absolute deviation from the targets but some kind of average position of samples?



  • Hi everyone,

    I had the same problem using Tobii Spectrum, any advice would be helpful.



  • Hi all,

    Screen distance can only be picked up by systems that support distance sensing. If yours doesn't, PyGaze falls back on a user-defined value, or (failing that) on the default of 57 cm. To define your screen distance, measure it first, and then add this to your script:

    # Set the screen-eye distance to 60.5 cm
    SCREENDIST = 60.5

    The same holds true for the screen size, which should be user-defined. For example, if it's 40.3x30.0 cm, include this in your script:

    SCREENSIZE = (40.3, 30.0)

    High values for "accuracy (in pixels)" mean that your eye tracker thought you were looking at a different place on the screen than where the central marker was presented. One very common reason for this is that people don't set the correct display resolution. For example, if your display is 1024x768 pixels, but you set 800x600, the tracker will report in the 1024x768 frame, whereas your experiment will display in the 800x600 frame, leading to mismatches. (The only exception, if you want to use a resolution different from your display's native one, is to set that different resolution in both the eye-tracking software and your experimental software.) To set your resolution correctly, include the following in your script (adjusted to your own resolution):

    # Set the resolution to 1024x768
    DISPSIZE = (1024, 768)
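    To see why these three constants matter together, here is a small sketch (plain Python, using the example values above; the function and the conversion are my own illustration, not part of the PyGaze API) that converts a pixel error into degrees of visual angle:

```python
import math

# Example values from the snippets above (adjust to your own setup)
DISPSIZE = (1024, 768)      # display resolution in pixels
SCREENSIZE = (40.3, 30.0)   # physical screen size in cm
SCREENDIST = 60.5           # eye-to-screen distance in cm

def px_to_deg(pixels, axis=0):
    """Convert a horizontal (axis=0) or vertical (axis=1) pixel
    distance to degrees of visual angle."""
    cm = pixels * SCREENSIZE[axis] / DISPSIZE[axis]
    return math.degrees(2 * math.atan(cm / (2.0 * SCREENDIST)))

# The ~700 px horizontal "accuracy" from the first post would be an
# implausibly huge error of roughly 26 degrees of visual angle:
print(round(px_to_deg(700), 1))
```

    This also shows why wrong geometry settings distort any accuracy figure expressed in degrees: the same pixel error maps to a very different visual angle if DISPSIZE, SCREENSIZE, or SCREENDIST is off.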



    PS. VERY IMPORTANT: The values for the speed and acceleration thresholds are ONLY used when event detection is set to "pygaze". Hence, you will not notice these values being off if you don't use the "wait_for_*" event functions. Even if you do use them, you'll only be affected if you're using "pygaze" event detection as opposed to "native".
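    As a concrete sketch, the relevant settings would look roughly like this (constant names as I recall them from PyGaze's defaults; please verify against the defaults file of your PyGaze version):

```python
# constants.py (a sketch; verify names/values against your PyGaze defaults)
EVENTDETECTION = "pygaze"  # software event detection; "native" uses the tracker's own
SACCVELTHRESH = 35         # saccade velocity threshold (degrees/second)
SACCACCTHRESH = 9500       # saccade acceleration threshold (degrees/second**2)
```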

  • Hi @Edwin ,

    Thank you for your reply! I have some follow-up questions. I'm now using the PyGaze plug-in in OpenSesame with EyeLink.

    1. I assume that DISPSIZE is automatically set to the experiment's resolution, and that I don't have to set it explicitly?
    2. Do any of the PyGaze settings (DISPSIZE, SCREENSIZE, SCREENDIST) override the settings in PHYSICAL.INI on the host PC? I have set them correctly in the .INI file, but I'm not sure whether they will be changed if I don't also set them correctly in OpenSesame.
    3. Finally, if my screen is 1920x1280 but I run experiments in 1024x768, which one should be the correct DISPSIZE?

    Thanks a lot for your help!


  • Perhaps @sebastiaan also knows answers to those questions above? :)

    1. Yes, the resolution you set in OpenSesame is passed as DISPSIZE to PyGaze.
    2. Nothing you set through OpenSesame or PyGaze changes the INI file, but they will be used as settings in PyGaze's computations. The INI file is not used for this.
    3. You should run your experiment in 1920x1280 if that is your monitor's native resolution. Alternatively, you could use a monitor with a native 1024x768 resolution if you really mean to use 1024x768. (I don't really understand why you wouldn't use the native resolution?)
  • Hi there,

    I'm using Tobii TX300, and apparently my Distance value in the log file is not getting updated by the calibration procedure. What should I do now?


  • Just to clarify: calibration procedure for Tobii seems to detect the visual distance correctly, it's just that the calibration value in the eyetracker log file is not getting saved accordingly.

  • Hi Art,

    If you're not using any of the implemented online event detection functions, you don't have to do anything.

    If you'd like Tobii to implement distance sensing, you could file a request on GitHub (tag the Tobii devs if you file an issue there: @grebdems and @pedrotari7).

  • Hi Edwin,

    Thanks for the reply. I am confused: during calibration (the initial step, where Tobii positions subject's eyes in its bounding box), distance is estimated correctly. This value is simply not saved by pygaze in its log file. I wonder how to retrieve it and have it stored in the log file. I thought this is all done by pygaze?

  • Hi Art,

    The question is whether it is saved and used later on. Examples:

    My SMI implementation in PyGaze estimates distance using the tracker, later saves it to the log file, and uses it in event detection.

    The Tobii implementation takes the default value, and it doesn't look like this is updated before being logged and used. The same is true in my earlier Tobii implementation (tobii-legacy).

    I referred you to GitHub, because Tobii developers maintain the Tobii implementation in PyGaze. They would be much quicker to implement the functionality you're requesting. (Also, I don't have access to a Tobii, so wouldn't be able to test any changes I make.)



  • OK, I think we managed to dig into the appropriate portions of the PyGaze code.

    Just one question: is there any way of making PyGaze detect saccades in Tobii output? I can't see how to enable it in the PyGaze settings.

  • Bump.

    I dug into the code in the meantime, but still cannot figure out how to make it log saccades as events in the Tobii log file. The saccade detection code in libtobii seems OK, and I have eventdetection set to the "pygaze" version, but it does not seem to work. I'm not a Python expert, so I can't really find the reason, or check whether it detects saccades online but doesn't save them, or whether it fails to detect them in the first place.

    I was also wondering whether any low-pass filtering is applied to the gaze data, as I see nothing like this in the code, apart from noise-calibration parameters being applied to the saccade detection thresholds.

    As I said before, this stuff is really only for online event detection. I don't really see why you'd want to continuously detect events and log them to the data file. You can use much better event detection algorithms offline, using the logged gaze samples.
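    For offline analysis, a minimal velocity-threshold saccade detector over logged (x, y) samples could look like this (a sketch, not PyGaze code; the threshold and sampling rate are assumptions you would tune for your own setup):

```python
import math

def detect_saccades(samples, vel_thresh=0.05, hz=300.0):
    """Offline velocity-threshold saccade detection (minimal sketch).

    samples: list of (x, y) gaze positions in pixels, one per frame.
    vel_thresh: velocity threshold in pixels per millisecond (an
      assumption; convert to deg/s via your screen geometry if needed).
    hz: sampling rate of the tracker in Hz.
    Returns a list of (start_index, end_index) pairs.
    """
    dt = 1000.0 / hz  # milliseconds per sample
    saccades = []
    in_sacc = False
    start = 0
    for i in range(1, len(samples)):
        dx = samples[i][0] - samples[i - 1][0]
        dy = samples[i][1] - samples[i - 1][1]
        vel = math.hypot(dx, dy) / dt  # pixels per millisecond
        if vel > vel_thresh and not in_sacc:
            in_sacc, start = True, i
        elif vel <= vel_thresh and in_sacc:
            in_sacc = False
            saccades.append((start, i))
    if in_sacc:  # saccade still in progress at end of trace
        saccades.append((start, len(samples) - 1))
    return saccades

# Fixation, fast jump, fixation:
trace = ([(100, 100)] * 10
         + [(100 + 40 * i, 100) for i in range(1, 6)]
         + [(300, 100)] * 10)
print(detect_saccades(trace))  # → [(10, 15)]
```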

    The same is true for filtering. Why would you want to do this during recording? While sensible for some analyses, ideally your raw data is as raw as possible, leaving you free to do whatever processing you need offline.
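    Likewise, offline smoothing can be as simple as a centered moving average over the logged samples (again a sketch, not PyGaze code; the window size is an arbitrary choice):

```python
def moving_average(samples, window=5):
    """Simple offline low-pass filter: centered moving average over a
    list of (x, y) gaze samples. Edges use a truncated window."""
    half = window // 2
    n = len(samples)
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        xs = [s[0] for s in samples[lo:hi]]
        ys = [s[1] for s in samples[lo:hi]]
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out

noisy = [(100, 100), (103, 98), (99, 101), (102, 100), (100, 99)]
smooth = moving_average(noisy, window=3)
```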

    Right. Is there any code available for analysing saccade kinematics from Tobii output with OpenSesame?

    Regarding filtering, I was basically wondering whether the Tobii SDK output is already filtered, but this seems not to be the case.

    But anyway, I also thought of using some online processing for fixation control. Do you know if anything is available along these lines, i.e. code that could be used with OpenSesame?
