Pygaze webcam eyetracker for mouse control

Is there a way to use the coordinates found by the PyGaze webcam eye tracker to control the mouse? I believe I will need to estimate the gaze point for this, but I am not sure how to derive it from the pupil and glint coordinates.

Comments

  • Hi,

    Do you already have a working eye-tracking setup? That is, are you able to track the eye and produce data, including gaze coordinates? If so, you can use exp.pygaze_eyetracker.sample() to obtain the current gaze coordinates and use them to control things on your screen.

    I don't think you can directly control the mouse, but you should be able to "make" your own mouse and move that one around.
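    For example, here is a minimal sketch of that idea, assuming a standalone PyGaze script with a working tracker configuration (in OpenSesame you would use the existing exp.pygaze_eyetracker instead of creating a new EyeTracker). It draws a dot at each sampled gaze position as a homemade cursor:

    ```python
    # Minimal sketch (assumes a working PyGaze installation and tracker
    # configuration): draw a dot at the sampled gaze position as a cursor.
    from pygaze.display import Display
    from pygaze.eyetracker import EyeTracker
    from pygaze.screen import Screen

    disp = Display()
    tracker = EyeTracker(disp)
    tracker.calibrate()
    tracker.start_recording()

    scr = Screen()
    for _ in range(600):           # roughly ten seconds at 60 Hz
        gx, gy = tracker.sample()  # current gaze position in display coordinates
        scr.clear()
        scr.draw_fixation(fixtype='dot', pos=(gx, gy))  # the "homemade" cursor
        disp.fill(scr)
        disp.show()

    tracker.stop_recording()
    tracker.close()
    disp.close()
    ```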

    Good luck,

    Eduard


  • Hey,

    The eye-tracking setup I have right now finds the pupil and glint coordinates in each recorded frame, so these coordinates refer to the location of the pupil within the cropped image of the eyes. When I simply translated these coordinates into screen coordinates, they were very inaccurate and clustered mostly in the center.

    I was wondering what the best way would be to get screen coordinates from webcam eye tracking. I already have the mouse pointer movement set up; I just need to feed it coordinates for every frame.

    I don't believe the EyeTracker.sample function works with the camera frames. Correct me if I am wrong, and let me know if you have any suggestions for finding the gaze point on the screen.


    Here is the reference for pygaze webcam eye tracker: https://www.pygaze.org/2015/06/webcam-eye-tracker/

  • Hey,

    > When I simply translated these coordinates into screen coordinates, they were very inaccurate and clustered mostly in the center.

    Do you have calibration data? It could be a scaling issue: the glint hardly moves in the camera image, far less than the corresponding change in gaze position on the display, so you need to apply some transformation from eye coordinates to screen coordinates. A calibration is essential in that process.

    I don't know how you interact with the webcam to get the samples. If you don't use PyGaze to create an EyeTracker object, then .sample() won't work. Can you share your method of recording the pupil and glint?
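    Just to illustrate the shape of such a pipeline, here is a hypothetical per-frame loop. Both helper functions are stand-ins: get_pupil_glint() for whatever your webcam detection returns, and eye_to_screen() for a calibrated mapping (the gain and offset below are made up for illustration; pyautogui is just one way to move the real pointer):

    ```python
    # Hypothetical per-frame loop: webcam sample -> calibrated mapping -> pointer.
    # Both helpers are stand-ins; replace them with your own pipeline.
    import pyautogui

    def get_pupil_glint():
        """Stand-in for your webcam detection; returns (pupil, glint) in image coords."""
        return (320.0, 240.0), (325.0, 243.0)

    def eye_to_screen(pupil, glint):
        """Stand-in for a calibrated mapping (see the regression sketch below);
        the gain and offset here are made up."""
        dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
        return 640 + 40.0 * dx, 360 + 40.0 * dy

    for _ in range(600):                    # one iteration per camera frame
        pupil, glint = get_pupil_glint()
        x, y = eye_to_screen(pupil, glint)
        pyautogui.moveTo(x, y)              # move the actual OS pointer
    ```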

    Eduard


  • The webcam-eyetracker is a separate project, and a toy one at that. You can find it here: https://github.com/esdalmaijer/webcam-eyetracker

    As indicated, you'd need to implement a proper calibration to translate pupil and glint coordinates in the eye image to on-screen coordinates. Usually, this entails the following:

    1) Show stimuli on the screen that you're trying to calibrate for. These are usually dots, presented at known locations.

    2) While you present each dot, you record pupil and glint coordinates.

    3) The pupil-minus-glint vector should relate (roughly) linearly to on-screen coordinates, so you can simply regress one on the other for each individual (see the sketch below).
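
    Here is a minimal sketch of that regression as a per-participant affine least-squares fit; the numbers below are placeholders standing in for data you would record during steps 1 and 2:

    ```python
    # Per-participant affine calibration sketch: pupil-minus-glint -> screen.
    # The arrays below are placeholder values standing in for recorded data.
    import numpy as np

    # One row per calibration dot: (pupil_x - glint_x, pupil_y - glint_y)
    eye = np.array([[-12.0, -8.0], [0.5, -7.5], [13.0, -8.5],
                    [-11.5, 0.0], [0.0, 0.5], [12.5, -0.5],
                    [-12.5, 8.0], [0.0, 7.5], [13.0, 8.5]])
    # Known on-screen dot positions (pixels) for a 3x3 calibration grid
    screen = np.array([[160, 120], [640, 120], [1120, 120],
                       [160, 360], [640, 360], [1120, 360],
                       [160, 600], [640, 600], [1120, 600]])

    # Affine model: screen = [eye, 1] @ W, fitted by least squares
    X = np.hstack([eye, np.ones((len(eye), 1))])
    W, *_ = np.linalg.lstsq(X, screen, rcond=None)

    def eye_to_screen(pupil, glint):
        """Map a new pupil/glint pair to estimated on-screen coordinates."""
        v = np.array([pupil[0] - glint[0], pupil[1] - glint[1], 1.0])
        return v @ W
    ```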
