[solved] playing video while collecting data with PyGaze

edited June 2016 in PyGaze

Hi,
I'm using PsychoPy and PyGaze to control an EyeLink eye tracker.
The examples and documentation show how to track gaze while presenting an image or text on the screen, but I'm wondering how I can track gaze while playing a video file (.avi).
I used the PsychoPy GUI to build a simple experiment that plays a video, then added a snippet handling the PyGaze part.
It shows a grey screen instead of the movie (but records the eye movements).
I think the problem is in eyetracker.EyeTracker(libscreen.Screen). Is there a way to tell PyGaze not to show anything on the screen?

Thanks,
Eshed

Comments

  • edited June 2016

    Hi Eshed,

    The PsychoPy API (not the Builder, but the underlying software library) has an option to present videos: the MovieStim object. In essence, PyGaze is a relatively simple wrapper around PsychoPy when you set the DISPTYPE to 'psychopy'. So you could attempt the following:

    import pygaze
    from pygaze import libtime
    from pygaze.display import Display
    from pygaze.screen import Screen
    from pygaze.eyetracker import EyeTracker
    
    from psychopy.visual import MovieStim
    
    # Initialise a Display instance, using the default settings.
    # (You can add a constants.py file to set these defaults.)
    disp = Display()
    
    # Get the handle to the active PsychoPy Window instance.
    win = pygaze.expdisplay
    
    # Initialise and calibrate an EyeTracker instance
    tracker = EyeTracker(disp)
    tracker.calibrate()
    
    # Initialise a PsychoPy MovieStim
    mov = MovieStim(win, 'testMovie.mp4', flipVert=False)
    
    # Add the MovieStim to a PyGaze Screen instance.
    # (The Screen object has a list of all its associated
    # PsychoPy stimulus instances; you can add custom
    # instances, like the MovieStim, and they will automatically
    # be drawn each time you fill and show the Display.)
    movscr = Screen()
    movscr.screen.append(mov)
    
    # Start recording from the eye tracker.
    tracker.start_recording()
    
    # Record the starting time, and log it to the tracker.
    t1 = libtime.get_time()
    tracker.log('START; time=%d' % (t1))
    t0 = libtime.get_time()
    
    # Run for the duration of the video
    while t1 - t0 < mov.duration * 1000:
    
        # Fill the Display with the Screen. The right
        # frame from the video will automatically
        # be selected by PsychoPy.
        disp.fill(movscr)
        # Show the Display, and record its onset
        t1 = disp.show()
        # Log the screen flip.
        tracker.log('FLIP; time=%d' % (t1))
    
    # Stop recording from the eye tracker.
    tracker.log('END; time=%d' % (t1))
    tracker.stop_recording()
    
    # Close the connection to the tracker.
    tracker.close()
    
    # Close the Display.
    disp.close()
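
    For reference, here is what a minimal constants.py could look like. This is just a sketch with illustrative values (adjust the resolution and tracker type to your own setup); DISPTYPE, DISPSIZE, FULLSCREEN, and TRACKERTYPE are the settings that the Display and EyeTracker classes pick up.

    # constants.py - minimal sketch of the PyGaze defaults (values are examples).

    # Use PsychoPy as the back-end, so that pygaze.expdisplay is a
    # PsychoPy Window you can pass to MovieStim.
    DISPTYPE = 'psychopy'
    # Resolution of the experiment display, in pixels (example value).
    DISPSIZE = (1920, 1080)
    # Run the experiment in full screen.
    FULLSCREEN = True
    # Tracker type: 'eyelink' for the real tracker, 'dummy' for testing.
    TRACKERTYPE = 'eyelink'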
    

    Cheers,

    Edwin

  • edited June 2016

    Thanks for the answer.
    I encountered a few problems when I tried to run your code.

    1. I had to manually update the pygaze package in the psychopy folder. Is that ok?

    2. In my code (line 56) I changed disp.fill(scr) to disp.fill(movscr), since scr did not exist. Is that correct?

    3. After changing your code I tried to run it in dummy mode and it worked well, but when I changed it to 'eyelink' it got stuck in the calibration menu (after calibrating). How can I continue to the actual experiment?

    4. What is the best way to write a log of continuous eye movements? Using liblog?

    Thanks for your help,

    Eshed

    import pygaze
    from pygaze import libtime
    from pygaze.display import Display
    from pygaze.screen import Screen
    from pygaze.eyetracker import EyeTracker
    from psychopy.visual import MovieStim
    
    from psychopy import visual, core, event
    from pygaze import libinput
    from pygaze import liblog
    
    # Initialise a Display instance, using the default settings.
    # (You can add a constants.py file to set these defaults.)
    disp = Display()
    
    # Get the handle to the active PsychoPy Window instance.
    win = pygaze.expdisplay
    
    # create logfile object
    log = liblog.Logfile()
    log.write(["trialnr", "trialtype", "endpos", "latency", "correct"])
    trialnr = 1
    trialtype = 1
    
    
    # Initialise and calibrate an EyeTracker instance
    tracker = EyeTracker(disp, trackertype='eyelink')
    keyboard = libinput.Keyboard(keylist=['space'], timeout=None)
    tracker.calibrate()
    
    # Initialise a PsychoPy MovieStim
    mov = MovieStim(win, 'jwpIntro.mov', flipVert=False)
    
    # Add the MovieStim to a PyGaze Screen instance.
    # (The Screen object has a list of all its associated
    # PsychoPy stimulus instances; you can add custom
    # instances, like the MovieStim, and they will automatically
    # be drawn each time you fill and show the Display.)
    movscr = Screen()
    movscr.screen.append(mov)
    
    # Start recording from the eye tracker.
    tracker.start_recording()
    
    # Record the starting time, and log it to the tracker.
    t1 = libtime.get_time()
    tracker.log('START; time=%d' % (t1))
    t0 = libtime.get_time()
    
    # Run for the duration of the video
    while mov.status != visual.FINISHED:
    
        # Fill the Display with the Screen. The right
        # frame from the video will automatically
        # be selected by PsychoPy.
        disp.fill(movscr)
        # Show the Display, and record its onset
        t1 = disp.show()
        # Log the screen flip.
        tracker.log('FLIP; time=%d' % (t1))
        if event.getKeys(keyList=['escape','q']):
            win.close()
            core.quit()
    
    # Stop recording from the eye tracker.
    tracker.log('END; time=%d' % (t1))
    tracker.stop_recording()
    
    # Close the connection to the tracker.
    tracker.close()
    
    # Close the Display.
    disp.close()
    
    
  • edited 2:37PM

    Dear Eshed,

    1) Yeah, that sounds fine.

    2) Woops, sorry, my bad! Thanks for letting me know. I've fixed it in the original post now. Happy it worked after that minor change, though! :)

    3) When in the EyeLink menu, press the Q button on the stimulus PC to return to the experiment. Alternatively, you can also press the Escape button on the EyeLink PC. (On a side note: directly after calibration and validation, you sometimes return to a black screen instead of the actual menu. That's an issue with SR Research's underlying pylink toolbox. If the screen is black but you expected the menu, hit Enter to bring up the eye image, and then hit Enter again to bring back the menu.)

    4) You don't have to do this by hand; the EyeLink class will take care of it. Once you call the start_recording method, data will continuously be logged to an EDF file. You can pause recording by calling the stop_recording method. This is illustrated in the example above.
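
    (If you nevertheless want a plain-text log of gaze samples alongside the EDF, a rough sketch is below. It reuses the names from the example above and simply polls the tracker on every screen flip, so it samples at your display's refresh rate rather than the tracker's; the EDF remains the authoritative record.)

    # Rough sketch: polling gaze samples into a text log during the movie.
    # Reuses tracker, disp, movscr, mov, and libtime from the example above.
    from pygaze import liblog

    samplelog = liblog.Logfile()
    samplelog.write(['time', 'x', 'y', 'pupil'])

    tracker.start_recording()
    t0 = libtime.get_time()
    t1 = t0
    while t1 - t0 < mov.duration * 1000:
        disp.fill(movscr)
        t1 = disp.show()
        # sample() returns the newest (x,y) gaze position, and
        # pupil_size() the newest pupil size.
        gx, gy = tracker.sample()
        samplelog.write([t1, gx, gy, tracker.pupil_size()])
    tracker.stop_recording()
    samplelog.close()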

    Cheers,

    Edwin

  • edited 2:37PM

    Hi,

    A few more problems still occur.

    1. When I use your while loop (while t1-t0 < mov.duration) with either trackertype='dummy' or trackertype='eyelink' under EyeTracker, it stops showing the movie after less than a second and raises the error: Dummy instance has no attribute 'close_recording', or: libeyelink instance has no attribute 'close_recording'.

    2. When trying to run your code I can only use the EyeLink's keyboard, not the experiment computer's. How can I solve this issue?

    3. After calibrating and validating, pressing Q on either of the keyboards does nothing. Esc on the EyeLink keyboard sends me to the noise calibration screen. After I press start, the movie plays for less than a second because of the attribute error.

    Thanks again for the help,

    Eshed

  • edited 2:37PM

    That was really sloppy of me, sorry. The correct function to call is stop_recording. I've updated the scripts above. That should solve points 1 and 3.

    As for 2: You should be able to also use the experiment computer's keyboard for everything, from calibrating the EyeLink to providing input. What exactly is it that you can't use it for, and what kind of setup do you have? Is it a standard USB-connected keyboard?

    Cheers,

    Edwin

  • edited 2:37PM

    Hi Edwin,
    Sorry to bother you so much with this subject, but I still have some bugs.

    1. When running the experiment now I do get an EDF file, but I see the first frame of the video for a few seconds and then it quits the experiment (saving the EDF file with about 0.3 seconds of data).

    2. When I use my while loop, which is different from yours, it works fine. Is there a reason to insist on doing it your way (code attached below)?

    3. I added an if statement inside the while loop to be able to terminate the experiment gracefully. Is that the right way to do it?

    4. I can only start the calibration using the EyeLink computer's keyboard, which is connected by USB to the EyeLink computer. The experiment computer's keyboard is a wireless keyboard.

    Thanks again,
    Eshed

    import pygaze
    from pygaze import libtime
    from pygaze.display import Display
    from pygaze.screen import Screen
    from pygaze.eyetracker import EyeTracker
    
    from psychopy.visual import MovieStim
    
    from psychopy import visual, core, event
    from pygaze import libinput
    from pygaze import liblog
    
    # Initialise a Display instance, using the default settings.
    # (You can add a constants.py file to set these defaults.)
    disp = Display()
    
    # Get the handle to the active PsychoPy Window instance.
    win = pygaze.expdisplay
    
    # Initialise and calibrate an EyeTracker instance
    tracker = EyeTracker(disp,trackertype='eyelink')
    tracker.calibrate()
    
    # Initialise a PsychoPy MovieStim
    mov = MovieStim(win, 'SampleVideo_360x240_50mb.mp4', flipVert=False)
    
    # Add the MovieStim to a PyGaze Screen instance.
    # (The Screen object has a list of all its associated
    # PsychoPy stimulus instances; you can add custom
    # instances, like the MovieStim, and they will automatically
    # be drawn each time you fill and show the Display.)
    movscr = Screen()
    movscr.screen.append(mov)
    
    # Start recording from the eye tracker.
    tracker.start_recording()
    
    # Record the starting time, and log it to the tracker.
    t1 = libtime.get_time()
    tracker.log('START; time=%d' % (t1))
    t0 = libtime.get_time()
    
    # Run for the duration of the video
    while mov.status != visual.FINISHED:
    
        # Fill the Display with the Screen. The right
        # frame from the video will automatically
        # be selected by PsychoPy.
        disp.fill(movscr)
        # Show the Display, and record its onset
        t1 = disp.show()
        # Log the screen flip.
        tracker.log('FLIP; time=%d' % (t1))
        if event.getKeys(keyList=['escape','q']):
            break
    
    # Stop recording from the eye tracker.
    tracker.log('END; time=%d' % (t1))
    tracker.stop_recording()
    
    # Close the connection to the tracker.
    tracker.close()
    
    # Close the Display.
    disp.close
    
    if event.getKeys(keyList=['escape','q']):
        win.close()
        core.quit()
    
  • edited June 2016

    Hi Eshed,

    1 and 2) No reason to prefer one over the other. I like your implementation! The reason that mine ended too early is that I forgot to multiply the video's duration by 1000. It only just now occurred to me that PsychoPy uses seconds rather than milliseconds. (I'm not the cleverest cookie, clearly.)

    3) This is definitely one way to do it. I would prefer using a PyGaze implementation, but that's just to keep the code a bit cleaner; there is no difference in functionality (a PyGaze-flavoured sketch follows after point 4). I would, however, change the final lines of your snippet above. It should be disp.close() (note the round brackets!), and there shouldn't be an if statement (disp.close already closes the active Window; you don't have to do that).

    4) That's a really weird bug, and I'm not sure what causes it. Maybe your keyboard is set to a different language or layout? Sorry, no clue.
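
    To illustrate what I mean by a PyGaze implementation for point 3: a quick sketch (reusing the names from your script, plus libinput.Keyboard, which you already import) could look like this. The 1 ms timeout keeps the check from blocking the loop:

    # Sketch: checking for an escape/quit press with PyGaze's Keyboard
    # instead of psychopy.event.getKeys. Reuses mov, disp, movscr,
    # tracker, and visual from the script above.
    from pygaze import libinput

    kb = libinput.Keyboard(keylist=['escape', 'q'], timeout=1)

    while mov.status != visual.FINISHED:
        disp.fill(movscr)
        t1 = disp.show()
        tracker.log('FLIP; time=%d' % (t1))
        # get_key returns (None, None) if no key was pressed
        # within the timeout.
        key, presstime = kb.get_key()
        if key is not None:
            break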

    Cheers,

    Edwin

  • edited 2:37PM

    Thank you very much Edwin!

    Everything works now!

    Eshed

  • edited 2:37PM

    Hi again,

    Just wanted to let you know that my while loop doesn't work with all video formats.

    I guess it is related to MovieStim, so your loop is better for those cases.
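
    In case it helps anyone else, a combined condition (just a sketch, reusing the names from the scripts above) stops the loop as soon as either check says the movie is over:

    # Sketch: stop when either the status flag or the elapsed time
    # indicates the movie is done (duration is in seconds, times in ms).
    t0 = libtime.get_time()
    t1 = t0
    while (mov.status != visual.FINISHED) and (t1 - t0 < mov.duration * 1000):
        disp.fill(movscr)
        t1 = disp.show()
        tracker.log('FLIP; time=%d' % (t1))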

    Eshed

  • Hi Edwin,
    I hope you are doing well in Cambridge, and congratulations on your viva. I intend to run a similar experiment with MovieStim. It is my first time using PyGaze, so I hope my question is not too stupid. In the experiment, participants view short videos (1 s) in which either the emotion or the gender of a face morphs. Participants need to pay attention and, as soon as they detect a change in emotion or gender in the face, click on a bar with the mouse. Once they click, the movie stops. We record the pupil size, the RT, and the accuracy. For the emotion trials, at one end of the bar we show a happy face and at the other end we show the angry face, and we can place the neutral face in the middle. Similarly for the gender-morph trials. Any idea how to go about this?
    Your previous posts are very useful for playing the movie, but I don't know where and how to include the rest of the experiment in the code, or how to record the outputs. Any help would be greatly appreciated. Cheers,
    Zargol

  • Hi Zargol,

    Thanks! Doing rather well here; really enjoying the new position and the town! Hope you're doing well too?

    As for your question: This is totally possible, but would require some additional scripting within the above example. Specifically, it is likely you'd want to do something like the following:

    import pygaze
    from pygaze import libtime
    from pygaze.display import Display
    from pygaze.screen import Screen
    from pygaze.eyetracker import EyeTracker
    # # # # #
    # THIS IS WHERE THE CODE IS DIFFERENT FROM THE OTHER EXAMPLE!
    from pygaze.mouse import Mouse
    # DISPSIZE is used below to position the bar; it is assumed to be
    # defined in your constants.py (the display resolution in pixels).
    from constants import DISPSIZE
    # # # # #
    
    from psychopy.visual import MovieStim
    
    # Initialise a Display instance, using the default settings.
    # (You can add a constants.py file to set these defaults.)
    disp = Display()
    
    # # # # #
    # THIS IS WHERE THE CODE IS DIFFERENT FROM THE OTHER EXAMPLE!
    # Initialise a new Mouse instance.
    my_mouse = Mouse()
    # # # # #
    
    # Get the handle to the active PsychoPy Window instance.
    win = pygaze.expdisplay
    
    # Initialise and calibrate an EyeTracker instance
    tracker = EyeTracker(disp)
    tracker.calibrate()
    
    # Initialise a PsychoPy MovieStim
    mov = MovieStim(win, 'testMovie.mp4', flipVert=False)
    
    # Add the MovieStim to a PyGaze Screen instance.
    # (The Screen object has a list of all its associated
    # PsychoPy stimulus instances; you can add custom
    # instances, like the MovieStim, and they will automatically
    # be drawn each time you fill and show the Display.)
    movscr = Screen()
    movscr.screen.append(mov)
    
    # # # # #
    # THIS IS WHERE THE CODE IS DIFFERENT FROM THE OTHER EXAMPLE!
    # Define where you would like to draw the bar.
    # These values are relative to the display size, so that it works
    # across different computers.
    bar_x = int(DISPSIZE[0]*0.1)
    bar_y = int(DISPSIZE[1]*0.85)
    # Define how wide and high the bar should be.
    # These values are again relative to the display size.
    bar_w = int(DISPSIZE[0]*0.8)
    bar_h = int(DISPSIZE[1]*0.05)
    # Draw an empty bar at the bottom of the screen.
    movscr.draw_rect(x=bar_x-1, y=bar_y-1, w=bar_w+2, h=bar_h+2, pw=3, fill=False)
    # Draw three faces.
    # We're assuming that the faces are PNG files in the same folder,
    # and that they are 50 pixels wide and high.
    movscr.draw_image("happy.png", pos=(bar_x, bar_y-50))
    movscr.draw_image("neutral.png", pos=(bar_x+bar_w//2, bar_y-50))
    movscr.draw_image("angry.png", pos=(bar_x+bar_w, bar_y-50))
    # Now we'll draw the final bar, which will indicate the fill level of the
    # outer bar. In the code below we will make sure it's responsive to
    # mouse movements, but for now we'll just draw it at a central position.
    movscr.draw_rect(colour=(255,0,0), x=bar_x, y=bar_y, w=bar_w//2, h=bar_h, pw=1, fill=True)
    # In the background, a PsychoPy Rect was created as a part of the
    # PyGaze Screen 'movscr'. We need to know the index of that Rect
    # within the Screen's inner list of PsychoPy stimuli.
    bar_rect_index = len(movscr.screen) - 1
    # # # # #
    
    
    # Start recording from the eye tracker.
    tracker.start_recording()
    
    # Record the starting time, and log it to the tracker.
    t1 = libtime.get_time()
    tracker.log('START; time=%d' % (t1))
    t0 = libtime.get_time()
    
    # Run for the duration of the video
    while t1 - t0 < mov.duration * 1000:
    
        # # # # #
        # THIS IS WHERE THE CODE IS DIFFERENT FROM THE OTHER EXAMPLE!
        # Get the current mouse position as an (x,y) tuple.
        pos = my_mouse.get_pos()
        # Update the bar width according to the
        # current (horizontal) mouse position.
        if pos[0] < bar_x:
            bar_fill_w = 1
        elif pos[0] > bar_x + bar_w:
            bar_fill_w = bar_w
        else:
            bar_fill_w = pos[0] - bar_x
        # Adjust the bar's width by directly accessing the
        # PsychoPy Rect instance. (PsychoPy's Rect uses 'width', and it
        # is positioned by its centre, so you may also want to shift its
        # pos to keep the fill anchored to the bar's left edge.)
        movscr.screen[bar_rect_index].width = bar_fill_w
        # # # # #
    
        # Fill the Display with the Screen. The right
        # frame from the video will automatically
        # be selected by PsychoPy.
        disp.fill(movscr)
    
        # Show the Display, and record its onset
        t1 = disp.show()
        # Log the screen flip.
        tracker.log('FLIP; time=%d' % (t1))
    
    # Stop recording from the eye tracker.
    tracker.log('END; time=%d' % (t1))
    tracker.stop_recording()
    
    # Close the connection to the tracker.
    tracker.close()
    
    # Close the Display.
    disp.close()
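
    To stop the movie when participants click, and to get your RT measure, you could extend the loop roughly as sketched below. A caveat: I'm using the Mouse class's get_clicked method with a 1 ms timeout so it doesn't block, and I believe it returns (button, position, time) with Nones on a timeout, but please double-check that against your PyGaze version. You would still need to translate the click position on the bar into a response value and log it (e.g. with a liblog Logfile) yourself.

    # Sketch: detect a mouse click during the movie, log the reaction
    # time, and stop the movie loop. Reuses my_mouse, disp, movscr,
    # mov, tracker, t0, and t1 from the example above.
    while t1 - t0 < mov.duration * 1000:
        disp.fill(movscr)
        t1 = disp.show()
        tracker.log('FLIP; time=%d' % (t1))
        # Wait at most 1 ms for a click; assumed to return
        # (None, None, None) if no button was pressed in time.
        button, clickpos, clicktime = my_mouse.get_clicked(timeout=1)
        if button is not None:
            # Reaction time relative to movie onset, in milliseconds.
            rt = clicktime - t0
            tracker.log('CLICK; time=%d; x=%d; y=%d; rt=%d' % (clicktime, clickpos[0], clickpos[1], rt))
            break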
    