EyeGaze contingencies

edited March 2021 in PyGaze

Hello, I am a masters student and I would like to create a gaze-contingent attention bias modification task using PyGaze and OpenSesame.

The display consists of two word stimuli (one correct and one incorrect) displayed in two rectangles.

I would like the rectangles’ outline to turn red when the participant looks at the incorrect word, and green when the participant looks at the correct word.

Also, I would like to play an ‘incorrect’ sound when the participant is not looking at any of the stimulus rectangles, and a ‘correct’ sound when the participant fixates on the correct word for a given amount of time. 

Trials should begin with a gaze-contingent fixation dot, where the participant has to fixate on the dot for a given amount of time before resuming the trial. The documentation shows how to create this fixation dot, but I do not know how to set this required fixation duration.

I am also wondering about how to set up the parameters for the drag and drop eyetracker functions.

As you probably guessed, I am an absolute beginner. I learned OpenSesame basics by watching the video tutorials on the OpenSesame website, but I could not find any eyetracking videos. The PyGaze documentation is helpful, but not nearly enough at my level…

Are there any online resources to learn how to use PyGaze and OpenSesame ‘from scratch’?

Comments

  • Hi,

    Are there any online resources to learn how to use PyGaze and OpenSesame ‘from scratch’?

    Not that I am aware of. You can try browsing online repositories like OSF for projects that used PyGaze with OpenSesame.

    I am also wondering about how to set up the parameters for the drag and drop eyetracker functions.

    Essentially, start recording before a trial/block and stop recording after a trial/block by dragging and dropping the respective items into position in the script. You can also record the entire experiment. In any case, you need to make sure that you know when things happen in your experiment, so you will probably need to send custom log messages to be able to sync experimental events with eye events.
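
    For illustration, a minimal sketch of such a custom message, sent from an inline_script right around the event you care about (this assumes a pygaze_init item is present, so that exp.pygaze_eyetracker exists; the message text itself is arbitrary):

    # Assumption: a pygaze_init item has been added, so exp.pygaze_eyetracker
    # is available in inline scripts. Send a message right before (or after)
    # the event you want to sync on; any text works, as long as you can parse
    # it back out of the eye-tracking log later.
    exp.pygaze_eyetracker.log('stimulus_onset')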

    Hello, I am a masters student and I would like to create a gaze-contingent attention bias modification task using PyGaze and OpenSesame.

    The key function you are looking for is exp.pygaze_eyetracker.sample(), which gives you the gaze coordinates every time you call it. The idea is then to write a little loop that tracks eye position during a trial, detects whether a predefined event happens, and adjusts the experimental settings accordingly.
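
    In rough outline, such a loop could look like this (just a sketch: the region check, the 50 px margin and the 5000 ms timeout are placeholder values, and I am assuming a pygaze_init item plus a started recording, so that exp.pygaze_eyetracker delivers samples):

    # Sketch of a gaze-contingent polling loop (placeholder values; assumes a
    # pygaze_init item and that recording has been started, e.g. with a
    # pygaze_start_recording item).
    timeout = 5000  # ms; give up after this long
    t0 = clock.time()
    while clock.time() - t0 < timeout:
        gaze_x, gaze_y = exp.pygaze_eyetracker.sample()
        # Example event: gaze within 50 px of the screen center
        if abs(gaze_x - var.width / 2) < 50 and abs(gaze_y - var.height / 2) < 50:
            # Adjust the experiment here, e.g. set a variable that a run-if
            # statement later in the trial sequence can use
            var.looked_at_center = 1
            break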

    Here is a link to one of my own projects where I did essentially that. It is a little convoluted (because of reasons), but if you work through the script, you might even learn something about OpenSesame/PyGaze ;)

    https://osf.io/349xz/

    Feel free to ask if you can't figure out what I have done.

    Eduard


  • Thank you M. Ort for your kind reply and impressive code :)

    I was able to use some of it to program almost the entire task (I think). I am now stuck on the last gaze contingency where the participant has to fixate on one of the stimuli to proceed to the next trial. I used the drag and drop items, and so I am wondering how to send the command to OpenSesame to clear the current display and proceed to the next trial.

    Do you have any idea how I can code this?

  • Hi @mrhmatar ,


    Could you upload your experiment here?


    Cheers,


    Lotje


  • Hi Lotje,

    Here is the latest version of it. It is not working yet...

    Best,

    Mariah

  • Hi Mariah,

    What exactly do you mean with:

    I used the drag and drop items, and so I am wondering how to send the command to OpenSesame to clear the current display and proceed to the next trial.

    Literally, how to stop a trial and change the screen to black?

    You can show an empty screen by calling the clear function on a canvas object and then showing it.

    my_canvas.clear()
    my_canvas.show()
    

    Normally, you shouldn't need this, though. Because of the block/trial loop structure, the end of one trial is the start of the next: once the last code in your trial sequence has been executed, the sequence starts again from the top for the next trial. If this does not happen, you have probably created an infinite loop in your Python code, i.e. a situation in which it is impossible for the participant to proceed in the experiment. This can happen if you have a condition in your code that says "move to the next trial if fixation is at [this] location", where [this] is a location that can never be fixated (e.g. outside the screen).

    I haven't test-run your code in detail, but I can see two problems:

    1)

    if dist(xy, var.success_pos) == 0:

    This will cause a problem because the distance is never exactly zero. There will always be some deviation, so it is best to define a small epsilon that serves as an upper bound for values that are treated as essentially zero. For example:

    if dist(xy, var.success_pos) < 50:


    The further away your stimuli are from each other, the more liberal you can be here.
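
    Spelled out, the check could look something like this (a sketch; dist() here is just a plain Euclidean distance matching your call signature, and 50 px is an arbitrary epsilon):

    import math

    def dist(p1, p2):
        # Euclidean distance between two (x, y) points
        return math.sqrt((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2)

    epsilon = 50  # px; treat anything closer than this as "on target"
    if dist(xy, var.success_pos) < epsilon:
        var.on_target = 1  # placeholder for whatever should happen next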

    2)

    while fixation_duration > 100:

    This loop is executed for as long as the fixation duration is larger than 100 ms. That is a strange construction, because within the while loop you never recompute the fixation duration. Once it has its value from before the loop, it will never change, which leads either to an infinite loop or to the loop never being entered at all (if the fixation duration happens to be less than 100 ms).
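
    A sketch of how the loop could be restructured so that the duration is recomputed on every iteration (it reuses the dist() helper discussed above; the names and the 50 px threshold are placeholders):

    # Sketch: recompute the elapsed fixation time inside the loop, so the exit
    # condition can actually change (placeholder names and thresholds).
    required_duration = 100  # ms
    fixation_start = clock.time()
    while clock.time() - fixation_start < required_duration:
        gaze_x, gaze_y = exp.pygaze_eyetracker.sample()
        if dist((gaze_x, gaze_y), var.success_pos) >= 50:
            # Gaze left the target: restart the timer
            fixation_start = clock.time()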


    There may be more problems with your code (I haven't checked it in detail), but generally I recommend that you start simple: implement a basic gaze-contingent functionality first and, once it works, extend it step by step until it does everything you need.

    I hope this helps. Good luck!

    Eduard


  • Here is my new code

    I am getting this traceback:

    Error while executing inline script

    Details

    item: fixdot

    phase: run

    item-stack: experiment[run].practice_loop[run].practice_block_seq[run].practice_block_loop[run].practice_trial_seq[run].fixdot[run]

    time: Thu Mar 18 14:09:02 2021

    exception type: TypeError

    exception message: unsupported operand type(s) for -: 'Legacy' and 'float'

    Traceback (also in debug window)

     File "C:\Program Files (x86)\OpenSesame\Lib\site-packages\libopensesame\inline_script.py", line 116, in run

      self.workspace._exec(self.crun)

     File "C:\Program Files (x86)\OpenSesame\Lib\site-packages\libopensesame\base_python_workspace.py", line 124, in _exec

      exec(bytecode, self._globals)

     Inline script, line 9, in <module>

     Inline script, line 6, in dist

    TypeError: unsupported operand type(s) for -: 'Legacy' and 'float'

    What does this mean?

  • Hello,

    Update: This issue is resolved, but I have a new one...

    I want to play an incorrect sound when the participant is not looking in one of the rectangles in my stimuli sketchpad. I created an inline script and a sampler for this, but if I place them before the sketchpad in the trial sequence, the sound doesn't play at all, and if I place them after it, the sound always plays, but not before the sketchpad starts fading.

    There might be an error in the code, but to find out what it is, I need to make sure that the code in my inline script only runs while the right sketchpad is displayed... What are ways to do this?

  • Hi @mrhmatar ,


    Sorry for the delay and glad to hear that you already resolved part of the issue.


    Indeed, OpenSesame runs all items in a sequence sequentially, so you will probably need to play the sound file from an inline script (for example in a Python loop where you are evaluating eye position). You can do this, for example, by:


    • Prepare a sampler item (for example called "my_sampler") in the trial sequence
    • Set its run-if statement to "never"
    • Run the item from the inline_script whenever certain criteria are (or are not) met, using the following code (see also the sketch below):
    items.execute("my_sampler")
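
    For example, inside the loop where you evaluate eye position, the call could be made conditional like this (a rough sketch; the rectangle bounds and the name "my_sampler" are placeholders):

    # Rough sketch: play the sound only while the gaze is outside both
    # rectangles (placeholder bounds, in the same coordinates as sample()).
    gaze_x, gaze_y = exp.pygaze_eyetracker.sample()
    in_left = 200 < gaze_x < 500 and 400 < gaze_y < 600
    in_right = 700 < gaze_x < 1000 and 400 < gaze_y < 600
    if not (in_left or in_right):
        items.execute("my_sampler")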
    


    If this doesn't help, could you upload the most recent version of your experiment here?


    Cheers,


    Lotje


  • Hello Lotje,

    No luck...

    Also, something strange is happening where I cannot see my mouse moving once the experiment gets past the fixation dot. This does not happen when I run another experiment on my very same computer.

    In all cases, here is the latest version of my code


    Thank you very much for your help,

    Mariah

  • Hi @mrhmatar ,


    Also, something strange is happening where I cannot see my mouse moving once the experiment gets past the fixation dot. This does not happen when I run another experiment on my very same computer.

    If you add a pygaze_start_recording (and pygaze_stop_recording) item to your trial sequence, the cursor should appear (when running in advanced dummy mode).

    Unless I'm missing something (@sebastiaan or @eduard ?), I think the only way to apply gaze contingencies in the way you described (checking whether participants look within a 40 px area around the fixation dot for at least 400 ms) is to use a while loop in your Python inline_script item. Here is an example in which a beep is played when participants look away from the fixation dot. See the comments for an explanation.


    """
    DESCRIPTION:
    Experiment advances to the next item in the trial sequence if pp fixated
    within a 40 px area around the fixation dot for > 400 ms.
    
    To prevent the experiment from hanging, an overall timeout occurs if the
    fixation check takes more than 5 seconds.
    
    TODO mrhmatar:
    Decide if in the latter case, you want to skip the trial or to still
    continue
    """
    
    # Import modules:
    import math
    
    # For debugging purposes only:
    my_canvas = Canvas()
    
    # Define functions:
    # NOTE that I renamed this function because the math module also
    # contains a function called dist(), which led to confusion
    def getDist(x1,y1,x2,y2):
    
    	"""
    	Computes Cartesian distance between 2 points
    	"""
     
    	return math.sqrt((x2 - x1)**2 + (y2 - y1)**2)
    
    
    # Define variables:
    # Threshold variables:
    var.threshold = 40 # area around fix dot, in px
    var.timeout_ms = 5000 # ms 
    var.required_fixation_duration = 400 # ms
    
    
    # Coordinates of the fixation dot:
    var.x_fix = var.width/2
    var.y_fix = var.height/2
    
    # Set default variable:
    var.begin_trial = 0
    
    # Time stamp before starting the check:
    start_fixation_timing = clock.time() # used for fix-duration check:
    overall_starting_time = clock.time() # used for overall timeout:
    
    # Boolean that is used to keep running (or breaking from) 
    # the loop
    keep_running = True
    
    # Start the while loop:
    while keep_running:
        
        # Get overall duration:
        timestamp = clock.time()
        overall_time_passed = timestamp - overall_starting_time
        
        # If the check took longer than the timeout (5 s):
        if overall_time_passed > var.timeout_ms:
            # Time out (break from the loop)
            print("Fixation check timed out")
            keep_running = False
            break
        
        # Get eye position:
        eye_x, eye_y = exp.pygaze_eyetracker.sample()
        # Calculate the distance:
        distance = getDist(eye_x,eye_y,var.x_fix, var.y_fix)
        
        # If pp fixates on fixdot:
        if distance < var.threshold:
            
            while True:
                
                # Get overall duration:
                timestamp = clock.time()
                overall_time_passed = timestamp - overall_starting_time
                
                # If the check took longer than the timeout (5 s):
                if overall_time_passed > var.timeout_ms:
                    print("Fixation check timed out")
                    # Time out (break from loop)
                    keep_running = False
                    break
    
                
                # Keep checking fix pos:
                eye_x, eye_y = exp.pygaze_eyetracker.sample()
                distance = getDist(eye_x,eye_y,var.x_fix, var.y_fix)
                
                # If still fixating, start timing:
                if distance < var.threshold:
                    
                    # Reset the timer for the overall timeout
                    # (to make sure the variable begin_trial is not set
                    # to 0 if overall time out is passed during correct
                    # fixation)
                    overall_starting_time = clock.time() # used for overall timeout:
                    
                    # For debugging only:
                    my_canvas.fixdot(color = "green")
                    my_canvas.show()
                    
                    # Check fixation duration:
                    current_time = clock.time()
                    time_passed = current_time - start_fixation_timing
    
                    # If fixation duration is longer than required threshold:
                    if time_passed > var.required_fixation_duration:
                        
                        # Start trial set to 1:
                        var.begin_trial = 1
                        # Break from the while loops
                        keep_running = False
                        break
                # If pp lost fixation:
                else:
                    
                    # For debugging:
                    my_canvas.fixdot(color = "red")
                    my_canvas.show()
                    
                    # Reset timer:
                    start_fixation_timing = clock.time()
                    
                    # Play a warning beep:
                    items.execute("off_stim")
    
    
    print(var.begin_trial)
    


    I implemented this fixation check in a simplified version of your experiment.


    Hope this helps!


    Cheers,


    Lotje



  • Hello Lotje,

    Thank you for your code, I adapted it and it helped me a lot!

    But my error sound is still not quite playing at the right time... I want it to play when the participant is not looking at one of the word stimuli, but it seems to play every time I make a big saccade with the mouse, regardless of the end position.


    So sorry that this is dragging on so much... it's my first time.

  • UPDATE: this is solved!

    It seems like the inline_script cannot read coordinates from the sketchpad when they are written as (0, 160), etc.

  • Hi,

    Good to hear that it works! Just for reference, though, could you clarify what you mean by your last post?

    Eduard

    PS: sorry for having been absent this past week


  • Hello,

    I am having trouble accessing the coordinates of items on my sketchpad canvas in my code, even though I named them and followed the tutorials.

    For example, in the gaze_cont inline script, I want to give feedback to my participants according to whether they are looking at the target (color the rectangle containing the target green) or the opposite stimulus (color the rectangle containing the neutral stimulus red). But when I write, for example:

    my_canvas = items['practice_stimuli'].canvas

    var.x_targ = my_canvas['targ'].width/2
    var.y_targ = my_canvas['targ'].height/2
    var.x_neut = my_canvas['neut'].width/2
    var.y_neut = my_canvas['neut'].height/2

    eye_x, eye_y = exp.pygaze_eyetracker.sample()
    dist_target = getDist(eye_x, eye_y, var.x_fix, var.y_targ)
    dist_neutral = getDist(eye_x, eye_y, var.x_fix, var.y_neut)

    if dist_target < var.threshold and (eye_x, eye_y) in my_canvas['rect_u']:
        feed_tug = copy_sketchpad('target_ug')
        feed_tug.show()
        break

    The canvas attributes (the my_canvas[...] lines above) do not seem to be working. When I replace them with:

    var.x_fix = var.width/2

    var.y_fix_1 = var.height/4

    var.y_fix_2 = 3*var.y_fix_1

    the desired sketchpads are displayed, but since the target position changes in my experiment, it's not what I want.

    Sorry for the loooong reply. Here is the updated version if you'd like to see for yourself:


  • Hi,

    Sorry, but I had a somewhat hard time reading your code and seeing what the problem was, so I took the liberty of rewriting it my way, based on what I assumed you wanted to accomplish. The script is attached; here are some notes:

    • You should pay attention to the coordinate systems: PyGaze places the origin (0,0) in the top-left corner of the display, whereas OpenSesame places it at the center of the screen, so you need to correct for this shift (see the sketch below this list).
    • I am not sure, but I think your sound files don't work. I tried one I had on my laptop and it did produce a sound, whereas yours didn't. Maybe check that?
    • It is harder (and unnecessary) to combine items with scripting if you can do everything in scripting, or everything with items. Specifically, it makes more sense to define and play the sounds directly in the script than to create a sampler item, put it somewhere in the sequence, and then call it from a completely different place.
    • Using the distance to a target as a measure of fixation only makes sense if the target is circular, or if you are willing to accept some inaccuracy around the edges of a rectangle (or whatever shape you are using). What you can do instead is check whether the fixation lies within the borders of that item. I wrote a function (in the first inline_script) that does exactly that.
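
    To illustrate the first and last points, here is a minimal sketch (the in_region() helper, the placeholder bounds and var.on_target are my own illustration; the actual function in the attached script may differ):

    # Minimal sketch of the coordinate shift and the in-region check
    # (placeholder bounds; see the attached script for the real version).
    def in_region(x, y, left, top, w, h):
        # True if (x, y) falls inside a rectangle given by its top-left
        # corner, width and height (all in the same coordinate system)
        return left <= x <= left + w and top <= y <= top + h

    gaze_x, gaze_y = exp.pygaze_eyetracker.sample()
    # Shift the PyGaze sample (corner-based origin) to OpenSesame's
    # center-based coordinates before comparing with sketchpad coordinates
    gaze_x -= var.width / 2
    gaze_y -= var.height / 2

    if in_region(gaze_x, gaze_y, -300, -100, 200, 200):  # placeholder bounds
        var.on_target = 1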

    I hope this is useful to you and that I didn't misinterpret your goals too much.

    Good luck,

    Eduard



  • edited March 2021

    Hello,

    Again, the script was very helpful. We're getting somewhere!

    I understand your confusion, it's a complicated task... Here is a step-by-step explanation of what I would like it to look like:

    1. Gaze-contingent fixation dot: start the trial only after the participant has fixated on the fixation dot for about 400 ms
    2. Display empty rectangles, followed by rectangles containing the stimuli
    3. Gaze contingencies (sketched in rough outline below):
    • play an incorrect sound if the participant is not looking at one of the stimuli
    • color the rectangles containing the stimuli while the participant is looking at them: green for the target, red for the other stimulus (the color should remain as long as the participant keeps fixating the stimulus, but return to black as soon as they look away)
    • play a correct sound and move to the next trial once the participant has fixated on the target for a required duration of 2 seconds.
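
    Roughly, what I am trying to express in code for step 3 is something like the sketch below (placeholder coordinates, item names and helper function; the sketchpads/samplers called via items.execute() would have their run-if statements set to "never"):

    # Rough outline of step 3 (placeholders throughout; in practice an overall
    # timeout would also be needed, as in Lotje's example above).
    def in_box(x, y, left, top, w, h):
        return left <= x <= left + w and top <= y <= top + h

    required_ms = 2000      # fixate the target this long to end the trial
    fixation_start = None   # None means "not currently on the target"
    while True:
        gaze_x, gaze_y = exp.pygaze_eyetracker.sample()
        # Shift from PyGaze's corner-based origin to center-based coordinates
        gaze_x -= var.width / 2
        gaze_y -= var.height / 2
        on_target = in_box(gaze_x, gaze_y, -400, -100, 300, 200)   # placeholder
        on_neutral = in_box(gaze_x, gaze_y, 100, -100, 300, 200)   # placeholder
        if on_target:
            if fixation_start is None:
                fixation_start = clock.time()
            items.execute('green_frame')        # placeholder sketchpad
            if clock.time() - fixation_start >= required_ms:
                items.execute('correct_sound')  # placeholder sampler
                break                           # proceed to the next trial
        elif on_neutral:
            fixation_start = None
            items.execute('red_frame')          # placeholder sketchpad
        else:
            fixation_start = None
            items.execute('black_frame')        # placeholder sketchpad
            items.execute('incorrect_sound')    # placeholder sampler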

    My latest code is very close to what I would like to achieve, except that it is bugging like crazy.


    Thanks again for your help :)

  • Thanks for clarifying!

    play incorrect sound if participant is not looking at one of the stimuli

    I'm not sure about this part, though. The two boxes essentially cover the whole screen, and it is hard to look anywhere that is not a stimulus without passing through the stimuli, especially because the fixation dot is placed outside the two boxes, so it would already trigger the negative feedback. The only (easy) solution I can see is to treat the entire central area, including the two stimuli and the area in between, as one region of interest, and to play the negative feedback whenever the participant looks outside that box.

    Would that work?


  • Sorry this is the latest code

    Starting to get lost with all these versions...

  • Hello eduard,

    Thank you for replying on a Saturday,

    I did not expect that and actually just saw your response!

    You are right about the boxes being really big. What I actually did in my code is play the sound (which works on my PC, by the way) whenever the participant is not looking at the point at the center of the rectangle, and I increased my threshold to 100 px (which corresponds to roughly half the height of a rectangle). This gives a circular AOI with a diameter equal to the height of a rectangle, which seems to be working.

    The code I just sent seems to do what I want, except that the words are not always coloring when they should, or returning to black when they should. It seems like the eye position is not always detected quickly enough. The sounds are playing correctly.

  • Hi,

    Yeah, your code is overly complicated, which I tried to improve in my last post, but as I can see, you prefer yours ;)

    Well, I don't give up, so I simplified it again. The colors seem to behave the way you intend. As for the sound, I am not sure, but since you say it works correctly, I didn't change its behavior.

    Hope this solves it then.

    Eduard



  • Thanks A MILLION!!! This is perfect.

    Thank you so much for your help.

    Sincerely,

    Mariah

  • Hello,

    Thanks again for your help in the past few days.

    I am trying to log the number of times the participant fixates on either the target or the opposite stimulus in my experiment.

    For this, I created two new variables, var.targ_fix and var.oppstim_fix, which I increment in exactly the same way right after each respective if condition is fulfilled.

    I kept everything else pretty much the same as it was before.

    var.targ_fix is behaving normally (i.e. it increments the first time the gaze is detected on the target and stops if the gaze stays there), but var.oppstim_fix keeps incrementing while the gaze is on the opposite stimulus, even if the gaze does not move. It looks like it is effectively counting fixation time rather than fixations. My attempts at fixing the value with a while loop have failed...
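
    What I intend is, roughly, to only increment a counter when the gaze enters a stimulus, not on every sample while it stays there, along these lines (a sketch with placeholder bounds and durations):

    # Sketch: count fixation onsets rather than samples, by incrementing only
    # when the gaze *enters* a region (placeholder bounds and trial duration).
    def in_box(x, y, left, top, w, h):
        return left <= x <= left + w and top <= y <= top + h

    var.targ_fix = 0
    var.oppstim_fix = 0
    was_on_target = False
    was_on_opposite = False
    t0 = clock.time()
    while clock.time() - t0 < 5000:             # placeholder trial duration
        gaze_x, gaze_y = exp.pygaze_eyetracker.sample()
        gaze_x -= var.width / 2                 # corner- to center-origin shift
        gaze_y -= var.height / 2
        on_target = in_box(gaze_x, gaze_y, -400, -100, 300, 200)    # placeholder
        on_opposite = in_box(gaze_x, gaze_y, 100, -100, 300, 200)   # placeholder
        if on_target and not was_on_target:     # gaze just entered the target
            var.targ_fix += 1
        if on_opposite and not was_on_opposite: # gaze just entered the other box
            var.oppstim_fix += 1
        was_on_target = on_target
        was_on_opposite = on_opposite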

    Best regards,

  • Hi @mrhmatar ,


    Could you upload the most recent version of your experiment and indicate where counting the number of fixations goes wrong?


    I do want to point out that I believe it is customary to do analyses such as these offline, on the eye-tracking output data, so that you can, for example, apply corrections (for blinks, drift, etc.) and keep the coding in the run phase in OpenSesame to the necessary minimum (for example, for the gaze-contingent parts).


    Hope this helps a little bit.


    Cheers,


    Lotje.

