[open] Libet's rotating spot clock - Intentional Binding Paradigm

edited June 2012 in OpenSesame

Hi. I'm trying to make a rotating spot clock like the one used in Libet's volition experiment, in order to measure the intentional binding effect. I got lots of help on this forum by reading other entries on the subject, for example here:
http://forum.cogsci.nl/index.php?p=/discussion/38/solved-intentional-binding-paradigm-creating-animations/p1

I was able to write an inline script that displays the clock with a rotating red ball completing a revolution every 2560 ms, and then marks the time at which a key ("z") is pressed. 1500 ms after the keypress, a synth sound plays and its time is recorded. Then the clock stops after a random interval. I'm still missing the method to collect the perceived time of the two events.
Although my code does the job, I think it's rather rough. I would appreciate it if someone could look into my code and maybe help me find a more elegant solution. Thanks!

from openexp.canvas import canvas
from openexp.keyboard import keyboard
from openexp.synth import synth
from math import radians, atan2, sin, cos, degrees  # Trigonometry to build rotating spot
import random, time

#initialize synth, sine wave
my_synth = synth(self.experiment, osc = "sine", freq = "b1", attack = 1, length = 200)

# initialize canvas and path of the clock image
my_canvas = canvas(self.experiment)
my_canvas.show()
path = self.experiment.get_file("clock2.png")

#initialize keyboard
my_keyboard = keyboard(self.experiment, keylist = ["z"],timeout= 0)

clock_radius = 75 #radius of the clock spot.
speed = 140.625 # speed to make 1 turn of the clock in 2560 ms
start_t = time.time() - random.randint(0, 2560) # needed to make the starting position of the clock random
half_screen = 250.5  # had to do this to draw the circle relative to the center of the screen; I don't know if canvas can handle center-relative coordinates

press_time = 100000000000 # set these variables high enough to avoid entering the conditions below
finish_time = 100000000000
delay = random.randint(2, 5)

while True: #start the clock
    t = time.time() - start_t
    dy = clock_radius * sin(radians(t * speed))
    dx = clock_radius * cos(radians(t * speed))
    my_canvas.clear()
    my_canvas.circle(dx + half_screen,dy + half_screen,5, fill=True); my_canvas.image(path)
    my_canvas.show()
    mark = atan2(dy,dx)*10 + 15   # fix the result of the atan2 function to give clock marks
    if mark < 0:        
        mark = (60 + mark)
    key, f_time = my_keyboard.get_key()
    if key==122: #mark the time of the key press
        key_mark=mark
        press_time=time.time() 
    if time.time() - press_time > 1.5: #set a 1.5s interval before synth play
        my_synth.play()
        sound_mark=mark
        press_time = 100000000000000
        finish_time=time.time()
    if time.time() - finish_time > delay: #set a random (delay) interval before breaking
        print key_mark, sound_mark
        break

Comments

  • edited 7:30PM

    Hi Caiman,

    Your code looks pretty good (and works), but I take it you're not satisfied with it? I don't understand the following:

    I'm still missing the method to collect the perceived time of the two events.

    What exactly do you want to achieve that the script isn't doing right now?

    Furthermore, two minor tips:

    • The timing of sound playback is in general far less precise than that of display presentation, which is something to keep in mind (I'm not sure if it matters for you).
    • You can get the center of the display with canvas.xcenter() and canvas.ycenter(), see http://osdoc.cogsci.nl/python-inline-code/canvas-functions. Something like the sketch below would do it.
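
    For example, something along these lines (an untested sketch, reusing the variables from your script) would replace the hard-coded half_screen:

    xc = my_canvas.xcenter()
    yc = my_canvas.ycenter()

    while True:
        t = time.time() - start_t
        dx = clock_radius * cos(radians(t * speed))
        dy = clock_radius * sin(radians(t * speed))
        my_canvas.clear()
        # Draw the spot relative to the canvas center rather than a fixed offset
        my_canvas.circle(xc + dx, yc + dy, 5, fill=True)
        my_canvas.image(path)
        my_canvas.show()
        # ... rest of the loop unchanged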

    Cheers,
    Sebastiaan

  • edited 7:30PM

    Thanks Sebastiaan.
    The intentional binding effect is a "temporal compression" of the perceived times of an action and its consequence. When a person is asked to retrospectively report the timing of (1) an intentional movement and (2) its consequence, in this case a sound, the movement is perceived as later in time and the sound as earlier than their "real" timings. What's exciting about it is that when there is no intention to move (as in passive movements), the effect is not seen, or you can even see the inverse effect (the perceived interval between the two events expands).

    I'm building a script that asks the user for the perceived timing of the two events. I'm using the mouse module and I think I can get it to work. The user is asked to mark the position on the clock where they perceived the urge to move, and then where they perceived the sound.

    I would like to restrict the mouse movement to a certain area, maybe just the clock perimeter, for better accuracy. Is this possible?

    Sound timing is important, but if the delay is more or less the same on each trial it shouldn't be a problem. Do you know of another method for a more precise sound stimulus? Maybe using the system sound, as it doesn't require sound card processing?

    Here's the updated code.

    
    from openexp.canvas import canvas
    from openexp.keyboard import keyboard
    from openexp.mouse import mouse
    from openexp.synth import synth
    from math import radians, atan2, sin, cos, degrees  # Trigonometry to build rotating spot
    import random, time
    
    # initialize mouse
    my_mouse = mouse(self.experiment)
    
    #initialize synth, sine wave
    my_synth = synth(self.experiment, osc = "sine", freq = "b1", attack = 1, length = 200)
    
    # initialize canvas and path of the clock image
    my_canvas = canvas(self.experiment)
    my_canvas.show()
    path = self.experiment.get_file("clock2.png")
    
    #initialize keyboard
    my_keyboard = keyboard(self.experiment, keylist = ["z"],timeout= 0)
    
    clock_radius = 75 #radius of the clock spot.
    speed = 140.625 # speed to make 1 turn of the clock in 2560 ms
    start_t = time.time() - random.randint(0, 2560) # needed to make the starting position of the clock random
    half_screen = 250.5  # had to do this to draw the circle relative to the center of the screen; I don't know if canvas can handle center-relative coordinates
    
    press_time = 100000000000 # set these variables high enough to avoid entering the conditions below
    finish_time = 100000000000
    delay = random.randint(2, 5)
    
    user_key = 0
    sound_key = 0
    
    while True: #start the clock
        t = time.time() - start_t
        dy = clock_radius * sin(radians(t * speed))
        dx = clock_radius * cos(radians(t * speed))
        my_canvas.clear()
        my_canvas.circle(dx + half_screen,dy + half_screen,5, fill=True); my_canvas.image(path)
        my_canvas.show()
        mark = atan2(dy,dx)*10 + 15   # fix the result of the atan2 function to give clock marks
        if mark < 0:        
            mark = (60 + mark)
        key, f_time = my_keyboard.get_key()
        if key==122: #mark the time of the key press
            key_mark=mark
            press_time=time.time() 
        if time.time() - press_time > 1.5: #set a 1.5s interval before synth play
            my_synth.play()
            sound_mark=mark
            press_time = 100000000000000
            finish_time=time.time()
        if time.time() - finish_time > delay: #set a random (delay) interval before breaking
            break
    
    while True:
        button, position, timestamp = my_mouse.get_click(timeout = 20)
        pos, time = my_mouse.get_pos()
        my_canvas.clear()
        my_canvas.text("time of pressing", center=True, x=None, y=10, color=None)
        my_canvas.circle(pos[0],pos[1],5, fill=True); my_canvas.image(path)
        my_canvas.show()
        mark_post = atan2(pos[1] - half_screen,pos[0] - half_screen)*10 + 15
        if mark_post < 0 : mark_post = (60 + mark_post)
        if button == 1:
            user_key = mark_post
            break
    while True:
        button, position, timestamp = my_mouse.get_click(timeout = 20)
        pos, time = my_mouse.get_pos()
        my_canvas.clear()
        my_canvas.text("time of sound", center=True, x=None, y=10, color=None)
        my_canvas.circle(pos[0],pos[1],5, fill=True); my_canvas.image(path)
        my_canvas.show()
        mark_post = atan2(pos[1] - half_screen,pos[0] - half_screen)*10 + 15
        if mark_post < 0 : mark_post = (60 + mark_post)
        if button == 1:
            sound_key = mark_post
            break
    print key_mark - user_key, sound_mark - sound_key
    
  • edited March 2013

    Regarding a better (restricted) way to get the response: I was thinking you could draw a straight line from the center in the direction of the cursor. So basically the participant controls the arm of a clock with the mouse. What do you think? Something like this would do the trick:

    from openexp.mouse import mouse
    from openexp.canvas import canvas
    from math import atan2, cos, sin
    
    # Uncomment for 0.25 and before
    # exp = self.experiment
    
    # Create a canvas and get center coordinates
    my_canvas = canvas(exp)
    xc = my_canvas.xcenter()
    yc = my_canvas.ycenter()
    
    # Create a mouse
    my_mouse = mouse(exp, timeout=10)
    
    # The radius of the line
    r = 300
    
    # Loop until a button is pressed
    while True: 
    
        # Get a button press and exit on press
        button, position, timestamp = my_mouse.get_click()
        if button != None:
            break       
    
        # Get the position of the cursor
        pos, time = my_mouse.get_pos()
    
        # Draw not the cursor directly, but a line from the display
        # center in the direction of the cursor
        dx = pos[0] - xc
        dy = pos[1] - yc
        a = atan2(dy, dx)   
        x = cos(a) * r + xc
        y = sin(a) * r + yc
    
        # Draw the canvas
        my_canvas.clear()
        my_canvas.line(xc, yc, x, y)
        my_canvas.show()
    
    

    Regarding the sound: there is a fixed delay and a random jitter. How big the jitter will be is hard to say (you'd need to test on a specific system), but it might be on the order of 20 ms. You can reduce this (somewhat) by reducing the sound buffer size (see the back-end options in the General tab) to, say, 256, but you might get crackling in the sound.

    Using the system beep, or a device attached to the serial/parallel port, should give you essentially perfect timing. I don't have a system beep myself (it's kind of old-fashioned), so I cannot be of much assistance here. But maybe this will get you started: http://docs.python.org/library/winsound.html
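
    For example, a minimal sketch with the standard-library winsound module (Windows only; whether its latency is really better is something you'd have to test):

    import winsound
    winsound.Beep(1000, 200)  # frequency in Hz, duration in ms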

    Good luck!

  • edited 7:30PM

    Thanks again Sebastiaan.
    I've finally made the script work with your help. I had to fix the atan2 call with degrees(), because it was giving me the result in radians, but once that was fixed it all ran well.
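
    Roughly, the conversion becomes something like this (a sketch; the exact constants are illustrative):

    from math import atan2, degrees

    # One revolution = 60 marks, so each mark spans 6 degrees; the +15 offset
    # puts mark 15 at the 3 o'clock position, as in the original conversion.
    mark = degrees(atan2(dy, dx)) / 6 + 15
    if mark < 0:
        mark = 60 + mark
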
    Now I'm trying to organize my experiment to run in several blocks measuring different things: one block in which the user only estimates the time of the key press, with no sound; another block with an estimate of only the sound, without the key press; and so on.
    My question now is whether I can put an inline_script at the beginning of the experiment to define the variables that are the same for all the blocks (like clock_radius, speed, etc.) and then call them in the specific blocks.
    Thanks for your help!

  • edited 7:30PM

    Sure, the following command in an inline_script will set a variable that is accessible throughout the experiment:

    # Uncomment for 0.25 and before
    #exp = self.experiment
    exp.set('my_variable', 'my_value')

    Alternatively you can edit the general script (General tab → Show script editor) and add something like the following to the top of the script:

    set my_variable my_value

    Both methods will do the same thing.
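
    In a later inline_script you can then read the value back, for example:

    # Retrieve the experimental variable in another inline_script
    my_value = exp.get('my_variable')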

    Cheers!

  • edited 7:30PM

    Hi Caiman. It seems we have the same research interests! :)

    Actually, I developed quite different code for creating the intentional binding paradigm. I use the PsychoPy back-end, since it seems to be more precise. Moreover, I prefer to create the various clock hand positions in advance, and then show them in sequence, one per frame. In this way, the rotation of the clock and the millisecond precision seem to be improved (but this is just my opinion); for example, the rotation appears more fluid. With the PsychoPy back-end the canvas module doesn't work well, so I had to use the 'visual' module.

    In my code, the subject judges the time by typing the number of seconds. I use this method only because it is the one used by Haggard and colleagues in their intentional binding studies. I also developed code that allows the subject to manually position the clock hand at the desired location, but I don't use it for now. It would be interesting to verify whether the two judgement methods lead to different results (I guess not)...

    I attach my code, apologizing for its length (Sebastiaan, I hope this isn't a problem; I have to use more than one post). It covers only one condition, in which the subject presses a key, hears a tone and judges the time. I hope it may be useful for you, Caiman, and for anyone who wants to study intentional binding. I would appreciate any suggestions about my code, if someone has the opportunity to test it.

    Best,

    Andrea

  • edited March 2013

    from psychopy import visual, core, event, logging, sound
    from math import radians, atan2, sin, cos, floor
    from pyglet.window import key
    import random
    
    global raggio, center_point, cerchio, point5, point10, point15, point20
    global point25, point30, point35, point40, point45, point50, point55, point60
    global five, ten, fifteen, twenty, twentyfive, thirty, thirtyfive, forty
    global fortyfive, fifty, fiftyfive, sixty
    
    ### SET INITIAL VARIABLES AND CONSTANTS ###
    monitor_freq = 60.0  # set the monitor refresh rate
    ms = float((1/monitor_freq)*1000.0)  # compute the refresh interval in milliseconds
    beep = sound.Sound(value = 1000, secs = 0.100)
    raggio = 110  # Radius of the clock face
    arm_len = 80  # Length of the clock hand
    duration = 2.5  # Duration of one revolution in seconds
    start_point = random.randint(0, 11)  # Choose one of the 12 positions of the clock
    
    ### CREATING CLOCK ###
    center_point = visual.Circle(self.experiment.window, radius=3)
    center_point.setFillColor("black")
    cerchio = visual.Circle(self.experiment.window, radius=raggio)
    cerchio.setLineColor("black")
    
    ### 5 ###
    point5 = visual.Line(self.experiment.window, start = [raggio * cos(radians(60)), raggio * sin(radians(60))], end = [(raggio+10) * cos(radians(60)), (raggio+10) * sin(radians(60))])
    point5.setLineColor("black")
    five = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(60)), (raggio + 30) * sin(radians(60))), text='5', height=30, color = 'black', font='Times')
    
    ### 10 ###
    point10 = visual.Line(self.experiment.window, start = [raggio * cos(radians(30)), raggio * sin(radians(30))], end = [(raggio+10) * cos(radians(30)), (raggio+10) * sin(radians(30))])
    point10.setLineColor("black")
    ten = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(30)), (raggio + 30) * sin(radians(30))), text='10', height=30, color = 'black', font='Times')
    
    ### 15 ###
    point15 = visual.Line(self.experiment.window, start = [raggio * cos(radians(360)), raggio * sin(radians(360))], end = [(raggio+10) * cos(radians(360)), (raggio+10) * sin(radians(360))])
    point15.setLineColor("black")
    fifteen = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(360)), (raggio + 30) * sin(radians(360))), text='15', height=30, color = 'black', font='Times')
    
    ### 20 ###
    point20 = visual.Line(self.experiment.window, start = [raggio * cos(radians(330)), raggio * sin(radians(330))], end = [(raggio+10) * cos(radians(330)), (raggio+10) * sin(radians(330))])
    point20.setLineColor("black")
    twenty = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(330)), (raggio + 30) * sin(radians(330))), text='20', height=30, color = 'black', font='Times')
    
    ### 25 ###
    point25 = visual.Line(self.experiment.window, start = [raggio * cos(radians(300)), raggio * sin(radians(300))], end = [(raggio+10) * cos(radians(300)), (raggio+10) * sin(radians(300))])
    point25.setLineColor("black")
    twentyfive = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(300)), (raggio + 30) * sin(radians(300))), text='25', height=30, color = 'black', font='Times')
    
    ### 30 ###
    point30 = visual.Line(self.experiment.window, start = [raggio * cos(radians(270)), raggio * sin(radians(270))], end = [(raggio+10) * cos(radians(270)), (raggio+10) * sin(radians(270))])
    point30.setLineColor("black")
    thirty = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(270)), (raggio + 30) * sin(radians(270))), text='30', height=30, color = 'black', font='Times')
    
    ### 35 ###
    point35 = visual.Line(self.experiment.window, start = [raggio * cos(radians(240)), raggio * sin(radians(240))], end = [(raggio+10) * cos(radians(240)), (raggio+10) * sin(radians(240))])
    point35.setLineColor("black")
    thirtyfive = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(240)), (raggio + 30) * sin(radians(240))), text='35', height=30, color = 'black', font='Times')
    
    ### 40 ###
    point40 = visual.Line(self.experiment.window, start = [raggio * cos(radians(210)), raggio * sin(radians(210))], end = [(raggio+10) * cos(radians(210)), (raggio+10) * sin(radians(210))])
    point40.setLineColor("black")
    forty = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(210)), (raggio + 30) * sin(radians(210))), text='40', height=30, color = 'black', font='Times')
    
    ### 45 ###
    point45 = visual.Line(self.experiment.window, start = [raggio * cos(radians(180)), raggio * sin(radians(180))], end = [(raggio+10) * cos(radians(180)), (raggio+10) * sin(radians(180))])
    point45.setLineColor("black")
    fortyfive = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(180)), (raggio + 30) * sin(radians(180))), text='45', height=30, color = 'black', font='Times')
    
    ### 50 ###
    point50 = visual.Line(self.experiment.window, start = [raggio * cos(radians(150)), raggio * sin(radians(150))], end = [(raggio+10) * cos(radians(150)), (raggio+10) * sin(radians(150))])
    point50.setLineColor("black")
    fifty = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(150)), (raggio + 30) * sin(radians(150))), text='50', height=30, color = 'black', font='Times')
    
    ### 55 ###
    point55 = visual.Line(self.experiment.window, start = [raggio * cos(radians(120)), raggio * sin(radians(120))], end = [(raggio+10) * cos(radians(120)), (raggio+10) * sin(radians(120))])
    point55.setLineColor("black")
    fiftyfive = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(120)), (raggio + 30) * sin(radians(120))), text='55', height=30, color = 'black', font='Times')
    
    ### 60 ###
    point60 = visual.Line(self.experiment.window, start = [raggio * cos(radians(90)), raggio * sin(radians(90))], end = [(raggio+10) * cos(radians(90)), (raggio+10) * sin(radians(90))])
    point60.setLineColor("black")
    sixty = visual.TextStim(self.experiment.window, pos=((raggio + 30) * cos(radians(90)), (raggio + 30) * sin(radians(90))), text='60', height=30, color = 'black', font='Times')
    
    ### DRAWING CLOCK FUNCTION ###
    def clock_face():
        center_point.draw()
        cerchio.draw()
        point5.draw(); five.draw()
        point10.draw(); ten.draw()
        point15.draw(); fifteen.draw()
        point20.draw(); twenty.draw()
        point25.draw(); twentyfive.draw()
        point30.draw(); thirty.draw()
        point35.draw(); thirtyfive.draw()
        point40.draw(); forty.draw()
        point45.draw(); fortyfive.draw()
        point50.draw(); fifty.draw()
        point55.draw(); fiftyfive.draw()
        point60.draw(); sixty.draw()
    
    ### CREATE HAND POSITIONS FOR ROTATION ###
    hand_list = []
    hands_number = int(round((duration*1000.0)/ms))
    f = 0
    for time in range (hands_number):
        dx = int(arm_len * cos(radians((start_point * 30) - f)))
        dy = int(arm_len * sin(radians((start_point * 30) - f)))
        hand = visual.Line(self.experiment.window, start = [0, 0], end = [dx, dy])
        hand.setLineColor("black")
        hand.setFillColor("black")
        hand_list.append(hand)
        f = f + (360.0/hands_number)
  • edited March 2013
                                ############################
                                ###   START EXPERIMENT   ###
                                ############################
    
    
                        ### INITIAL BLANK SCREEN FOR 500 MS ###
    
    start = core.Clock()                    
    start.reset()
    while start.getTime() <= 0.5:
        self.experiment.window.flip()
    
                        ### INITIAL STATIC POSITION OF THE CLOCK ###
    
    prep = visual.TextStim(self.experiment.window, text='Prepararsi...', pos = (10, -260), font = 'Times', height = 28, color='black')  # 'Prepararsi...' = Italian for 'Get ready...'
    dx = arm_len * cos(radians(start_point * 30))
    dy = arm_len * sin(radians(start_point * 30))
    hand = visual.Line(self.experiment.window, start = [0, 0], end = [dx, dy])
    hand.setLineColor("black")
    
    start.reset()
    while start.getTime() <= 1.5:
        clock_face()
        prep.draw()
        hand.draw()
        self.experiment.window.flip()
    
    
                                ### START ROTATION ###
    
    ran = int(random.uniform(1.500, 2.500))
    time = core.Clock()
    time.reset()
    repeat = True
    event.clearEvents()
    while repeat == True:
        frame = 0
        for i in range (len(hand_list)):
            space = event.getKeys(keyList = ['space'], timeStamped = time)
            if len(space) > 0:
                repeat = False
                break
            clock_face()
            hand_list[frame].draw() 
            self.experiment.window.flip()
            frame = frame + 1
    
    # space pressed
    rt = space[0][1]
    while True:
        if frame == (len(hand_list)-1):
            frame = 0
        clock_face()
        hand_list[frame].draw() 
        self.experiment.window.flip()
        frame = frame + 1
        # beep after 250ms
        if time.getTime() >= rt + 0.245 or 250 <= abs(duration - rt) + time.getTime() <= 0.255:
            beep.play()
            start_beep = time.getTime()
            # continue to rotate for a random period...
            while time.getTime() <= start_beep + ran:
                clock_face()
                hand_list[frame].draw() 
                self.experiment.window.flip()
                if frame == (len(hand_list)-1):
                    frame = -1
                frame = frame + 1
            # ...then stop
            break
    
    giri = int(rt/duration)
    press_point = rt - (duration*giri)
    
    
                                ### JUDGEMENT PHASE ###
    
    
    numb = [""]
    
    response = key.KeyStateHandler()
    self.experiment.window.winHandle.push_handlers(response)
    
    testo = visual.TextStim(self.experiment.window, color='black', height = 25, font= 'Times', pos=(0,-250), text='Inserire tempo stimato ')  # 'Inserire tempo stimato' = Italian for 'Enter the estimated time'
    valore1 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(-7,-300), text='')
    valore2 = visual.TextStim(self.experiment.window, height = 25, color='black', font= 'Times', pos=(7,-300), text='')
    rett = visual.Rect(self.experiment.window, height=30, width = 50, fillColor='#5b5881', pos = (0,-302))
    
    while not response[key.RETURN]:
        press = event.getKeys(keyList=['1','2','3','4','5','6','7','8','9','0'])
        clock_face()
        hand_list[frame].draw() 
        testo.draw()
        rett.draw()
        valore1.draw()
        valore2.draw()
        self.experiment.window.flip()
    
        # deleting numbers
        if response[key.BACKSPACE] == True and len(numb) > 1:
            if len(numb) == 3:
                numb.pop()
                valore1 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(-7,-300), text='%s' % numb[len(numb)-2])
                valore2 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(7,-300), text='')
                valore1.draw()
                valore2.draw()
            if len(numb) == 2:
                numb.pop()
                valore1 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(-7,-300), text='')
                valore2 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(7,-300), text='')
                valore1.draw()
                valore2.draw()
    
        # inserting numbers
        if len(press)>0:
            if len(numb) == 1:
                numb.append(press[0])
                valore1 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(0,-300), text='%s' % numb[len(numb)-1])
                valore1.draw()
            elif len(numb) == 2:
                numb.append(press[0])
                valore1 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(-7,-300), text='%s' % numb[len(numb)-2])
                valore2 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(7,-300), text='%s' % numb[len(numb)-1])
                valore1.draw()
                valore2.draw()
        if len(numb) > 1:
            judgement = int("".join(str(x) for x in numb))  # transform the numbers list in integer
        else:
            judgement = 0
        # discard values out of allowed range
        if judgement > 60: 
            valore1 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(-7,-300), text='')
            valore2 = visual.TextStim(self.experiment.window, height = 25, color='black', font='Times', pos=(7,-300), text='')
            valore1.draw()
            valore2.draw()
            numb=[""]
    
    event.clearEvents()
    
    
                    ### CONVERTING JUDGEMENT IN MILLISECONDS ###
    
    # Start point must return a value between 5 and 60, so...
    if start_point == 0:
        start_point = 15
    if start_point == 1:
        start_point = 10
    if start_point == 2:
        start_point = 5
    if start_point == 3:
        start_point = 60
    if start_point == 4:
        start_point = 55
    if start_point == 5:
        start_point = 50
    if start_point == 6:
        start_point = 45
    if start_point == 7:
        start_point = 40
    if start_point == 8:
        start_point = 35
    if start_point == 9:
        start_point = 30
    if start_point == 10:
        start_point = 25
    if start_point == 11:
        start_point = 20
    if judgement == "":
        judgement = "00"
    
    # Transform judged clock position in milliseconds
    if judgement >= start_point:
        converted_judgement = 0.0417 * (judgement - start_point)
    else:
        converted_judgement = 0.0417 * (judgement + 60 - start_point)
    
    # print final values on screen (only for test phase)
    print("Start point: ", start_point)
    print("Recorded key pressed: ", press_point)
    print("Beep: ", start_beep, "Interval: ", start_beep - rt)
    print("Judgement: ", judgement, converted_judgement)
    print("Error: ", converted_judgement - press_point)
    
    
  • zakzak
    edited 7:30PM

    Hi Andrea,

    First of all, thanks for sharing your code. I am trying to set up a Libet paradigm as well; really just a traditional Libet experiment, but with an added electrophysiological intervention. Your code works well -- would you mind if I modify it and use it for my experiment?

    If so, would you mind sharing the code to manually position the clock hand to the desired position when reporting intention time?

    Thanks again! You saved me a lot of hard work :)

  • edited March 2013

    Hi zak. Here's the code you requested.

    For various reasons, I don't use this method. I think one problem is that manually positioning the clock hand requires the subject to wait until the hand reaches the chosen position. I don't know if this may introduce a bias. On the other hand, if you increase the clock hand rotation speed, the judgement becomes less precise and the clock hand "slips out". So, for now I prefer to ask the subject to report the seconds position. Anyway, here is the code; let me know if it works well.

    Best,

    Andrea

                         ### Creating a second list of clock hand positions ###
    
    hand_list2 = []
    hands_number2 = int(hands_number * 360.0/hands_number) # you can manipulate this parameter to increase or decrease the clock hand speed
    f = 0
    for time in range (hands_number2):
        dx = int(arm_len * cos(radians((start_point * 30) - f)))
        dy = int(arm_len * sin(radians((start_point * 30) - f)))
        hand = visual.Line(self.experiment.window, start = [0, 0], end = [dx, dy])
        hand.setLineColor("black")
        hand.setFillColor("black")
        hand_list2.append(hand)
        f = f + (360.0/hands_number2)
    
                                      ### START JUDGEMENT PHASE ####
    
    response = key.KeyStateHandler()
    self.experiment.window.winHandle.push_handlers(response)
    
    frame = int(frame * (360.0/hands_number)) # update the frame value, so it can be used in hand_list2
    while not response[key.RETURN]: # stationary clock hand
        clock_face()
        hand_list2[frame].draw()
        estimate.draw()
        self.experiment.window.flip()
        while response[key.RIGHT]: # clock hand moves clockwise while pressing the right arrow key
            if frame == (len(hand_list2)-1):
                frame = -1
            frame = frame + 1
            hand_list2[frame].draw()
            clock_face()
            estimate.draw()
            self.experiment.window.flip()
        while response[key.LEFT]: # clock hand moves counterclockwise while pressing the left arrow key
            if frame == 0:
                frame = len(hand_list2)
            frame = frame - 1
            hand_list2[frame].draw()
            clock_face()
            estimate.draw()
            self.experiment.window.flip()
    else: # when ENTER is pressed, record the position of the clock hand and finish
        judgement = (float(frame) * (duration/hands_number2))
    
  • zakzak
    edited December 2013

    Thanks Andrea! I ended up coding something that uses the mouse for the judgement phase, based on what Sebastiaan wrote but with PsychoPy mouse commands. It allows the subject to report the clock position by moving the mouse, but the hand is constrained to valid clock positions. Then it converts the x,y position of the hand at the moment of the click into a clock position (this was slightly tricky). This method may be somewhat more accurate because the subject can report non-integer clock positions.

    It also meant I could avoid buying a $5000 EEG-compatible fiber optic keypad! (though the fiber optic mouse is still $2000 :S)

    Feel free to use it if you like, and thanks again for sharing.

                                ### JUDGEMENT PHASE ###
    
    # need degrees for this!
    from math import degrees
    
    # create a mouse and mouse variables
    mouse = event.Mouse(visible = True, newPos = [0,0], win = self.experiment.window)
    mouse.clickReset()
    buttons = mouse.getPressed()
    pos = mouse.getPos()
    
    while True:
        # get mouse position and clicks
        pos = mouse.getPos()
        buttons = mouse.getPressed()
    
        # if there is a click, then record the hand position to the judgement coordinates
        if buttons != [0,0,0]:
            x_judge = dx
            y_judge = dy
            break   
    
        # draw the hand and clock
        a = atan2(pos[1],pos[0])
        dx = int(arm_len * cos(a))
        dy = int(arm_len * sin(a))
        hand = visual.Circle(self.experiment.window, radius = 7)
        hand.setPos([dx,dy])
        hand.setFillColor("red")
        hand.setLineColor("red")
        hand.draw()
        clock_face_img.draw()
        self.experiment.window.flip()
    
    
    # convert x,y to clock position. needs different conversions by quadrant
    
    # Quadrant 1
    if x_judge > 0 and y_judge >= 0:
        judge_degs = degrees(atan2(y_judge,x_judge))
        judgement = (-judge_degs/6)+15
    
    # Quadrant 2
    if x_judge <= 0 and y_judge >= 0:
        judge_degs = degrees(atan2(y_judge,x_judge))
        judgement = (-judge_degs/6)+75
    
    # Quadrant 3 & 4
    if x_judge < 0 and y_judge < 0 or x_judge >= 0 and y_judge < 0:
        judge_degs = degrees(atan2(y_judge,x_judge)) + 360
        judgement = (-judge_degs/6)+75
    
    

    Some other changes I made that you might have noticed: I use a red dot instead of a white line for the hand. I also draw the clock face to an image buffer before drawing it. There is some loss of image sharpness, but this was necessary because I added 1 s tick marks, which made it run very slowly. I did it like this:


    # draw the clock face on the back buffer and record it to a buffer
    clock_face_list = [center_point, circle1, point1, point2, point3, point4, point5, five,
        point6, point7, point8, point9, point10, ten, point11, point12, point13, point14,
        point15, fifteen, point16, point17, point18, point19, point20, twenty, point21,
        point22, point23, point24, point25, twentyfive, point26, point27, point28, point29,
        point30, thirty, point31, point32, point33, point34, point35, thirtyfive, point36,
        point37, point38, point39, point40, forty, point41, point42, point43, point44,
        point45, fortyfive, point46, point47, point48, point49, point50, fifty, point51,
        point52, point53, point54, point55, fiftyfive, point56, point57, point58, point59,
        point60, sixty]
    clock_face_img = visual.BufferImageStim(self.experiment.window, stim = clock_face_list)
  • edited March 2013

    Hi Zak. These days I'm trying to improve the code I use. I now also use BufferImageStim to draw the clock face, which allows a faster presentation of the stimuli. This is the code I use to create the clock face. I don't draw the 1 s tick marks, but you can add them inside the loop. Then, when I have to draw the face, I call clock_face.draw() twice in a row, because this makes the clock face sharper.
    Best,

    Andrea

    # Creating clock face and save screenshot in a buffer
    tics = 12
    for time in range (2):
        visual.Circle(self.experiment.window, radius=3, fillColor='black').draw()
        visual.Circle(self.experiment.window, radius=circle_radius, lineWidth=2, lineColor='black').draw()
        num = [15, 10, 5, 60, 55, 50, 45, 40, 35, 30, 25, 20] # clock marks
        i = 0
        for angleDeg in range(0, 360, int(360/tics)):
            angleRad = radians(angleDeg)
            begin = [circle_radius*sin(angleRad), circle_radius*cos(angleRad)]
            end = [begin[0]*1.1, begin[1]*1.1]
            posTxt = [(circle_radius+int(360/tics))*cos(angleRad), (circle_radius+int(360/tics))*sin(angleRad)]
            visual.Line(self.experiment.window, start=(begin[0],begin[1]), end=(end[0],end[1]), lineColor='black', lineWidth=2).draw()
            visual.TextStim(self.experiment.window, pos=posTxt, text='%s' %num[i], height=30, color = 'black', font='Times').draw()
            i = i+1
    
    clock_face = visual.BufferImageStim(self.experiment.window)
    self.experiment.window.clearBuffer()
    
  • zakzak
    edited 7:30PM

    Hi Andrea,

    I'm not sure if you are still running this experiment, but I noticed a bug while using the code. When converting the start point from a value (0-11) to a clock face value (5-60), the code says:

    if start_point == 0: start_point = 15
    if start_point == 1: start_point = 10
    # etc.

    The problem with this is that it says if start_point == 1: start_point = 10 and then later if start_point == 10: start_point = 25. This means that every time the start point has just been converted to 10, it is then recorded as 25 instead. The same happens when it has been converted to 5: it ends up as 50. As a result, the converted_judgement calculations are incorrect or ambiguous on 1/3 of trials (any time the start point was 5, 10, 25, or 50).

    Just thought you should know, in case you had not found this bug yourself. You can fix it by simply moving the checks for start_point == 5 and start_point == 10 to the top of the list.
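
    For example, a simple lookup avoids the ordering problem entirely (just a sketch, using the same 0-11 to clock-mark mapping as the original code):

    # Each index 0-11 maps directly to its clock mark, so no value can be
    # converted twice by a later check.
    marks = [15, 10, 5, 60, 55, 50, 45, 40, 35, 30, 25, 20]
    start_point = marks[start_point]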

  • edited March 2013

    Hi zak. Thank you for reporting this bug. I had not noticed this problematic behaviour, because after posting my experiment here I modified it to make it more reliable. One of the changes concerns exactly the selection of the start point of the clock hand and the subsequent conversion of that value into a clock mark number: I now use a Python dictionary with the various start points of the clock hand as keys and the corresponding converted clock marks as values.

    num =   {
            0:15, 30:10, 60:5, 90:60, 120:55, 150:50, 180:45, 210:40, 240:35, 
            270:30, 300:25, 330:20
            }
    

    So the start point is randomly chosen each time:

    start_point = random.choice(num.keys()) 
    

    and used for the initial clock hand position. Then it is converted at the end of the code, before the final data are computed:

    start_point = num[start_point]
    

    This appears more elegant.

    Since they work well, I think I will share my final intentional binding experiments here in the coming days. The only problem is that the instructions are in Italian...

    Specifically, I have developed both the classic version (from Haggard et al., 2002) and the probability version (from Moore and Haggard, 2008), in which the probability of occurrence of the tone is manipulated across two blocks. I'm using the latter experiment for my current research, since it distinguishes the predictive and retrospective components of the feeling of agency.

    Cheers,

    Andrea

  • edited 7:30PM

    Hi,

    I am interested in using the intentional binding task. Would one of you by any chance be willing to share the code of the task with me, or refer me to someone who might be able to help? Unfortunately my programming skills are poor, so this would assist me greatly.

    Thanks a lot!
    Ela :)

  • edited December 2013

    Hi ElaOren. Yes, of course, I can share the code with you once I get a bit of time. A question: are you interested in the original version of the intentional binding paradigm (Haggard et al. 2002) or in the modified version in which the probability of occurrence of the tone changes (Moore & Haggard, 2008)?

  • edited 7:30PM

    Hi Andrea, I'm very interested in the Libet clock paradigm and I would like to use it for some research. Unfortunately I'm new to OpenSesame scripting. Can you help me?
    Many thanks.
    Fabio

  • edited 7:30PM

    Hi Fabio (are you Italian, like me?). You can contact me at andreaepifani[at]gmail.com. I'll be glad to help you.

  • edited 7:30PM

    I'm also interested in finding an implementation of the Libet paradigm, and this is the only one I've found that I've managed to get to work. Unfortunately my programming skills are non-existent, so I'm finding this all extremely taxing.

    Hopefully embodiment doesn't mind if I send an email asking for the latest code, and if anyone else has used OpenSesame for a Libet-style experiment I'd be really interested to hear about it!

  • Hi everybody,
    I'm trying to replicate Libet's experiment using an OpenBCI EEG. I tried embodiment's program using PsychoPy, but it gives me an error:

    My knowledge of programming is very limited; can anyone help?

    Thank you in advance.

  • Hi all,

    I am hoping to do some simple (Haggard-style) Intentional Binding experiments with my final year dissertation students next academic year and I was wondering if any of you would be willing to share your OpenSesame script with me so I can edit it for my own ends?

    Many thanks in advance,

    Cai

  • Hi Cai,

    Maybe you can email Fabian (email above). In case you haven't already done so.

    Good luck,

    Eduard

  • Dear all,

    I am planning to use Libet's clock task in my next research on intentional binding. I was able to reach Andrea, who very kindly sent me his original script.

    However, I have some questions:

    1. I am running the experiment with an old version of OpenSesame (i.e., OpenSesame 3.2.8 Kafkaesque Koffka), and the task works. However, when a student of mine tried to use the most recent version of OpenSesame, the task did not work. Has anyone recently used the script with an updated version of the software?

    2. During the task, I can’t pause the experiment. When I try to do it, the experiment goes into a loop (the clock continues to move and it is unstoppable); this might be a problem considering that I plan to use it with patients.

    Thank you,

    Federica 

  • Hi Federica,

    It is probably best if you gave more information on the current implementation of the task. You can share the experiment, or the code.

    What do you mean by "pausing the experiment"? Participants pressing a key to halt, and then another one to continue? Or building the experiment in a block/trial sequence?

    Eduard
