
Touch response coordinates

Hello everyone!

First of all, I'm loving the experiment builder! I'm an absolute newbie still, but the idea of running experiments on a tablet opens up so many possibilities, and I'm definitely excited to try!

One question I have is whether it is possible to get the coordinates of a touch response in pixel space (or close to it). As I understand it, OpenSesame allows you to divide the screen into rows and columns so that each resulting region has its own discrete identifier. However, for an experiment I am trying to build, I need the location of the response to be as continuous as possible, so that I can then calculate the angle between the touch location and another point on the screen. I know that detecting locations on a touchscreen is to some extent a matter of approximation, but I was wondering whether OpenSesame would allow me to do this within the accuracy limits of the hardware itself.
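
For reference, the kind of calculation I have in mind is roughly this (just a sketch; the function and point names are placeholders, nothing OpenSesame-specific):

import math

def angle_between(touch_x, touch_y, ref_x, ref_y):
    # Angle (in degrees) of the vector from the reference point to the touch
    # location, in the standard atan2 convention (note that screen y
    # coordinates typically increase downward)
    return math.degrees(math.atan2(touch_y - ref_y, touch_x - ref_x))

# For example, a touch at (100, 100) relative to the screen centre (0, 0)
print(angle_between(100, 100, 0, 0))  # 45.0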

I hope this makes sense. I probably shouldn't try to explain myself after midnight.

Comments

  • sebastiaan Posts: 2,737

    Hi Fabio,

    That's what the mouse_response item does! When working with a touch screen, a touch is (for most purposes) equivalent to a mouse click.

    Cheers,
    Sebastiaan

    There's much bigger issues in the world, I know. But I first have to take care of the world I know.
    cogsci.nl/smathot

  • Fabio Posts: 12

    Thank you, that makes sense. I guess I took "mouse response" quite literally :)

  • Fabio Posts: 12

    I'm sorry to resurrect this thread, but I've only just been able to get back to OpenSesame and I'm now finding it rather tricky to build my experiment.

    The sequence involves a visual stimulus being presented on a sketchpad item while an audio recording plays through a sampler item. That much I can do. However, I then need participants to make a touchscreen response on a new sketchpad, and I need an arrow element to be drawn on it with the apex of the arrow pointing in the direction of the location participants touched on the screen. Think of it as a compass.

    That's where I'm stuck, as I'm not sure how to reference the screen coordinates of the touch response. Any help welcome!

  • eduard Posts: 875

    Hi Fabio,

    A touch response essentially behaves like a mouse response. However, I think it will be rather tricky to get the coordinates of a click using only the plugin. It is probably better to use some inline scripting; see here.

    The second return value of mouse.get_click() gives you the coordinates of that click. You can use these to draw the arrow in the direction you want. Will you also need help with that, or is that reasonably clear?
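
    For instance, something along these lines in an inline_script (just a minimal sketch; the timeout value and the touch_x / touch_y variable names are only examples):

    my_canvas = canvas()
    my_canvas.fixdot()
    my_canvas.show()
    my_mouse = mouse(timeout=5000)
    # position is an (x, y) tuple, or None if the timeout was reached
    button, position, timestamp = my_mouse.get_click()
    if position is not None:
        var.touch_x, var.touch_y = position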

    At any rate, let me know if you need more help.

    Eduard

  • Fabio Posts: 12

    Hi Eduard,

    Yes, I ended up using an inline script to do this, and it now works just as I need it to. In case anyone else needs to do something similar, this is my code.

    This part sets up the canvas that participants will use to perform their response. I actually create two canvases, one with an arrow element and one without. That way, when participants make their response, the canvas and arrow will be re-drawn, giving the impression that the arrow has moved. There may be a more elegant way of achieving that effect, but it works for my purposes.

    import math  # needed further down for the angle calculation (math.atan2)

    starting_canvas = canvas()
    my_mouse = mouse()

    while True:
            # Canvas with the instructions and the circle, but without the arrow
            starting_canvas.clear()
            starting_canvas.text("INSTRUCTIONS", center=True, x=0.0, y=372.0, color='white')
            starting_canvas.circle(0, 0, 300, color='white')
            # Keep a copy without the arrow, so the display can be re-drawn later
            default_canvas = canvas()
            default_canvas.copy(starting_canvas)
            # Initial arrow, pointing straight up from the centre of the circle
            starting_canvas.arrow(0, 0, 0, -300, body_width=0.5, head_width=30, fill=True)
    

    Here I record the time the canvas first appears, and collect both the time the response is made and the coordinates of the response. These are saved as a tuple, but I also store the X and Y as separate variables, the values of which I will use later to re-draw the arrow. NOTE: The my_mouse.show_cursor line was meant to help during testing of the script on Windows (the actual experiment will run on Android), but it never actually worked. I believe this is a known issue with some backends.

            # Show the canvas and record its onset time
            t0 = starting_canvas.show()
            # Meant to show the cursor while testing on Windows (see note above);
            # the actual experiment runs on Android
            my_mouse.show_cursor(show=True)
            # Wait for a touch/click; position is an (x, y) tuple
            button, position, t1 = my_mouse.get_click()
            var.pos_tuple = position
            var.xpos = position[0]
            var.ypos = position[1]
            # Distance from the centre of the circle to the touch location
            var.arrow_length = xy_distance(0, 0, var.xpos, var.ypos)
    

    Here, the touch response triggers the clearing of the canvas, which is replaced by the original one without the arrow. Then a new arrow is drawn on it with the X and Y of participants' responses as the position of the arrow head.

            # Once a response has been made, re-draw the display without the old
            # arrow, then add a new arrow pointing at the touch location
            if position is not None:
                starting_canvas.clear()
                default_canvas.show()
                default_canvas.arrow(0, 0, var.xpos, var.ypos, body_width=0.5, head_width=30, fill=True)
                default_canvas.show()
    

    This bit computes the response time, and the angle of the response (i.e. the angle between the 0 degree vector of the circle and the arrow vector).

                # Response time: click timestamp minus canvas onset
                var.RT = t1 - t0
                # Angle of the touch, measured clockwise from the top of the
                # circle (0 degrees) and mapped onto the 0-360 range
                var.rad_angle = math.atan2(0 - var.xpos, 0 - var.ypos)
                var.deg_angle = math.degrees(var.rad_angle) * -1
                if 0 - var.xpos > 0:
                    var.deg_angle = 360 - math.degrees(var.rad_angle)
                print('angle is (%f)' % var.deg_angle)
                # var.correct (the target angle) is defined elsewhere in the experiment
                var.error = var.deg_angle - var.correct
                print('error is (%f)' % var.error)
                clock.sleep(1000)
                break
    

    I hope this saves someone a lot of trial and error! I'm sure there are more elegant and efficient ways of writing this, so feedback is welcome.

    Thanked by: eduard
  • eduard Posts: 875

    Great! Thanks for sharing.
