[open] Swiping left/right on an Android tablet

edited May 2015 in OpenSesame

Hello,

I was just wondering if someone could help. A while back, I started this thread:

http://forum.cogsci.nl/index.php?p=/discussion/comment/4290#Comment_4290

It was incredibly helpful with what I wanted to do (basically, present an image to a participant on an android tablet and allow the participant to swipe the image either towards them [to the bottom of the screen] or away from them [to the top of the screen], and record the swipe speed [movtime], reaction time, response [whether it was swiped to the top or the bottom of the screen] and the experiment time). This worked wonderfully and I'm incredibly grateful for all the help I received.

Now, I simply would like to change the swipe direction from up and down/towards and away from the participant, to left and right (like Tinder, if you are familiar with this!), and to be able to record the same variables mentioned above.

I was therefore wondering which bit of code I would need to manipulate in order to achieve this?

Thanks very much in advance,

Charlotte

Comments

  • edited 6:51AM

    Hi Charlotte,

    That depends on what your current code looks like, of course! I'm expecting that you have at least two fixed pairs of coordinates in your code (top and bottom), and that you keep track of the finger position to determine whether, and what kind of, swipe is being made?

    The easiest approach, then, would be to change the two pairs of coordinates from something that represents top and bottom to something that represents left and right. But again, it depends on how your previous experiment was designed; I can't imagine it would be more complex than changing something that keeps track of a y-coordinate into something that keeps track of an x-coordinate. Since you're the expert on your own code, you probably know which bit of code I'm talking about :)
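
    For example, a minimal sketch of that change (assuming, as in your earlier thread, that you poll the finger position with mouse.get_pos() and compare it against two boundary values; the variable names here are only placeholders):

    # hypothetical left/right boundaries instead of top/bottom ones
    left = 200                       # x-coordinate of the left boundary
    right = exp.get("width") - 200   # x-coordinate of the right boundary
    while True:
        pos, time = my_mouse.get_pos()
        # compare the x-coordinate (pos[0]) rather than the y-coordinate
        if pos[0] < left:
            response = "left"
            break
        if pos[0] > right:
            response = "right"
            break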

    Good luck!

    Josh

  • edited May 2015

    Hi Josh,

    Thanks for your advice, it's worked perfectly!

    My other question is: I would like to customise the canvas I'm using (i.e. have it display a certain graphic/colour instead of plain white). Is there any way for me to do this?

    My code looks like this:

    from openexp.mouse import mouse
    from openexp.canvas import canvas
    my_mouse = mouse(exp)
    my_canvas = canvas(exp)
    path = exp.get_file(u'Child_M_1.png')
    # x-coordinates of the left and right response boundaries
    left = 200
    right = exp.get("width") - 200
    # show the image and wait for the participant to touch the screen
    my_canvas.image(path)
    trialstart = my_canvas.show()
    button, position, movstart = my_mouse.get_click(timeout=None)
    exp.set("position_Child_M_1", position)
    # follow the finger and redraw the image at its position until it crosses a boundary
    while True:
        pos, time = my_mouse.get_pos()
        my_canvas.clear()
        my_canvas.image(path, x=pos[0], y=pos[1])
        timestamp = my_canvas.show()
        if pos[0] < left:
            response = "left"
            break
        if pos[0] > right:
            response = "right"
            break
    # calculate the movement time
    movtime = timestamp - movstart
    # calculate the RT
    resptime = movstart - trialstart
    exp.set("response_Child_M_1", response)
    exp.set("movetime_Child_M_1", movtime)
    exp.set("response_time_Child_M_1", resptime)

    Thanks!

  • edited 6:51AM

    Hi,

    If it's only a colour, you can use canvas.set_bgcolor(color) (see the documentation) within your script. It may even be an option for you to set it in the general experiment settings: click on the very first item in the overview area and choose the background colour you prefer there.
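
    For example, a minimal sketch (reusing the canvas from your script; the colour value is just an example):

    my_canvas = canvas(exp)
    my_canvas.set_bgcolor("blue")   # or a hex value such as "#0000FF"
    my_canvas.clear()               # clear() fills the canvas with the new background colour
    my_canvas.show()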

    If you want an image as the background, you can use the same trick you already use for presenting the stimuli. The only difference is that you draw the background image first, stretched over the entire screen. When you then draw your stimuli, they are simply put on top of what has been drawn earlier.
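
    A minimal sketch of that approach (the background file name is just a placeholder, and it assumes the image already matches your screen resolution, so no scaling is needed):

    my_canvas = canvas(exp)
    background = exp.get_file(u'background.png')   # hypothetical full-screen image in the file pool
    stimulus = exp.get_file(u'Child_M_1.png')
    my_canvas.clear()
    my_canvas.image(background)   # drawn first, centred, so it ends up behind everything else
    my_canvas.image(stimulus)     # drawn afterwards, so it appears on top of the background
    my_canvas.show()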

    Is this what you meant?

    Eduard
