Touch response for an IAT
I am trying to create an IAT about gender attitudes. I am not at all familiar with programming in Python, and I was wondering if you could offer some tips.
Using the IAT template for Windows, available here: https://osf.io/v8yaj/, I tried to create an Android-based IAT experiment with images, but I am struggling with the response option. The experiment works with a keyboard response, but ideally I would like to use a touch response, as we will be conducting this experiment with tablets in the field. I played around with the options, but so far I have failed to incorporate a touch response instead of a keyboard response (I think the inline script would require some additional lines?). Could you kindly give me some hints on what to add to the inline scripts?
Comments
Hi Sarah,
Looking at the experiment (IAT Template - with pool - english.osexp), all the responses are handled with keyboard_response items. That's quite convenient, because it means that you can simply replace those with touch_response items.

Next, you'd need to change how the correct response is determined, which seems to be done in the set_correct_and_color item. If, in the touch_response, you define a grid with two columns and one row, the left side of the screen will be coded as 1 (instead of the e key), and the right side as 2 (instead of the i key).

Does that get you started?
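To make the mapping concrete, here is a minimal, standalone Python sketch of the key-to-touch translation described above. It is not the actual template code; the function name and the idea of converting the template's e/i correct responses into touch grid cells are assumptions about how you might adapt the set_correct_and_color inline script, which the template may structure differently:

```python
# Hypothetical helper: translate the template's keyboard correct
# responses ('e' = left, 'i' = right) into the cell codes that a
# touch_response item with a 2-column, 1-row grid produces
# ('1' = left half of the screen, '2' = right half).
KEY_TO_TOUCH = {'e': '1', 'i': '2'}

def to_touch_response(correct_key):
    """Return the touch_response cell code matching a keyboard key."""
    return KEY_TO_TOUCH[correct_key]

# In an inline script you would then set the experimental variable,
# e.g. var.correct_response = to_touch_response(var.correct_response)
print(to_touch_response('e'))  # left side of a 2x1 grid
```

The idea is that wherever the template's inline script assigns 'e' or 'i' as the correct response, you assign '1' or '2' instead (or pass the value through a mapping like this one).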
Cheers!
Sebastiaan
Check out SigmundAI.eu for our OpenSesame AI assistant!
Hi Sebastiaan,
I was able to make the corrections and all seems to work well. I will be taking my experiment to the field next month. Thank you very much for your help!
Sarah
Good luck!