Moving stimuli and mouse-click collection in an interception task
I have a question about the following.
I am designing an experiment in which people have to intercept a moving stimulus with a mouse click.
I have now programmed this as follows (see the sketch below the list):
- Preparing a series of positions for the stimulus, one for each frame, in the prepare phase.
- Looping over X frames in the run phase. In the loop, I
- 1) adjust the canvas for that frame by selecting the x-th position of the stimulus in the series,
- 2) do canvas.show() for that frame,
- 3) collect a mouse click with my_mouse.get_click(timeout=20).
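Roughly, my code looks like this (a minimal sketch with placeholder values; Canvas, Mouse and clock are the names provided by the OpenSesame Python workspace):

```python
# Prepare phase: one (x, y) position per frame (placeholder values)
n_frames = 120
positions = [(-240 + 4 * i, 0) for i in range(n_frames)]

# Run phase
my_canvas = Canvas()
my_mouse = Mouse(visible=True)
for x, y in positions:
    # 1) adjust the canvas for this frame
    my_canvas.clear()
    my_canvas.circle(x, y, r=10, fill=True)
    # 2) show the canvas
    my_canvas.show()
    # 3) collect a mouse click (returns after 20 ms if there is no click)
    button, pos, timestamp = my_mouse.get_click(timeout=20)
    if button is not None:
        break  # a click occurred; compare pos with (x, y) to score the interception
```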
The timing of the frames is not correct, so apparently this is not a good approach. I read that this approach only works well if screen flips are stable, but what alternative methods are there to detect interception of the target?
Does anyone have a similar experiment and would be willing to share their code with me?
Thanks in advance,
Nina
Comments
Hi Nina,
I wouldn't use sketchpads for this purpose, but code it directly in Python, which is actually quite straightforward and way more efficient. I attach an experiment that demonstrates the basic principle. Let me know if you need help; then I can try to give more pointers.
If you insist on doing this with sketchpads, you should use feedback items rather than sketchpads, as you probably want the action to happen during the run phase.
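The basic principle is something like this (just a rough sketch of the idea, not the attached experiment; the stimulus, speed and hit radius are placeholders, and Canvas, Mouse and clock are the names provided in an inline_script): compute the target position from the elapsed time, let Canvas.show() pace the loop, and poll the mouse without blocking.

```python
my_canvas = Canvas()
my_mouse = Mouse(visible=True)

x0, y, speed = -240, 0, 0.3        # start position (px) and speed (px/ms), placeholder values
t0 = clock.time()
hit = False
while True:
    # Position follows elapsed time rather than a frame counter
    x = x0 + speed * (clock.time() - t0)
    my_canvas.clear()
    my_canvas.circle(x, y, r=10, fill=True)
    my_canvas.show()                                        # blocks until the flip, so it paces the loop
    button, pos, timestamp = my_mouse.get_click(timeout=0)  # poll, returns right away
    if button is not None:
        hit = abs(pos[0] - x) < 10 and abs(pos[1] - y) < 10
        break
    if x > 240:                                             # target left the screen without a click
        break
```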
good luck,
Eduard
Thanks! I had already coded it and did not use sketchpads. I will have a look at your code, thanks a lot!
Thank you Eduard! Your code works!
One more question: I would like to make sure that each frame is shown for 16.7 ms (= 1/frame rate). In the results of your code, I see that it matters what I set as the "refresh" value. I set it to 15, 12 or 5 (as in your code): with 15, some frames take 17 or 18 ms, while with 5, frames take only 7 ms.
How can I best make sure that each frame lasts for 16.7 ms?
To add to this: these frame durations differ a lot when I select the Psycho back-end (frames then take 33 ms) instead of the Legacy back-end (frame times as reported in my previous post).
Are you sure? I measured the timing of my script above, and overall it is decent. When measuring the timing, you should make sure that your screen is in the "proper" mode. For example, if you have multiple screens attached, the timing will be noticeably worse. How have you measured the timing? If you don't see anything wrong with your screen setup, you can share your experiment. Maybe I can spot something.
See https://osdoc.cogsci.nl/3.3/manual/timing/#making-the-refresh-deadline
Legacy is not ideal when timing is critical.
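In practice, that means preparing the next frame right away, polling the mouse only until shortly before the next expected refresh, and then calling show(), which blocks until the actual flip. A sketch of the idea (the 3 ms margin and the stimulus details are assumptions, not values from the documentation):

```python
frame_dur = 1000 / 60.0                    # nominal frame duration at 60 Hz (ms)
margin = 3                                 # stop polling this long before the deadline (assumption)
positions = [(-240 + 4 * i, 0) for i in range(120)]

my_canvas = Canvas()
my_mouse = Mouse()

clicked = False
t_next = my_canvas.show() + frame_dur      # blank frame; gives the deadline for the next flip
for x, y in positions:
    my_canvas.clear()
    my_canvas.circle(x, y, r=10, fill=True)
    # Poll the mouse until just before the next refresh deadline
    while clock.time() < t_next - margin and not clicked:
        button, pos, t_click = my_mouse.get_click(timeout=1)
        clicked = button is not None
    t_next = my_canvas.show() + frame_dur  # show() blocks until the actual flip
    if clicked:
        break
```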
As for your first question: I don't know ;) I think I am...
I have tried disconnecting the extra screen and running the code in full-screen mode (instead of the quick-run or run-in-window modes). Are there other things that determine whether the screen is in the "proper" mode or not? I do intend to run the code with the Psycho back-end, but I found higher frame times there.
I measure the frame times as the differences between the timestamps returned by the canvas.show() calls.
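In other words, something like this (sketch, using the same my_canvas and positions as in the sketches above):

```python
# Frame times as differences between consecutive show() timestamps
frame_times = []
t_prev = my_canvas.show()
for x, y in positions:
    my_canvas.clear()
    my_canvas.circle(x, y, r=10, fill=True)
    t = my_canvas.show()
    frame_times.append(t - t_prev)
    t_prev = t
print(frame_times)
```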
I have added the experiment below; it includes an adaptation of your code. When I run it, I get a lot of PsychoPy warnings like psycho:107:WARNING Canvas.show() took 32.855 ms. Some of these warnings show 17 ms, for example, but the frame times that I calculate as described above sometimes differ from these warning values. I hope you can spot the mistake in the code I used!
Thanks a lot again!
Maybe that is related to your computer. On my machine I get pretty decent timing with your script.
Occasionally I get a PsychoPy warning like yours, but not even on every run. Is yours much worse? Try it on one of the machines that you will eventually run the experiment on. You won't be able to avoid all glitches, but if the issue is in the ~1 ms range, I wouldn't worry about inaccuracies. Even an occasional 20-30 ms frame won't mess things up. Participants certainly won't notice.
Thanks Eduard! I will try on different machines then.