[open] Visual Presentation Times Vary Massively and Regularly.
EDIT: No need to read the disheartened rant below. The end comment reveals all.
As you may or may not know, I've been working on an experiment in which audio stimuli are synchronised almost exactly with visual stimuli.
However, the length of time for which the visual stimulus is held on the screen is very inaccurate.
I instruct OS to display one canvas containing a circle for 0 ms, which at our 100 Hz monitor refresh rate equates to a single 10 ms frame (in the past we've had to specify a duration one frame below the one we actually want). This is followed by another canvas with a circle in the same position for 40 ms, which equates to 50 ms. In total, the circle should therefore be displayed for 60 ms.
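In case it helps, here's a minimal sketch of what each trial is meant to do. This assumes an OpenSesame-style inline script with the Canvas/clock API; the object names and circle coordinates are illustrative rather than our exact code.

```python
# Minimal sketch of the intended presentation sequence (assumed
# OpenSesame-style Canvas/clock API; names and coordinates are illustrative).

circle_canvas_1 = Canvas()
circle_canvas_1.circle(0, 0, 50, fill=True)   # shown for "0 ms" = one 10 ms frame

circle_canvas_2 = Canvas()
circle_canvas_2.circle(0, 0, 50, fill=True)   # same position, requested for 40 ms

blank_canvas = Canvas()                       # removes the circle

circle_canvas_1.show()   # flips on a refresh; replaced at the next refresh (10 ms)
circle_canvas_2.show()   # flips on the following refresh
clock.sleep(40)          # 40 ms request rounds up to the next refresh, i.e. 50 ms
blank_canvas.show()      # circle removed; total circle time should be ~60 ms
```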
When running a small number of trials per block there is some variation: the majority are displayed for 60 ms as instructed, but some vary wildly, anywhere between 70 ms and 120 ms.
This gets even more bizarre when I increase the number of trials per block; in fact it has the reverse effect.
The display times plummet to 20 ms in some cases after a certain number of blocks and appear to hover around the 30 ms point. This is perceptually noticeable even without checking the output (which does confirm the plummet).
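For reference, this is roughly how we check the output per trial: we take the timestamp that show() returns at circle onset and at the blank that removes it, and compare the difference to the expected 60 ms. Again, this is only a sketch under the same assumed API.

```python
# Sketch of the per-trial duration check (same assumptions as above:
# show() is taken to return a millisecond timestamp of the actual flip).

onset = circle_canvas_1.show()    # timestamp of the first circle frame
circle_canvas_2.show()            # second circle canvas, next refresh
clock.sleep(40)
offset = blank_canvas.show()      # timestamp of the blank that removes the circle

actual = offset - onset           # expected ~60 ms on a 100 Hz monitor
print('circle displayed for %d ms' % actual)
```

It's these per-trial values that show the 70-120 ms outliers in short blocks and the drop towards 20-30 ms after many blocks.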
This really is the last hurdle before running this experiment. We've achieved so much in getting highly accurate audio/visual syncing, and we've been working on it for at least four months straight now (not including the Xmas break).
Is there any reason this may be happening? Is there a solution to it?
Thanks for your time,
A disheartened Boo.