Jitter
Hey Everyone,
I'm running an experiment and need to implement a jitter between trials. I need the jitter to range from 1100ms to 4600ms and be random. The instructions for the advanced_delay plugin are confusing to me... I think I put the average of the two numbers in the Duration box, which would be 2850ms. I've also put 870ms in the Jitter box, with the jitter mode set to standard deviation. Can anyone provide any guidance on this?
Thanks!
Comments
Hi,
If you want to sample uniformly between 1100 and 4600, then you would indeed set the duration to the mean of both, which is 2850, and the jitter to 1750 (i.e. half the range: the distance from the mean to either the minimum or the maximum). Does that clear things up?
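As a minimal Python sketch (this illustrates how the numbers relate, not the plugin's internals), the Duration box corresponds to the midpoint and the uniform Jitter to half the range, so a uniform draw of mean ± jitter covers exactly the requested interval:

```python
import random

DUR_MIN = 1100   # ms
DUR_MAX = 4600   # ms

mean = (DUR_MIN + DUR_MAX) / 2      # 2850 -> Duration box
jitter = (DUR_MAX - DUR_MIN) / 2    # 1750 -> Jitter box (uniform mode)

# A uniform draw of mean +/- jitter stays within [DUR_MIN, DUR_MAX]
duration = random.uniform(mean - jitter, mean + jitter)
assert DUR_MIN <= duration <= DUR_MAX
```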
Cheers!
Sebastiaan
Check out SigmundAI.eu for our OpenSesame AI assistant!
Yes! Although I don’t want it to be a uniform jitter... I am looking to have it vary between trials but never be greater than the maximum or less than the minimum above. Does that change things?
If you don't want the jitter to be uniform, then I assume that you want to sample the duration from a Gaussian distribution, but with a minimum and a maximum value as well. To do this, I would use a simple inline_script like the one below. For optimal timing, you could put the part that does the random sampling in the Prepare phase, and only the sleep part in the Run phase. (It's good practice to do that, although it makes little practical difference in this case.)

Prepare phase:

import random

RANDOM_MEAN = 2850
RANDOM_STD = 500
RANDOM_MIN = 1100
RANDOM_MAX = 4600
# Resample until the value falls inside the allowed range
while True:
    var.duration = random.gauss(mu=RANDOM_MEAN, sigma=RANDOM_STD)
    if RANDOM_MIN < var.duration < RANDOM_MAX:
        break

Run phase:

clock.sleep(var.duration)
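Outside OpenSesame you can check the same rejection-sampling idea in plain Python. Here a stand-in function replaces the OpenSesame `var` and `clock` objects (the function name and defaults are assumptions for illustration, not part of the plugin):

```python
import random

def truncated_gauss(mean=2850, std=500, lo=1100, hi=4600):
    """Draw from a Gaussian, rejecting values outside (lo, hi)."""
    while True:
        value = random.gauss(mu=mean, sigma=std)
        if lo < value < hi:
            return value

# Every draw respects the bounds, and the sample mean stays near 2850
samples = [truncated_gauss() for _ in range(1000)]
assert all(1100 < s < 4600 for s in samples)
```

Because the bounds sit roughly 3.5 standard deviations from the mean, very few draws are rejected, so the loop is cheap in practice.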