Delay the instruction

Hi. I want to know if it is possible to delay the instruction. I use expyriment.stimuli.TextScreen to present the stimuli. I intend to present a dual instruction (visual and auditory), but I want to delay the two parts relative to each other (delay the visual by 2 ms but not the auditory, and vice versa). Thanks, and congratulations on Expyriment's 10th anniversary, guys!
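
Roughly, this is what I do at the moment (only a rough sketch; the sound file name is a placeholder):

    from expyriment import control, design, stimuli

    exp = design.Experiment("dual instruction")
    control.initialize(exp)

    # Visual part of the instruction
    visual_instruction = stimuli.TextScreen("Instruction", "Please listen to the recording.")
    visual_instruction.preload()

    # Auditory part of the instruction (placeholder file name)
    auditory_instruction = stimuli.Audio("instruction.wav")
    auditory_instruction.preload()

    control.start()

    visual_instruction.present()
    auditory_instruction.play()   # how can I delay either of these relative to the other?

    control.end()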

Comments

  • Hi Tanto,

    >I use expyriment.stimuli.TextScreen to present the stimuli. I intend to present a dual instruction (visual and auditory), but I want to delay the two parts relative to each other (delay the visual by 2 ms but not the auditory, and vice versa).

    If I understand you correctly, you would like to present visual and auditory stimuli in succession, with a temporal difference of 2 ms. Such very tight visual AND auditory timing will be tough to achieve. However, it might be possible. Here is what you would need:

    1. For the visual accuracy: a monitor that has (a) a refresh rate of 500 Hz and (b) a latency (for switching individual pixels on and off) of less than 2 ms. I am not sure how common such monitors are, or whether they exist at all.
    2. For the auditory accuracy: this is hard to achieve in general. You would need a professional audio interface with really good drivers (e.g. from RME). You would then need something like PyAudio to address those special drivers (i.e. ASIO drivers on Windows), and to set the buffer size as low as possible (possibly < 64 samples) without getting any distortions; a rough sketch follows below. Alternatively, instead of an audio interface, you could use an external hardware sampler to play the sounds, which you then trigger via MIDI (although I am not sure whether the latency would be low enough in that case).
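
    To illustrate the buffer-size part, roughly something like this with PyAudio (only a sketch; the file name is a placeholder and you would still have to select the ASIO output device yourself):

        import wave
        import pyaudio

        CHUNK = 64  # very small buffer; increase it if you hear distortions

        wf = wave.open("instruction.wav", "rb")  # placeholder file name
        pa = pyaudio.PyAudio()

        # On Windows you would additionally pass the output_device_index
        # of the ASIO device here.
        stream = pa.open(format=pa.get_format_from_width(wf.getsampwidth()),
                         channels=wf.getnchannels(),
                         rate=wf.getframerate(),
                         output=True,
                         frames_per_buffer=CHUNK)

        # Blocking playback in small chunks
        data = wf.readframes(CHUNK)
        while data:
            stream.write(data)
            data = wf.readframes(CHUNK)

        stream.stop_stream()
        stream.close()
        pa.terminate()
        wf.close()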

    Given 1 and 2 above, you can then delay the presentation of either stimulus by calling clock.wait(2) (Clock.wait() takes milliseconds).
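
    Something along these lines (only a sketch; stimulus and file names are placeholders):

        from expyriment import control, design, stimuli

        exp = design.Experiment("dual instruction")
        control.initialize(exp)

        visual = stimuli.TextScreen("Instruction", "Please listen carefully.")
        audio = stimuli.Audio("instruction.wav")  # placeholder file name
        visual.preload()
        audio.preload()

        control.start()

        # Visual first, auditory 2 ms later (only meaningful with hardware as above).
        visual.present()
        exp.clock.wait(2)   # Clock.wait() takes milliseconds
        audio.play()

        # For auditory first and visual 2 ms later, simply swap the two stimuli.
        control.end()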

    I hope this helps.


    >Thanks, and congratulations on Expyriment's 10th anniversary, guys!

    Cheers!

  • Hi Florian,

    Thanks for the fast reply and the explanation. Based on your suggestion, 3000 ms is a suitable timing for my purposes.

    I have a question about non-English characters, such as Arabic script. Following your explanation, I can render the non-Latin characters if I use expyriment.stimuli.TextLine(text_font='arabic_font.ttf'). But I fail when using u"arabic_font"; it only shows [][][][] symbols on the screen. FYI, I still use Python 2 and Expyriment 0.9.
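
    This is roughly what works for me (only a sketch; the Arabic string and the font file name are placeholders):

        # -*- coding: utf-8 -*-
        from expyriment import control, design, stimuli

        exp = design.Experiment("arabic text")
        control.initialize(exp)

        # A unicode string together with an explicit font file that contains
        # the glyphs renders correctly; without such a font I only get boxes.
        arabic_line = stimuli.TextLine(u"\u0645\u0631\u062d\u0628\u0627",
                                       text_font="arabic_font.ttf")
        arabic_line.preload()

        control.start()
        arabic_line.present()
        exp.clock.wait(2000)
        control.end()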

    Thanks!
