[open] Implementing an Infant Preferential Looking (Listening) Procedure (PLP)
I'm new to OpenSesame. I followed the tutorials and got a basic sense of how it might work. I am interested in implementing a version of the so-called Preferential Looking Procedure (PLP) for infant studies. In the test phase, a child sits in front of a monitor and is presented with sound stimuli paired with attention-getting videos. The dependent variable is the amount of looking time toward the monitor while the sound test items are played. Sound items come from two different categories (e.g., words versus part-words, or some other manipulation).
The test starts with the monitor displaying a silent attention-getter video while a sound test item begins to play. Importantly, as long as the infant maintains their gaze on the central monitor and does not look away, the test trial continues for up to a maximum of X seconds (e.g., 10 seconds). If the infant looks away for 2 consecutive seconds, the experimenter, who is watching the infant through a camera, ends the test trial and the next test item is presented.
A few functionalities are required here:
- How to play a silent video and a sound item at the same time?
- How to have the sound test item (e.g., "bago") play repeatedly (e.g., "bago... bago... bago...") for up to X seconds?
- How to implement the keypress control such that the sound test item keeps playing as long as the experimenter holds a button down (or releases it for less than 2 consecutive seconds), and the trial ends once the button has been released for 2 seconds?
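To make the intended trial-timing rule from the last point concrete, here is a minimal sketch in plain Python. This is deliberately not OpenSesame API code (the function and parameter names are my own, purely illustrative); it only expresses the decision logic that the keypress handler would need to implement: the trial ends either at the maximum duration, or 2 seconds after the start of a look-away that is never interrupted by a look-back.

```python
def trial_end_time(events, max_dur=10.0, lookaway_limit=2.0):
    """Compute when a test trial ends, given gaze-coding key events.

    events: time-sorted (time, looking) state changes, where `looking` is
    True while the experimenter holds the key down (infant looks at the
    screen). The trial is assumed to start at t=0 with the infant looking.
    All names here are illustrative, not OpenSesame API.
    """
    lookaway_start = None
    for t, looking in events:
        # Did an ongoing look-away reach the limit before this event?
        if lookaway_start is not None and t - lookaway_start >= lookaway_limit:
            return min(max_dur, lookaway_start + lookaway_limit)
        if looking:
            lookaway_start = None      # infant looked back: reset the timer
        elif lookaway_start is None:
            lookaway_start = t         # a look-away begins
    # No more events: check whether a final look-away ends the trial early.
    if lookaway_start is not None and max_dur - lookaway_start >= lookaway_limit:
        return lookaway_start + lookaway_limit
    return max_dur
```

For example, `trial_end_time([(3.0, False), (4.0, True), (5.0, False)])` returns `7.0`: the brief look-away at 3 s is cancelled by the look-back at 4 s, but the look-away starting at 5 s runs 2 full seconds, so the trial ends at 7 s. In an actual OpenSesame implementation, a Python inline_script could poll the keyboard state in a loop and apply the same rule in real time, stopping the sound and video when the end condition is met.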
If this can be made to work, I believe several researchers running infant studies would be interested in using OpenSesame.