extra timestamps being logged
For my eye-tracking experiment, I've added inline scripts inside the coroutines to send log messages to the eye tracker. These messages follow a format like:
eyetracker.log("Start wav at: %d" % clock.time())
To illustrate: I have an audio sampler that starts at 1000 ms. To capture this event, I added the script above to the coroutines and set its start and end times to coincide with the audio onset (both 1000 ms). However, duplicate log entries appear in both the log file and the eye tracker's TSV file. I tried moving the time to 998 ms, in case executing several items at exactly the same moment was the problem, but the result was the same.
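Would guarding the log call with a flag, roughly as sketched here, be a sensible way to make sure the message is written only once per trial, or am I misunderstanding how coroutines executes the script? ("wav_logged" is just a placeholder name I made up for this sketch, and I'm not sure the prepare phase is the right place to reset it.)

Prepare phase:

# Reset the flag at the start of every trial (assuming the prepare phase
# of the inline script runs once per trial)
var.wav_logged = 0

Run phase:

# Send the message only the first time this code runs during the trial
if var.wav_logged == 0:
    eyetracker.log("Start wav at: %d" % clock.time())
    var.wav_logged = 1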
To give a bit more detail: when I set the start and end times to the same value (e.g., 1000 ms), I get two duplicate entries, and the larger the gap between start and end, the more duplicate logs appear. I also notice timing discrepancies in the TSV file: for example, the interval between "start of object" and "start of object + 200ms" should be exactly 200 ms, but it can differ by around 20 ms.
I've attached the log, CSV, and TSV files for reference (only the first three trials of the TSV, since the full file is too large). If I can't resolve the problem in the script, is it possible to clean up the data during post-processing?
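For example, would something along these lines be a reasonable way to drop the repeated messages afterwards? I pieced it together from pandas examples, so the "message" column name and the idea that the log text sits in its own column are only guesses about the TSV's layout, and with no Python experience I can't judge whether it is even close.

import pandas as pd

# Sketch only: the file name is a placeholder, and the "message" column
# name is a guess at how the log text appears in the eye tracker's TSV
df = pd.read_csv("subject-01.tsv", sep="\t")

# Flag the rows that carry the duplicated log message
is_wav = df["message"].str.startswith("Start wav at", na=False)

# Number each uninterrupted run of "Start wav at" rows, then keep only the
# first row of every run, which drops the back-to-back duplicates
run_id = (is_wav != is_wav.shift()).cumsum()
keep = ~is_wav | (run_id != run_id.shift())
df[keep].to_csv("subject-01-cleaned.tsv", sep="\t", index=False)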
I am also quite unsure how to combine the data from the CSV and TSV files for the analysis. My statistical skills are limited, but I am a fast learner: I have a little knowledge of R, Excel, and SQL, but none of Python. Could you suggest how to proceed with the data integration and analysis given my skill set?
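For instance, is the general idea to line the two files up on a shared trial number, roughly like this? ("trial_nr" is an assumed column name, and I realise the TSV would probably need a trial number added during the cleaning step first; again this is only a sketch pieced together from examples.)

import pandas as pd

# Sketch only: assumes both files have (or can be given) a shared
# "trial_nr" column; file names are placeholders
behav = pd.read_csv("subject-01.csv")
gaze = pd.read_csv("subject-01-cleaned.tsv", sep="\t")

merged = behav.merge(gaze, on="trial_nr", how="left")
merged.to_csv("subject-01-merged.csv", index=False)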