Touch movement of canvas in 20 MB 360-degree images
Hi all,
Hope you're well.
I'm running a Target Detection in 360-degree Images project. I'm comparing speed differences between VR and Flat Screen. For the Flat screen condition, I want to use a touch-based input to search/move through parts of an image.
In general, images are 6000x3000 pixels and about 20 MB in size. I anticipate there will be 50 images in each condition. I am concerned that this is an unusually large file size for OpenSesame. A Microsoft Surface or an Android tablet will be used; the Surface will likely be more powerful.
When testing moving around the 360-degree image, there is significant lag (5-10 seconds), and the experiment crashes with a memory error. I need it to respond instantly, with no perceptible lag (what you would normally expect when panning around a photo on an iPad).
To test this touch interaction idea, I used a slightly adapted version (about 99% unchanged) of Edwin's Python script from a 2014 thread (thanks, Edwin!):
from openexp.mouse import mouse
from openexp.canvas import canvas

my_mouse = mouse(exp)
my_canvas = canvas(exp)
path = exp.get_file(u'med.jpg')
my_canvas.image(path)
trialstart = my_canvas.show()
# Wait for an initial click/touch before tracking movement
# (timeout=None blocks until a click, so movstart can't be None)
button, position, movstart = my_mouse.get_click(timeout=None)
while True:
    pos, time = my_mouse.get_pos()
    # Clear and redraw the full image at the pointer position
    # (this full redraw is what makes large images slow)
    my_canvas.clear()
    my_canvas.image(path, x=pos[0], y=pos[1])
    timestamp = my_canvas.show()
    # Stop tracking when a second click/touch is registered
    button, _, _ = my_mouse.get_click(timeout=20)
    if button is not None:
        break
movtime = timestamp - movstart
resptime = movstart - trialstart
exp.set("response", button)
exp.set("movetime", movtime)
exp.set("response_time", resptime)
Anyone have any ideas, or is this even possible? Ideally, the canvas shouldn't have to re-render the full image each time I move the view.
Cheers,
Daniel
Comments
Hi Daniel,
There's most likely a more efficient way to do this; however, I'm not sure I understand fully what you want to do. My understanding is that:
Is that correct?
Cheers,
Sebastiaan
Check out SigmundAI.eu for our OpenSesame AI assistant!
Yes, that is all correct! The resolution of the display will be less than a third of the full picture. The exact scale I need varies from image to image.
Hi Sebastiaan, thought I'd bump this. Instead of using touch to move the image, I'm going to use a joystick/aircraft yoke.
Any ideas on how to reduce the lag in the response?
I made a simple example of how you can implement this directly in pygame. This is much faster (I actually made it on an Ubuntu tablet, and even there it's snappy), but it does require that you use the legacy backend. Right now, the viewport is moved with the arrow keys of the keyboard, but you can easily change this to moving it around with a joystick/gamepad. You can do this with the OpenSesame joystick plugin, or directly in pygame. Read the code comments to understand the logic; it's not that complicated, really. Hope this gets you started!