Trail Making Test

Hi,

I am a new user of OpenSesame.

I am interested in implementing the Trail Making Test in my experiments (cf. sample picture, Instructions: "There are numbers in circles on this page. Please take the pencil and draw a line from one number to the next, in order. Start at 1 [point to the number], then go to 2 [point], then go to 3 [point], and so on. Please try not to lift the pen as you move from one number to the next. Work as quickly and accurately as you can.").


Is it possible to draw a line produced by the user in real time?

How can I do this?

Thanks a lot for your help.

Comments

  • Hi Lucie,

    Yes, that's possible, and not even that complicated. How would the user draw the line, though? With a finger on a touch screen? With a Wacom-like tablet? Or with the mouse?

    Cheers!

    Sebastiaan

  • Hi Sebastiaan,

    The user will draw the line with the mouse or with a finger on a touch screen.

    Thank you for your answer.

    Lucie

  • Hi Lucie,

    In that case, something like the logic below should do the trick. You may need to tweak the details, but if you read through the code carefully, you should be able to understand how it works and modify it for your purpose.

    Cheers!

    Sebastiaan

    # In an OpenSesame inline_script, Keyboard, Mouse, var, items, and pool are
    # available directly in the workspace; the Line element can be imported
    # explicitly in case it is not.
    from openexp.canvas_elements import Line
    
    MIN_DIST = 50
    
    # The canvas of a sketchpad item (assumed to be called 'my_sketchpad')
    my_canvas = items['my_sketchpad'].canvas
    my_keyboard = Keyboard(timeout=10)
    my_mouse = Mouse()
    my_mouse.show_cursor(True)
    
    vertices = []
    while True:
        # Stop as soon as any key has been pressed
        key, timestamp = my_keyboard.get_key()
        if key is not None:
            break
        # Only track the cursor while a mouse button (or finger) is down
        if not any(my_mouse.get_pressed()):
            continue
        # Get the current cursor position
        (x, y), timestamp = my_mouse.get_pos()
        if vertices:
            # Get the distance to the last stored point, and skip this sample
            # if it is too nearby, so that we don't collect too many points
            px, py = vertices[-1]
            d = ((x - px) ** 2 + (y - py) ** 2) ** .5
            if d < MIN_DIST:
                continue
            # Extend the drawn line from the previous point to the current one
            my_canvas += Line(px, py, x, y)
        # Store the current mouse position and update the display
        vertices.append((x, y))
        my_canvas.show()
    # Assign vertices to an experimental variable so that it is logged
    var.vertices = vertices
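
    For the numbered circles themselves, here is a minimal sketch of one way to place them on the same sketchpad canvas before the drawing loop starts. The positions, the radius, and the item name 'my_sketchpad' are just assumptions; replace them with your own layout.

    from openexp.canvas_elements import Circle, Text
    
    RADIUS = 30
    # Hypothetical target positions; a real Trail Making Test would use a
    # fixed, pre-tested arrangement
    positions = [(-300, -200), (150, -250), (300, -50), (100, 100),
                 (-150, 50), (-350, 150), (0, 250), (250, 200)]
    
    my_canvas = items['my_sketchpad'].canvas
    for number, (cx, cy) in enumerate(positions, start=1):
        my_canvas += Circle(cx, cy, RADIUS, fill=False)  # unfilled circle
        my_canvas += Text(str(number), x=cx, y=cy)       # number inside the circle
    my_canvas.show()

    Afterwards, the logged var.vertices can be compared against these positions to check whether the circles were passed through in the correct order.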
    
  • Hi Sebastiaan,

    Thank you so much.

    I will have a look and contact you again if necessary.

    When it is done, can the implemented test be shared as an example experiment?

    Best,

    Lucie

  • > When it is done, can the implemented test be shared as an example experiment?

    That's very generous! 👍️

    There's no official system for sharing experiments. However, you could upload it to the OSF in a public project. That's what I usually do.

  • Hi Sebastiaan,

    I have edited the experiment and I would like to put it online.

    I understand that the inline_script items must become inline_javascript items, but how do I do that?

    Thank you for your help,

    Best regards,


    Lucie


  • Hi Lucie,

    > I have edited the experiment and I would like to put it online. I understand that the inline_script items must become inline_javascript items, but how do I do that?

    That depends on what you mean by putting it online.

    • If you simply want to share the experiment so that people can download it and run it on their own computer, then your current implementation is fine. My initial impression was that this is what you wanted to do, right?
    • However, if you want people to be able to run the experiment in a browser (instead of in the desktop application), then you do indeed need to translate the scripts to JavaScript, using the inline_javascript item. JavaScript and Python are different programming languages, so this translation will be tricky at best, and perhaps not even possible, because the JavaScript API does not currently support everything that the Python API supports.

    I hope this clears things up!

    Cheers,

    Sebastiaan

  • Hi Sebastiaan,

    It is your second suggestion: I want people to be able to run the experiment in a browser. I am sorry, but I cannot find the "inline_javascript" item in OpenSesame.

    Could you help me?

    I will also share the experiment (the part that is already done) with the community.

    Best regards,

    Lucie

  • > It is your second suggestion: I want people to be able to run the experiment in a browser. I am sorry, but I cannot find the "inline_javascript" item in OpenSesame.

    This item is new in OSWeb 1.3, which is included in OpenSesame 3.2.7. So you probably need to update! However, as I said, translating your scripts to JavaScript may be tricky!

  • I actually didn’t have the latest version...

    I will try because I need it.

    I will let you know if I make it.

    Another question: Is it possible to have the same visual rendering in the browser as on the computer?

    [screenshots comparing the browser rendering and the computer rendering]


    Thanks for all,

    Lucie

  • > Another question: Is it possible to have the same visual rendering in the browser as on the computer?

    If you want to have the exact same rendering, then I would create images with the text and show these. Otherwise there will be slight differences between the desktop and the browser (and possibly even between different browsers).
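
    As a rough sketch of that approach (the file name 'instructions.png' is just a placeholder, and the image would first have to be added to the file pool), a desktop inline_script could show such an image like this. The same thing can be done without any code by adding an image element to a sketchpad item, which should also be the simplest route for OSWeb.

    # Show a pre-rendered image that contains the instruction text, so that
    # the desktop and the browser display exactly the same pixels
    from openexp.canvas_elements import Image
    
    my_canvas = Canvas()
    # 'instructions.png' is a placeholder name for a file in the file pool
    my_canvas += Image(pool['instructions.png'])
    my_canvas.show()
    my_keyboard = Keyboard()
    my_keyboard.get_key()    # wait for a key press before continuing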

  • OK, thank you for this solution!

    Best regards,

    Lucie
