[open] offline video tracking

edited March 2013 in Miscellaneous

Hi,

I was wondering if Mantra can also be used offline on existing video material to define and track visual objects? If so, would you need to add specific markers when recording it?

Thanks in advance,
Rebecca

Comments


    Hi Rebecca,

    I was wondering if Mantra can also be used offline on existing video material to define and track visual objects?

    Not really. Mantra is built around the idea of tracking objects in real time and passing this data on to an experiment.

    However, the object-tracking algorithm used in Mantra is really simple, and if you just want to analyze a video offline, you can easily implement it yourself in Python using tools such as OpenCV and NumPy. The basic trick is to read the video frames one by one into a matrix, match all the pixels in each frame against some color criterion, and calculate the average position of the matching pixels. This is exactly what Mantra does.
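
    For example, a minimal sketch of that approach might look something like this. The file name and the color bounds are just placeholders, so substitute your own recording and the color of your own marker:

        # Hypothetical input file and color bounds; adjust both to your setup.
        import cv2
        import numpy as np

        cap = cv2.VideoCapture('recording.avi')
        positions = []  # (frame index, x, y) for every frame with a match

        frame_index = 0
        while True:
            ok, frame = cap.read()  # frame is an H x W x 3 BGR matrix
            if not ok:
                break
            # HSV makes the color criterion less sensitive to lighting than raw BGR.
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            # Match all pixels that fall within the color bounds (reddish hues here).
            mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
            ys, xs = np.nonzero(mask)
            if len(xs) > 0:
                # The object position is simply the average of the matching pixels.
                positions.append((frame_index, xs.mean(), ys.mean()))
            frame_index += 1
        cap.release()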

    If so, would you need to add specific markers when recording it?

    If you attach distinctly colored markers to your objects, and make sure that the lighting conditions are good and uniform, tracking will be much easier. As soon as you have to do 'proper' object-tracking instead of just counting colored pixels, things quickly become very difficult. OpenCV has some routines for this, which (depending on your experience) you might be interested in. Personally I don't have any experience with this, though.
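
    To give a rough idea of what those routines look like, below is a sketch using one of OpenCV's built-in trackers. Whether a particular tracker is available depends on your OpenCV version (some only ship with the contrib modules), and the file name and bounding box are made up for the example:

        import cv2

        cap = cv2.VideoCapture('recording.avi')
        ok, frame = cap.read()

        # Initialise the tracker on a manually chosen (x, y, w, h) box
        # around the object in the first frame.
        tracker = cv2.TrackerCSRT_create()
        tracker.init(frame, (100, 100, 50, 50))

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            found, (x, y, w, h) = tracker.update(frame)
            if found:
                print('object roughly at', x + w / 2, y + h / 2)
        cap.release()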

    Cheers!
    Sebastiaan

    There are much bigger issues in the world, I know. But I first have to take care of the world I know.
    cogsci.nl/smathot
