trackR is an object tracker for R based on OpenCV. It provides an easy-to-use (or so I think) graphical interface allowing users to perform basic multi-object video tracking in a range of conditions while maintaining individual identities.
trackR implements three different methods to detect objects in a video:
- Background subtraction: the background is reconstructed automatically from n images taken at regular intervals throughout the video. This method should work well in most lab situations with a constant and homogeneous background.
- Per-pixel bounds: lower and upper bounds on each pixel's value are estimated from n images taken at regular intervals throughout the video. A pixel outside of these bounds in a given image is considered part of an object. This method should work better than the previous one in situations where the background is constant but not homogeneous (e.g. if the background has different shades).
trackR also allows users to exclude parts of the image by using black and white masks that can be easily created and customized using any available image editor.
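The background-reconstruction and masking ideas above can be sketched in a few lines. This is an illustrative Python/NumPy sketch, not trackR's actual R/OpenCV code; the function names, the fixed threshold, and the toy frames are all assumptions made for the example:

```python
import numpy as np

def estimate_background(frames):
    # Taking the per-pixel median over n frames sampled at regular
    # intervals approximates a static background, assuming each
    # object moves between samples.
    return np.median(np.stack(frames), axis=0)

def segment(frame, background, thresh=30, mask=None):
    # Pixels that differ from the background by more than `thresh`
    # are flagged as object pixels; an optional black-and-white mask
    # excludes regions (black = ignored), like trackR's image masks.
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    fg = diff > thresh
    if mask is not None:
        fg &= mask > 0
    return fg

# Toy example: a bright 3x3 "animal" moving over a uniform background.
frames = []
for i in range(5):
    f = np.full((20, 20), 50, dtype=np.uint8)
    f[i * 3:i * 3 + 3, i * 3:i * 3 + 3] = 200
    frames.append(f)

bg = estimate_background(frames)
fg = segment(frames[0], bg)
print(fg.sum())  # 9 object pixels detected
```

Because the object occupies any given pixel in at most one of the five sampled frames, the median recovers the background value everywhere, and only the object's pixels survive the thresholding step.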
How does trackR compare to other video tracking solutions? Did we really need another one?
trackR belongs to the category of 'classical' tracking programs. It relies on good ol' fashioned image processing and simple assignment algorithms (the Hungarian method in this case, plus some k-means clustering trickery inspired by the excellent tracktor for Python).
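As a rough illustration of that assignment step, here is how a Hungarian-style solver can link detections across frames to preserve identities. This is a Python sketch using SciPy's solver for the linear sum assignment problem, not trackR's own implementation; the function name and the centroid coordinates are made up for the example:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_identities(prev_centroids, new_centroids):
    # Cost matrix: distance between every previous track and every
    # new detection. The assignment solver picks the pairing that
    # minimizes total displacement, carrying identities from one
    # frame to the next.
    cost = np.linalg.norm(
        prev_centroids[:, None, :] - new_centroids[None, :, :], axis=2
    )
    rows, cols = linear_sum_assignment(cost)
    return dict(zip(rows.tolist(), cols.tolist()))

prev = np.array([[0.0, 0.0], [10.0, 10.0]])  # tracks in frame t
new = np.array([[9.0, 11.0], [1.0, 0.0]])    # detections in frame t+1
print(match_identities(prev, new))  # {0: 1, 1: 0}
```

Each previous track is paired with the new detection closest to it overall, which is why identity 0 maps to the second detection here even though the detections arrive in a different order.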
trackR does not include (for now) any fancy machine learning methods like those found in the fantastic idtracker.ai, for instance. The downside is that trackR's tracking reliability is inferior to that of these more advanced programs (in particular when objects cross paths); the upside is that it is fast and does not require a beast of a computer to run, which means it will soon be able to perform live tracking.
trackR is more similar in spirit to tracking software such as tracktor and the sadly defunct SwisTrack, and it will most likely provide tracking reliability equivalent to these excellent programs. However, trackR offers multiple algorithms to segment objects from the background (see above) and should therefore be capable of handling a wider variety of situations. It is also coded in a way that should make it easy to add other segmentation and tracking algorithms in the future, broadening its range of applicability even further.
Will something break? Can I use trackR in 'production' mode?
Something will definitely break. This is version 0.3 of the software; there is still a long way to go before it is a fully finished program. That being said, it will work fine in most cases and is certainly usable for most tracking projects. If you run into an issue, please report it at: https://github.com/swarm-lab/trackR/issues.
What features are in the works for future versions of trackR?

At the moment, I am considering several new features for future iterations of trackR.
How can I help?
trackR is an open-source project, meaning that you can freely modify its code and implement new functionalities. If you have coding skills, you are more than welcome to contribute new code or code improvements by submitting pull requests on the GitHub repository of the project at: https://github.com/swarm-lab/trackR. I will do my best to review and integrate them quickly.
If you do not feel like contributing code, you can also help by submitting bug reports and feature requests using the issue tracker on the GitHub repository of the project at: https://github.com/swarm-lab/trackR/issues. These are extremely helpful to catch and correct errors in the code, and to guide the development of trackR by integrating functionalities requested by the community.