How to use filters other than the simple Kalman filter in the Motion-Based Multiple Object Tracking Example

Hi,
I have found the Motion-Based Multiple Object Tracking Example very useful in various problems. The example states at the end: "The likelihood of tracking errors can be reduced by using a more complex motion model, such as constant acceleration, or by using multiple Kalman filters for every object. Also, you can incorporate other cues for associating detections over time, such as size, shape, and color. "
I would like to try different filters, such as those listed in the MATLAB documentation as usable with the predict and correct functions:
Filter for object tracking, specified as one of these objects:
How would this be incorporated here? Would it involve vision.KalmanFilter? How?
Are there any examples of these, or of the other cues for "associating detections over time, such as size, shape, and color," available? I searched the community and could not find any.
Thank you

Accepted Answer

Elad Kivelevitch
Elad Kivelevitch on 28 Apr 2022
Hi Peter,
Thanks for the question.
The example that you refer to uses the vision.KalmanFilter object, which is a linear Kalman filter that assumes both the motion and measurement models are linear. Furthermore, the example uses some helper functions to associate new measurements with existing tracked objects, initialize new tracked objects, update existing ones, and delete ones that are no longer present.
There are two ways to move forward from this example to other filters and models. The first way is to still use the same helper functions, and replace the vision.KalmanFilter with one of the filters you listed. As you correctly noted, for a filter to be compatible, it must provide a few object functions (methods):
  • predict - to predict the object state from one time step to the next.
  • correct - to correct the object state with a new measurement.
  • distance - to aid in computing the association cost used in the association stage.
The above filters all support these methods. The easiest one to convert to, and the one I recommend starting with, is trackingKF, which is very similar to vision.KalmanFilter. You will also need to define a bounding box motion model; for an example of how to do that, please see: https://www.mathworks.com/help/driving/ug/multiple-object-tracking-tutorial.html
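For instance, here is a minimal sketch of creating a trackingKF in place of the vision.KalmanFilter setup in the example's track-creation helper. It tracks the detection centroid with the built-in '2D Constant Velocity' motion model; the centroid value and all noise settings below are illustrative assumptions, not values taken from the example.

centroid = [120 45];                              % detected centroid [x y] in pixels (example values)
initialState = [centroid(1); 0; centroid(2); 0];  % state ordered as [x; vx; y; vy]
kf = trackingKF('MotionModel', '2D Constant Velocity', ...
                'State', initialState, ...
                'StateCovariance', diag([200 50 200 50]), ...
                'MeasurementNoise', 100 * eye(2), ...
                'ProcessNoise', eye(2));

% The tracking loop then uses the same three methods the example relies on:
predictedState = predict(kf);            % predict the state to the next frame
d = distance(kf, centroid);              % association cost against a detection
correctedState = correct(kf, centroid);  % correct with the assigned detection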
After doing that, if you want to try an EKF or UKF, you will need to define appropriate motion and measurement model functions. You can look at the constvel and cvmeas functions for inspiration. Then use the filter with these models by setting the StateTransitionFcn and MeasurementFcn accordingly.
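As a rough illustration, a trackingEKF built on the constvel/cvmeas models could be set up as follows. The initial state and noise values are placeholders, and constveljac/cvmeasjac are the toolbox Jacobian companions to constvel and cvmeas.

initialState = [120; 0; 45; 0; 0; 0];     % [x; vx; y; vy; z; vz] (example values)
ekf = trackingEKF(@constvel, @cvmeas, initialState, ...
                  'StateTransitionJacobianFcn', @constveljac, ...
                  'MeasurementJacobianFcn', @cvmeasjac, ...
                  'MeasurementNoise', 100 * eye(3), ...
                  'ProcessNoise', eye(6));

predict(ekf, 0.1);          % predict the state 0.1 s ahead
meas = [125; 47; 0];        % position measurement [x; y; z]
correct(ekf, meas);         % correct with the new measurement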
You can stop here, or you can decide to move to the next step.
The next step could be replacing all the tracking helper functions with a tracker. Once again, I recommend looking at the https://www.mathworks.com/help/driving/ug/multiple-object-tracking-tutorial.html example to see how to set up and run a tracker. You can use any of the following trackers: trackerGNN, trackerJPDA, or trackerTOMHT, with any of the filters listed in the question. To choose a filter, simply define the FilterInitializationFcn. You may want to look at the FilterInitializationFcn used in the example linked above.
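A minimal sketch of that second step might look like the following; the thresholds, timestamp, and measurement packaging here are illustrative assumptions, and initcvekf is one of the built-in filter initialization functions (it returns a constant-velocity trackingEKF from a detection).

tracker = trackerGNN('FilterInitializationFcn', @initcvekf, ...
                     'ConfirmationThreshold', [3 5], ...
                     'DeletionThreshold', [5 5]);

% In the per-frame loop, wrap each detected centroid in an objectDetection
% and call the tracker with all detections and the current frame time.
frameTime = 0.5;                                  % example timestamp in seconds
detections = {objectDetection(frameTime, [120; 45; 0], ...
                              'MeasurementNoise', 100 * eye(3))};
confirmedTracks = tracker(detections, frameTime);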
Finally, to learn more about tracking and trackers, please look at the documentation for the Sensor Fusion and Tracking Toolbox.
Good luck
Elad
  4 Comments
Peter
Peter on 2 May 2022
Elad,
Sorry to bother you again. I have looked at the examples you listed and realize that I do not have enough fundamental background to figure this out. Could you recommend books, tutorials, etc.? Although I did 3 years of math in college, I probably remember about 1.
Thank you,
Peter


More Answers (1)

Peter
Peter on 29 Apr 2022
Elad,
Thank you again. This will give me a lot of things to try out.
Peter
