Should the track and the measurement be in the same frame or coordinate system, and if not, how will the assignment work?

I'm trying to achieve centralized tracking with raw camera and radar measurements. I track in world coordinates and use the measurement function in the Kalman filter to map my state to the raw measurement. I distinguish between the sensors using SensorIndex, since each has a different measurement function. I have a few questions:
1) If I have only camera measurements at this time instance, the tracker (trackerGNN, trackerJPDA, trackerTOMHT) will predict the state using the Kalman filter model for the camera, based on the SensorIndex I specified. It will then look for detections to assign, and if there is a radar detection that satisfies the assignment cost, it should assign the track to that radar detection. For the next prediction step, will it switch to the radar Kalman filter model specified by SensorIndex? It should now update with radar measurements and calculate the residual from this equation:
y = Z - h(x)
where:
y is the residual,
Z is the measurement,
h() is the measurement function, which maps my state to measurement space,
x is my track state in world coordinates.
2) If the normalized distance for assignment is calculated from the residual, then my residual calculation happens in measurement space, but I would like to track in state space, which is world coordinates. How can I achieve this?
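To make question 2 concrete, here is a toy sketch of what I mean (the radar model and the numbers are placeholders I made up for illustration):
% Track state in world coordinates: [x; vx; y; vy]
x = [10; 1; 5; 0];
% Raw radar measurement: [range; azimuth in degrees]
Z = [11.3; 27.0];
% Measurement function maps the world-frame state into radar measurement space
hRadar = @(s) [norm(s([1 3])); atan2d(s(3), s(1))];
% The residual lives in measurement space even though the state stays in world coordinates
y = Z - hRadar(x);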

Answers (1)

Elad Kivelevitch on 3 Aug 2023
You mention that your measurement function already knows how to handle different sensor models based on SensorIndex. So, the tracker should be able to do the right thing without any additional work on your part.
In other words, given a measurement function h of the following format:
function zexp = h(state, measurementParameters)
% measurementParameters are supplied to the function by the tracker from
% the objectDetection.MeasurementParameters property.
if measurementParameters.SensorIndex == 1 % Let's say 1 is the vision model
    zexp = hvision(state, measurementParameters);
else % Use the radar model
    zexp = hradar(state, measurementParameters);
end
end
The hvision and hradar measurement functions should allow you to convert from the state in global coordinates to the expected measurement in the vision and radar measurement spaces, respectively. The measurementParameters passed to those functions define things like where the sensor is mounted, how it is oriented, etc. They are specified in the objectDetection, as described in https://www.mathworks.com/help/fusion/ug/convert-detections-into-objectDetection-format.html
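For example, a radar measurement function could look roughly like this. This is only a sketch for illustration: the state layout, the [range; azimuth] measurement, and the OriginPosition/Orientation field names are assumptions, so adapt them to whatever your MeasurementParameters actually contain.
function z = hradar(state, measurementParameters)
% Sketch of a radar model: state is [x; vx; y; vy; z; vz] in world coordinates,
% and the expected measurement is [range; azimuth in degrees] in the sensor frame.
pos    = state([1 3 5]);                                     % world-frame position
relPos = measurementParameters.Orientation * ...
         (pos - measurementParameters.OriginPosition(:));    % express the position in the sensor frame
z = [norm(relPos); atan2d(relPos(2), relPos(1))];            % range and azimuth
end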
The trackers will assign each sensor separately using the SensorIndex property in the objectDetections. Cost calculation will use the expected measurement from the function above.
Note that cost is:
(z-h(x))' / (R+HPH') * (z-h(x)) + log(det(R+HPH')).
The first part is the Mahalanobis distance, while the log(det()) part is a compensation that avoids assigning tracks with large uncertainty (which makes HPH' large).
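For illustration, here is a small sketch that evaluates this cost with toy numbers of my own; H is obtained numerically here, while the filters use their own Jacobians:
x = [10; 1; 5; 0];                          % predicted state in world coordinates
P = diag([4 1 4 1]);                        % predicted state covariance
z = [11.5; 26.0];                           % measurement [range; azimuth in degrees]
R = diag([0.5 2]);                          % measurement noise
h = @(s) [norm(s([1 3])); atan2d(s(3), s(1))];
H = zeros(numel(z), numel(x));              % numerical Jacobian of h at x
for k = 1:numel(x)
    dx = zeros(size(x)); dx(k) = 1e-6;
    H(:,k) = (h(x + dx) - h(x)) / 1e-6;
end
S = R + H*P*H';                             % innovation covariance
res = z - h(x);                             % residual in measurement space
cost = res' / S * res + log(det(S))         % Mahalanobis distance + log-det term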
The tracker uses the measurement functions to update the track state in global coordinates. You don't need to do anything other than define the measurement functions correctly.
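As a rough sketch of how this ties together (the function names, the constant-velocity motion model, and the initial-state logic are my own assumptions, not a prescription), you could initialize a filter that keeps its state in world coordinates and uses the sensor-switching measurement function h from above:
function filter = initWorldFilter(detection)
% Initialize a trackingEKF whose state [x; vx; y; vy; z; vz] is in world
% coordinates and whose measurement function is the sensor-switching h above.
x0 = stateFromDetection(detection);   % placeholder: map the first detection to a world-frame state
P0 = diag([10 100 10 100 10 100]);    % placeholder initial covariance
filter = trackingEKF( ...
    'StateTransitionFcn', @constvel, ...          % constant-velocity model in world coordinates
    'MeasurementFcn',     @h, ...                 % measurement function defined earlier
    'State',              x0, ...
    'StateCovariance',    P0, ...
    'MeasurementNoise',   detection.MeasurementNoise);
end
and then hand that function to the tracker:
tracker = trackerGNN('FilterInitializationFcn', @initWorldFilter);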
  1 Comment
Aatif Sulaimani on 4 Aug 2023
Hi Elad,
Thank you for your answer. I am not using the MeasurementParameters of the objectDetection to convert my measurements; instead, I use measurement functions that contain the intrinsic and extrinsic parameters of the camera and radar. I believe my measurement functions are fine, but when I use them the tracker creates separate tracks for the camera and the radar, even when the detections fall inside the same assignment gate. Could you let me know what else I might be doing wrong if the tracker works as you have described?
Also, it would help my understanding if you could explain how track initialization works and how tentative tracks are generated for this example:
1) Say at time t1 I have one camera and one radar detection, and both fall in the same gate. For track initialization, will it initialize tracks for both or only one? If one, which one?
2) At time t2, say the track was assigned to the camera detection and updated, and there is one radar detection that falls inside the gate.
3) At time t3, I have no radar detection, only a camera detection.
4) At time t4, I have no camera detection, only a radar detection, and the track should be assigned to it. For this, I don't really understand how the tracker switches to the radar Kalman filter model.
Is there any way to know which sensor each track is associated with, using SensorIndex? That would help me examine the tracker.
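For example, is something like this the right way to step the tracker and inspect the assignments? (initWorldFilter stands for my filter-initialization function, the measurement values and noise matrices are placeholders, and I am assuming the Assignments field of the fourth output is where the track-to-detection mapping lives.)
tracker = trackerGNN('FilterInitializationFcn', @initWorldFilter);

% t1: one camera (SensorIndex 1) and one radar (SensorIndex 2) detection
detsT1 = {objectDetection(1.0, [320; 240],   'SensorIndex', 1, 'MeasurementNoise', 4*eye(2)); ...
          objectDetection(1.0, [11.5; 26.0], 'SensorIndex', 2, 'MeasurementNoise', diag([0.5 2]))};
[~, ~, ~, info] = tracker(detsT1, 1.0);

% (t2 and t3 omitted for brevity)

% t4: radar only
detsT4 = {objectDetection(4.0, [12.1; 25.4], 'SensorIndex', 2, 'MeasurementNoise', diag([0.5 2]))};
[~, ~, ~, info] = tracker(detsT4, 4.0);

% Map each assigned track back to the sensor that produced the detection
for k = 1:size(info.Assignments, 1)
    trackID  = info.Assignments(k, 1);
    detIndex = info.Assignments(k, 2);
    fprintf('Track %d assigned to SensorIndex %d\n', trackID, detsT4{detIndex}.SensorIndex);
end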
Thank you once again for your answers.
