How to Optimize Kalman Filter Tracking for Specific Video

I've been following this article to track the position of a red circle in my video. The circle stands out very well from the rest of the background, but the filter is having trouble detecting it. Is there anything I can do to optimize the code for my specific video (help it to detect the red marker)?
Here is one frame from the video I am analyzing and a warning I am receiving for the showDetections function:

Answers (1)

Image Analyst on 15 Dec 2022
For this kind of image I'd detect the red spot using color segmentation, something like
[r,g,b] = imsplit(rgbImage);
mask = (r > 200) & (g < 200) & (b < 200); % Change values as needed.
mask = bwareafilt(mask, 1); % Take largest blob.
props = regionprops(mask, 'Centroid');
xCentroid = props.Centroid(1);
yCentroid = props.Centroid(2);
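Here rgbImage is just a single RGB frame of your video, which you can get with readFrame. A minimal sketch (the file name is only a placeholder for your own video):
vidReader = VideoReader('myVideo.mp4'); % placeholder file name
rgbImage = readFrame(vidReader);        % one RGB frame of the video
Then run the snippet above on rgbImage for each frame you read.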
  2 Comments
Jeremy Hilton on 15 Dec 2022
Thanks! This code is intricate enough that I'm wondering what the best way is to implement it without damaging the rest of the code and corresponding functions. Also, I assume 'rgbImage' is simply an image of one of the frames?
For further reference, here are some more lines of code (from the article) that work to detect the circle:
%% READ VIDEO - DETECT OBJECTS - DISPLAY RESULTS
function utilities = createUtilities(param)
% Create System objects for reading video, displaying video, extracting
% foreground, and analyzing connected components.
utilities.videoReader = VideoReader('Tracker.mp4');
utilities.videoPlayer = vision.VideoPlayer('Position', [100,100,500,400]);
utilities.foregroundDetector = vision.ForegroundDetector(...
    'NumTrainingFrames', 10, 'InitialVariance', param.segmentationThreshold);
utilities.blobAnalyzer = vision.BlobAnalysis('AreaOutputPort', false, ...
    'MinimumBlobArea', 70, 'CentroidOutputPort', true);
utilities.accumulatedImage = 0;
utilities.accumulatedDetections = zeros(0, 2);
utilities.accumulatedTrackings = zeros(0, 2);
end
Elad Kivelevitch on 15 Dec 2022
Hi Jeremy,
There are two separate steps that may need improvement, and I am trying to understand which one it is.
The first step is detecting the red dot on the video frame and the second step is tracking the red dot using the measurement provided by the first step.
Detector Step
To understand which of the steps does not work well in this case, let's start with the first part - detection. You may want to run the video frame-by-frame by adding a pause at the end of displaying each frame. Use the output from your detector (it seems like you're using a blob detection algorithm) to annotate the frame with the detection bounding box. This will allow you to visualize whether the red dot is indeed detected at each frame (encircled by a bounding box) or not (no bounding box).
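For example, assuming the utilities struct from your createUtilities function, a rough sketch of that frame-by-frame check could look like this (insertShape is used here just to draw the boxes; adapt as needed):
% Sketch: step through the video and overlay the blob detections.
while hasFrame(utilities.videoReader)
    frame = readFrame(utilities.videoReader);
    fgMask = utilities.foregroundDetector(frame);          % foreground mask
    [centroids, bboxes] = utilities.blobAnalyzer(fgMask);  % blob centroids and boxes
    if ~isempty(bboxes)
        frame = insertShape(frame, 'Rectangle', double(bboxes), 'Color', 'yellow');
    end
    utilities.videoPlayer(frame);
    pause % advance one frame at a time and check whether the dot is boxed
end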
If the detection step is not working correctly (the dot is not detected in most of the frames), you need to improve the detection process. I am not a computer vision expert, so I will leave that to folks who know better than me.
Tracking Step
If all goes well in the detection step (the dot is detected in most of the frames), then the next step is to see whether the Kalman filter is working correctly. To do that, collect all the measurements of the box centroid from the detection step in an array:
pos = [x1 y1;  % Measurements from frame 1
       x2 y2;  % Measurements from frame 2
       ...
       xN yN]; % Measurements from frame N
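Just as a sketch (assuming the utilities struct from your createUtilities function and that the blob analyzer finds exactly one blob per frame), you could accumulate this array like so:
utilities.videoReader.CurrentTime = 0; % rewind in case the reader was already used
pos = zeros(0, 2);
while hasFrame(utilities.videoReader)
    frame = readFrame(utilities.videoReader);
    fgMask = utilities.foregroundDetector(frame);  % foreground mask
    centroids = utilities.blobAnalyzer(fgMask);    % blob centroids, one [x y] per row
    pos(end+1, :) = centroids(1, :);               % keep the (assumed single) detection
end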
Configure a Kalman filter, e.g., trackingKF. The rest of the code assumes you're using trackingKF.
kf = trackingKF("MotionModel", "2D Constant Velocity");
kf.State = [pos(1,1);0;pos(1,2);0]; % Initialize the state based on the first measurement
kf.StateCovariance = diag([100 1e4 100 1e4]); % Initialize the state covariance
kf.ProcessNoise = eye(4); % A guess about how well the dot follows a constant velocity model
kf.MeasurementNoise = 100*eye(2); % A guess about how well the dot is detected
% Assuming utilities.videoReader is your videoReader
utilities.videoReader.CurrentTime = 0; % Rewind in case the reader was already used to collect pos
readFrame(utilities.videoReader); % Skip frame 1; its measurement initialized the state above
ind = 1; % Index into pos; pos(1,:) was already used for initialization
while hasFrame(utilities.videoReader)
    ind = ind + 1;
    frame = readFrame(utilities.videoReader); % Read the next video frame
    predict(kf); % Predict the filter state to the next frame
    correct(kf, pos(ind,:)'); % Correct with the measurement at frame ind
    % Annotate the frame with the filtered centroid, kf.State([1 3]), e.g. with insertMarker
    frame = insertMarker(frame, kf.State([1 3])', 'x-mark', 'Color', 'green');
    utilities.videoPlayer(frame);
    pause % To observe how well the filtered centroid follows the dot
end
Depending on the frame rate, the type of motion the dot follows, and the level of noise in the measurements, you may need to tune the trackingKF parameters to work better for your particular case.
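For example (the values below are only illustrative starting points): increase the process noise if the dot maneuvers more sharply than a constant-velocity model allows, or increase the measurement noise if the detected centroids jitter a lot:
kf.ProcessNoise = 10*eye(4);      % allow more deviation from constant velocity
kf.MeasurementNoise = 400*eye(2); % trust noisy detections less (smoother, slower track)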
