Classification of a matrix and binary mask

nomad nomad on 29 Apr 2011
Hello everyone,
To optimize the discrimination between two correlation peaks (obtained after correlating a scene containing the two letters E and F with a target containing the letter F), I performed a theoretical calculation and arrived at a matrix of the form:
M(u,v) = |D(u,v)| * cos(Phi_D(u,v) - Phi_T(u,v)) - epsilon * |T(u,v)|
where D(u,v) is the Fourier transform of the centered letter E,
T(u,v) is the Fourier transform of the centered letter F, and
Phi_D and Phi_T are the phases of the centered E and the centered F, respectively.
The matrix is summed over u, v = 0 to (image size)/2.
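
A minimal MATLAB sketch of how I build M, assuming E_img and F_img are centered, same-size images of the two letters (these variable names and the value of epsilon are mine, for illustration only):

  D = fftshift(fft2(double(E_img)));  % FT of the centered letter E
  T = fftshift(fft2(double(F_img)));  % FT of the centered letter F
  epsilon = 0.5;                      % weighting constant, value to be tuned
  M = abs(D) .* cos(angle(D) - angle(T)) - epsilon * abs(T);
  n = size(M, 1);
  S = sum(sum(M(1:floor(n/2), 1:floor(n/2))));  % sum over u,v up to half the image size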
1 - I want to classify this matrix as follows: where the sum is greater than 0, I eliminate a point from the positive region; where the sum is less than 0, I eliminate a point from the negative region; and I repeat this until the sum equals 0.
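
A rough sketch of the iterative scheme I have in mind (again with my own variable names; tol guards against the sum never hitting exactly 0):

  mask = true(size(M));            % points kept so far
  tol  = 1e-6;
  S = sum(M(mask));
  while abs(S) > tol && any(mask(:))
      Mk = M;
      Mk(~mask) = NaN;             % ignore already-eliminated points (max/min skip NaN)
      if S > 0
          [~, idx] = max(Mk(:));   % eliminate the strongest positive point
      else
          [~, idx] = min(Mk(:));   % eliminate the strongest negative point
      end
      mask(idx) = false;
      S = sum(M(mask));
  end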
2 - How do I code a binary mask that blocks certain frequencies in order to optimize the correlation, and on what basis should I choose the frequencies to block?
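
To make question 2 concrete, here is one possible reading (whether this pass criterion makes sense is exactly what I am asking; scene_img is an assumed name for the input scene):

  H = double(M > 0);                           % binary mask: 1 = pass, 0 = block
  Scene = fftshift(fft2(double(scene_img)));   % FT of the scene containing E and F
  C = ifft2(ifftshift(H .* Scene .* conj(T))); % masked frequency-domain correlation
  figure; imagesc(abs(C)); colorbar;           % inspect the two correlation peaks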
Thank you in advance

Answers (0)
