
Hello,

I have two plots. These two plots follow the same trend but are slightly variable. The X axis denotes time for this graph. I would like to determine the minimum amount of time (number of points along the x axis) over which the % difference between the two plots is below a certain value. I would like this to be repeatable at any point along the graphs.

My thought is to randomly select a certain number of points along the x axis (the number of points selected determines the amount of time being tested, e.g., 100 points = 1 second) and then check the percent difference. Then repeat that process until the % difference estimate is statistically accurate for the plot.

Does anyone have any idea how to randomly sample points on two graphs along the x axis, determine the percent difference between the two curves at those points, and then average all of that together to determine the total % difference?
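To make it concrete, here is roughly what I have in mind (the data below is made up just for illustration):

% made-up stand-ins for my two plots
x  = linspace(0, 10, 1000);
y1 = 2 + sin(x);
y2 = 2 + sin(x) + 0.05*randn(size(x));
% randomly pick n_samp distinct points along the x axis
n_samp = 100;                        % e.g., 100 points = 1 second
idx    = randperm(length(x), n_samp);
% percent difference between the curves at the sampled points,
% then the average over the sample
pdiff     = abs(y1(idx) - y2(idx)) ./ abs(y2(idx));
avg_pdiff = mean(pdiff);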

I apologize if the wording is confusing, and I appreciate any input. Thanks!

Sindar on 22 Jul 2020

Assuming your two datasets share x values and are not absurdly massive, this direct method is probably faster than random sampling:

% generate some sample data
x = 1:100;
y1 = rand(1,100);
y2 = rand(1,100);

% set a threshold of 5%
p_thresh = 0.05;

% percent difference between the sets at each point
% (absolute value, so positive and negative deviations don't cancel out)
y_pdiff = abs(y1./y2 - 1);

% cumulative mean: at point k, the mean of points 1 through k
y_pdiff_mean = movmean(y_pdiff, [length(y_pdiff)-1 0]);

% find the first point where the cumulative mean drops below the threshold
idx = find(y_pdiff_mean < p_thresh, 1, 'first');

% find the x-location corresponding to this point
x_thresh = x(idx);
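If you want the mean % difference over a fixed window of time rather than over everything up to the current point, the same movmean call works with a sliding window. A sketch, where w = 20 points is just an example window length:

% sliding-window variant: mean % difference over the last w points
w = 20;                              % window length (example value)
y_pdiff_win = movmean(y_pdiff, [w-1 0]);
% first point where the windowed mean drops below the threshold
% (movmean shrinks the window at the start of the data by default)
idx_w = find(y_pdiff_win < p_thresh, 1, 'first');
x_thresh_w = x(idx_w);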

Sindar on 27 Jul 2020

I'm a bit confused. Why would a larger section necessarily have a lower % difference?

(I assumed you had two signals that eventually converged)
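For example, with two made-up signals that do converge, the cumulative mean eventually drops below the threshold:

% two signals whose % difference decays over time
x  = 1:1000;
y1 = 2 + sin(x/50);
y2 = y1 .* (1 + 0.5*exp(-x/50));     % starts 50% apart, converges to y1
y_pdiff      = abs(y1./y2 - 1);
y_pdiff_mean = movmean(y_pdiff, [length(y_pdiff)-1 0]);
idx = find(y_pdiff_mean < 0.05, 1, 'first')   % crosses the 5% threshold partway in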
