I have two plots. The two plots follow the same trend but are slightly variable, and the x axis denotes time. I would like to determine the minimum amount of time (number of points along the x axis) over which the percent difference between the two plots stays below a certain value, and I would like this check to be repeatable at any point along the graphs.
My thought is to randomly select a certain number of points along the x axis (the number of points selected determines the amount of time being tested, e.g. 100 points = 1 second), check the percent difference over that window, and then repeat the process until the average percent difference is statistically representative of the plots.
Does anyone have an idea of how to randomly sample points along the x axis of the two graphs, determine the percent difference between the two plots at those points, and then average it all together to get an overall percent difference?
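In case it helps to make the question concrete, here is a rough sketch in Python/NumPy of what I have in mind. It assumes the two plots are equal-length arrays `y1` and `y2` on the same time base, and that "percent difference" means the absolute difference relative to the mean of the two values at each point; the function names are just placeholders.

```python
import numpy as np

def percent_difference(y1, y2):
    """Pointwise percent difference between two equal-length series."""
    # Uses the mean of the two values as the reference; swap in another
    # denominator (e.g. y1 alone) if that matches your definition better.
    return 100.0 * np.abs(y1 - y2) / ((np.abs(y1) + np.abs(y2)) / 2.0)

def sample_window_difference(y1, y2, window, n_trials=1000, rng=None):
    """Average percent difference over randomly placed windows of `window` points."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(y1)
    means = np.empty(n_trials)
    for i in range(n_trials):
        start = rng.integers(0, n - window + 1)   # random window start index
        seg1 = y1[start:start + window]
        seg2 = y2[start:start + window]
        means[i] = percent_difference(seg1, seg2).mean()
    return means.mean(), means.std()

# Example: find the smallest window whose average difference stays below 5%,
# assuming 100 points per second as in the question.
# y1, y2 = ...  # the two series on the same time base
# for window in range(10, len(y1), 10):
#     avg, spread = sample_window_difference(y1, y2, window)
#     if avg < 5.0:
#         print(f"{window} points (~{window / 100:.2f} s): {avg:.2f}% +/- {spread:.2f}%")
#         break
```

I sample contiguous windows rather than isolated points because the goal is a minimum amount of time, but I am open to a different approach if that is not the right way to frame it.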
I apologize if the wording is confusing and appreciate any input. Thanks!