Cross-Correlation to determine lag

I have two vectors containing the time points (in seconds) at which a signal was recorded. The first vector holds the time points at which a playback was triggered (roughly every 12 s). The second holds the time points at which I received an answer to that playback. I'm trying to calculate the lag of each answer relative to the onset of the playback. I tried the xcorr function, but I can't make much sense of the result. Any advice on how to go about this? And is it a problem that the second vector is shorter than the first (I didn't always receive an answer to every playback)?
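[Editor's note] xcorr is meant for regularly sampled signals, not lists of event timestamps, so a direct per-event match is usually simpler here. A minimal sketch under that assumption (the variable names and example values are mine, not from the question): assign each answer time to the most recent playback onset at or before it, then subtract.

```matlab
% Assumed inputs: sorted column vectors of onset times in seconds.
playback = [0; 12.1; 24.0; 35.8];   % hypothetical playback onsets
answer   = [0.9; 25.2; 36.5];       % hypothetical answer onsets

% discretize assigns each answer to the half-open interval
% [playback(k), playback(k+1)), i.e. to the playback it follows.
idx = discretize(answer, [playback; Inf]);

% Lag of each answer relative to the preceding playback onset.
lag = answer - playback(idx);
```

Because each answer is matched individually, it does not matter that the answer vector is shorter than the playback vector; playbacks with no answer simply contribute no lag value.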

6 Comments

Of course, it would be easier for people to help you if you attached your two signals.
Thanks, I attached them now.
I am afraid those signals do not help to solve your question unless they have a timestamp.
I was probably a little unclear and "signal" might be the wrong word. The data in the files are the timestamps of the signal onset. I'm interested in the lag of the second variable relative to the first.
So in some cases the second signal comes before the first signal?
Well, the first one basically repeats every 12 s (±2 s) or every 6 s (±1 s), and in between the second one sometimes answers the first. I'm interested in the latency of that answer relative to the onset of the first one.


Answers (0)

Asked: 7 Jul 2016
Last commented: 7 Jul 2016
