Cross-Correlation to determine lag

I have two vectors that contain the time points at which a signal was recorded (in seconds). The first vector contains the time points at which a playback was triggered (roughly every 12 s). The second vector contains the time points at which I received an answer to that playback. I'm trying to calculate the lag of each answer relative to the onset of its playback. I tried the xcorr function, but can't make much sense of the result. Any advice on how to go about this? And is it a problem that the second vector is shorter than the first (I didn't always receive an answer to a playback)?

6 comments

Image Analyst on 7 Jul 2016
It would help people help you if you attached your two signals.
Zenid on 7 Jul 2016
Thanks, I attached them now.
José-Luis on 7 Jul 2016
I'm afraid those signals don't help answer your question unless they have timestamps.
Zenid on 7 Jul 2016
I was probably a little unclear, and "signal" might be the wrong word. The data in the files are the timestamps of the signal onsets. I'm interested in the lag of the second variable relative to the first.
José-Luis on 7 Jul 2016
So in some cases the second signal comes before the first signal?
Zenid on 7 Jul 2016
Well, the first one basically repeats every 12 s (±2 s) or every 6 s (±1 s), and in between the second one sometimes answers the first. I'm interested in the latency of that answer relative to the onset of the first one.
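As described above, these are lists of event timestamps rather than regularly sampled signals, so cross-correlation with xcorr isn't the natural fit. A simpler approach is to match each answer to the most recent playback onset at or before it and subtract. A minimal sketch of that idea in Python (the timestamp values here are made up for illustration; in MATLAB the same matching can be done with find):

```python
from bisect import bisect_right

# Hypothetical example data: playback onsets and answer times, in seconds.
# An answer vector shorter than the playback vector is fine here.
playbacks = [0.0, 12.0, 24.5, 36.0]
answers = [1.3, 25.9, 37.1]

lags = []
for a in answers:
    # Index of the most recent playback at or before this answer
    i = bisect_right(playbacks, a) - 1
    if i >= 0:
        lags.append(a - playbacks[i])

print(lags)  # lag of each answer relative to its preceding playback onset
```

This assumes each answer belongs to the playback immediately preceding it; if answers can arrive more than one playback interval late, an extra check on the maximum plausible lag would be needed.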


Answers (0)

Asked: 7 Jul 2016

Commented: 7 Jul 2016
