How to model a correcting function for shifted data?
Hi all,
I have a sensor reading that incorporates some time delay in its measurements, so the measured values are shifted from the ideal values (illustrated in the figure below). I have a set of data from both the ideal and the measured signals. Later, I want to use a model that can 'correct' my measurement to 'estimate' its ideal value using only a single point of data (from one particular time step).
What technique should I use? Any help will be very much appreciated!
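For concreteness, here is a minimal sketch of one candidate approach I am considering, written in Python with synthetic data (the signals, the delay of 25 samples, and all variable names below are assumptions for illustration, not my real data): if the error is mostly a pure time delay, the delay can be estimated from the paired ideal/measured training data by cross-correlation, and then a single measurement at step i can be read as an estimate of the ideal value at step i - lag.

```python
import numpy as np

# Synthetic stand-in data: an ideal signal and a copy of it delayed
# by 25 samples (the delay is treated as unknown to the estimator).
rng = np.random.default_rng(0)
n = 500
ideal = rng.standard_normal(n)
true_delay = 25
measured = np.concatenate([np.zeros(true_delay), ideal])[:n]

# Cross-correlate the two training signals; the location of the peak
# gives the delay in samples.
corr = np.correlate(measured, ideal, mode="full")
lag = int(corr.argmax()) - (n - 1)
print("estimated delay:", lag)

# With the delay known, a single measurement at time step i serves as
# an estimate of the ideal value at step i - lag:
i = 100
estimate_of_ideal = measured[i]  # estimates ideal[i - lag]
```

This only handles a constant shift in time; if the measured values are also scaled or offset in amplitude, a regression fitted on the (shifted) pairs would be needed on top of this.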
Thanks,
Ghazi
