How to model a correcting function for shifted data?

Hi all,
I have a sensor reading that incorporates some time delay in its measurements, so that its measured values are shifted from the ideal values (illustrated in the figure below). I have a set of data from both the ideal and the measured signals. However, I later want to use a model that can 'correct' my measurement and 'estimate' its ideal value using only a single data point (from one particular time step).
What technique should I use? Any help will be very much appreciated!
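
For illustration, here is a minimal sketch of one possible approach along these lines: fit a static mapping from measured to ideal values with polyfit using the paired data, then apply it to a single new sample. It assumes the measured-to-ideal relationship is roughly one-to-one over the operating range, and all data and variable names below are hypothetical placeholders, not from the original post.

% Synthetic stand-in for the paired calibration data
t            = (1:0.01:10)';                        % time vector
idealVals    = log(t);                              % "ideal" monotone signal
measuredVals = log(t - 0.3) + 0.01*randn(size(t));  % delayed, noisy reading

% Fit a low-order polynomial correction: ideal ~ f(measured)
p = polyfit(measuredVals, idealVals, 3);            % cubic order is an arbitrary choice

% Correct a single measurement taken at one time step
k = 400;                                            % one arbitrary sample index
estimatedIdeal = polyval(p, measuredVals(k));
fprintf('measured %.3f -> estimated ideal %.3f (true %.3f)\n', ...
        measuredVals(k), estimatedIdeal, idealVals(k));

A static point-wise mapping like this can only undo the shift when each measured value corresponds to essentially one ideal value; for oscillatory signals, a model that uses a short window of past samples, or an explicit delay estimate (e.g. finddelay from the Signal Processing Toolbox), would likely be needed.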
Thanks,
Ghazi

Answers (0)

Categories: Multivariate Models
