How to average time data every quarter of an hour?

Hello Community, I am having trouble averaging data every quarter of an hour. I have two arrays of the same length: vector "t" holds the time each data point was recorded and vector "x" holds the data value for each t. The data in "x" is not recorded at a constant period; it is generally every 1 minute, but the gap between points can jump randomly, for instance to 4, 8, 50, or 140 minutes. I'm trying to create a loop that averages the data over specific intervals within each hour.
x=[42,15,4,1,5,9,84,7,45,55,77,5,15,...]
t=[01-Jan-2016 22:24:00,01-Jan-2016 22:25:00,01-Jan-2016 22:26:00,01-Jan-2016 23:00:00,01-Jan-2016 23:04:00,01-Jan-2016 23:04:00,...]
So, I'm looking to write a loop that checks whether each time falls in the 0-15 min, 15-30 min, 30-45 min, or 45-60 min part of the hour and then averages the corresponding values of x.
Or is there a better way to proceed?
Thanks for all the help

Accepted Answer

Walter Roberson
Walter Roberson on 10 Aug 2017
The easiest way is probably to use a timetable() object and call retime(). This requires R2016b or later.
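A minimal sketch of that approach (assuming t has already been converted to a datetime vector, e.g. with datetime(t), and x is a numeric vector of the same length; quarter-hour bins that contain no samples come back as NaN):

TT = timetable(t(:), x(:), 'VariableNames', {'x'});             % collect the data in a timetable
TT15 = retime(TT, 'regular', 'mean', 'TimeStep', minutes(15))   % mean of x in each 15-minute bin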
  1 Comment
Jorge Rodriguez
Jorge Rodriguez on 14 Aug 2017
Thank you Walter, I'm very new to MATLAB and I think retime can help me a lot.


More Answers (1)

Peter Perkins
Peter Perkins on 14 Aug 2017
Walter is right that retime on a timetable makes this almost a one-liner.
But even pre-R2016b without timetables, you don't want a loop. Use discretize to bin the times, and then use something like splitapply to compute the average in each bin. If you have t and x in a table, you can add the bins to the table and use varfun with the bins as a grouping variable.
>> x = [42;15;4;1;5;9];
>> t = {'01-Jan-2016 22:24:00';'01-Jan-2016 22:25:00';'01-Jan-2016 22:26:00';'01-Jan-2016 23:00:00';'01-Jan-2016 23:04:00';'01-Jan-2016 23:04:00'};
>> t = datetime(t);
>> edges = datetime(2016,1,1,22,0:15:120,0)';
>> bin = discretize(t,edges,edges(1:end-1))
bin =
6×1 datetime array
01-Jan-2016 22:15:00
01-Jan-2016 22:15:00
01-Jan-2016 22:15:00
01-Jan-2016 23:00:00
01-Jan-2016 23:00:00
01-Jan-2016 23:00:00
>> data = table(t,x,bin)
data =
6×3 table
             t              x              bin
    ────────────────────    ──    ────────────────────
    01-Jan-2016 22:24:00    42    01-Jan-2016 22:15:00
    01-Jan-2016 22:25:00    15    01-Jan-2016 22:15:00
    01-Jan-2016 22:26:00     4    01-Jan-2016 22:15:00
    01-Jan-2016 23:00:00     1    01-Jan-2016 23:00:00
    01-Jan-2016 23:04:00     5    01-Jan-2016 23:00:00
    01-Jan-2016 23:04:00     9    01-Jan-2016 23:00:00
>> varfun(@mean,data,'GroupingVariables','bin','InputVariables','x')
ans =
2×3 table
            bin             GroupCount    mean_x
    ────────────────────    ──────────    ──────
    01-Jan-2016 22:15:00        3         20.333
    01-Jan-2016 23:00:00        3              5
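For larger data, the splitapply route mentioned above is an alternative to varfun. A rough sketch reusing the bin variable computed with discretize in the example (splitapply needs consecutive integer group numbers, which findgroups provides):

[g, binStart] = findgroups(bin);      % group number for each sample; binStart holds each group's bin edge
binMeans = splitapply(@mean, x, g);   % quarter-hour mean of x for each occupied bin
result = table(binStart, binMeans)    % one row per quarter-hour bin that has data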
  2 Comments
Jorge Rodriguez
Jorge Rodriguez on 14 Aug 2017
Hi Peter, thank you for your help. I just have a question: if I create bins for every group of data, would it consume a lot of memory when working with thousands of groups?
Peter Perkins
Peter Perkins on 15 Aug 2017
It would presumably use a lot less memory than you are already using for your raw data. I can't tell you whether it will work or not because I don't know how much RAM you have or how big your data are.

