# Time series, resample, axes, datetime

ccs on 24 Feb 2015
Commented: dpb on 2 Mar 2015
I have energy-consumption data covering more than 3 years, sampled every two seconds (43200 samples per day), for different households. How do I set the x-axis to 24 hrs (00:00 - 23:59) and plot each day's data on one graph?
I have two variables (power and time). Files are saved by date, and each holds 43200 values for power and time. I also have two variables called start_date and end_date. I would like to select any range of dates and get those dates' data plotted in one graph. Later I will need to resample the time series, color each day's plot, etc. I have attached a figure to explain more or less what I'm trying to do.
I highly appreciate your assistance with this. Thanks

dpb on 24 Feb 2015
Edited: dpb on 24 Feb 2015
Use date numbers...an example for the axes...
>> dn=datenum(2000,1,1,0,0,[0:2:86400-1].'); % a day at 2-sec interval
>> t=dn-dn(1); % make 0-based
>> p=normpdf(t,0.5,.1); % some data to plot...
>> plot(t,p)
>> set(gca,'xtick',[0:2:24]/24) % set tick marks at 2-hr
>> datetick('x','HH:MM','keeplimits','keepticks') % format the time axis
>> set(gca,'xminortick','on') % minor ticks for odd hrs
For the selection, convert the requested time ranges into date numbers, too, and use the logical operations to select between, inclusive or not, etc., ...
NB: When generating series of date numbers, always use integer multiples of the smallest time granule (seconds, as above) rather than dividing out the fraction of an hour or day and using floating-point deltas. The latter ends up with floating-point comparisons that do NOT match owing to rounding, whereas the rounding is consistent inside datenum and friends if you use the integer deltas. As can be seen in the above example, the functions are "smart enough" to know how to wrap times when building the vector.
In your case, you'll be reading a time vector, presumably as string; simply convert it or ignore it and just build the time history as shown.
The example does not use the newer timeseries class; I don't have it in the version installed here. If you choose to use it, some of the machinations are a little different, but the ideas are the same. One thing you do have to be aware of is that there's a whole different set of formatting abbreviations between the two.
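Putting the pieces above together, a minimal sketch of overlaying several days on one 24-hr axis might look like the following; dayData is an assumed cell array where dayData{k} holds one day's 43200 power values:

```matlab
% one common time-of-day axis for every day: 43200 samples at 2-sec spacing
t = (0:2:86400-1).'/86400;            % fraction of a day, 0-based
hold on
for k = 1:numel(dayData)              % dayData{k}: one day's power vector (assumed)
  plot(t, dayData{k})                 % every day lands on the same 24-hr axis
end
hold off
set(gca,'xtick',(0:2:24)/24)          % 2-hr major ticks
datetick('x','HH:MM','keeplimits','keepticks')
set(gca,'xminortick','on')
```

Because every day is plotted against the same 0-to-1 fraction-of-day vector, the daily patterns overlay directly.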

dpb on 2 Mar 2015
..."...task is to create a timeseries that I'm able to select any range of dates and plot their power consumption IN ONE PLOT."
OK, my misunderstanding; I thought your intent was to plot the various days on the single plot as multiple lines overlaid to see the typical daily pattern, not as a single time series.
Similar then, excepting you simply concatenate the times and data from each day into one long vector/array. You don't say how the date is encoded into the file names, but take whatever format that is and convert it to date numbers for the selection logic of which data you need to read. Or, alternatively, read each file once and build a lookup table of the date number corresponding to each file, and use that to select them. If the date stamp on the files corresponds to the actual date, you could get clever and use the OS dir() command to select the ones you want, but the TMW-implemented function in Matlab doesn't have all the niceties built into it, unfortunately.
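As an illustration of that file-selection idea, assuming the files happen to be named like 'yyyy-mm-dd.mat' (the naming pattern here is a guess; substitute whatever format your files actually use):

```matlab
d = dir('*.mat');                                % list the data files
names = {d.name};
% one date number per file, parsed from the name (assumed 'yyyy-mm-dd' pattern)
fileDates = datenum(strrep(names,'.mat',''),'yyyy-mm-dd');
want = fileDates>=d1 & fileDates<=d2;            % d1, d2 from the user as below
filesToRead = names(want);                       % only these need loading
```

The comparison against d1 and d2 is the same inclusive-range logic as iswithin below, just applied to one date number per file instead of per sample.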
But, a couple of hints...to select a range of dates between a start/end date the idea is to take the two dates from the user and do the selection...
d1=datenum(dStart,'formatStyleForInputDate');
d2=datenum(dEnd,'formatStyleForInputDate');
Presuming you've built the aforementioned time vector, then
idx=iswithin(dn,d1,d2);
wantedData=dataArray(idx,:);
will select those values that are within those ranges. Here, again, iswithin is my helper function
function flg=iswithin(x,lo,hi)
% ISWITHIN returns T for values within range of input
% SYNTAX:
%   flg = iswithin(x,lo,hi)
% returns T for x between lo and hi values, inclusive
flg = (x>=lo) & (x<=hi);
end
that returns the logical addressing vector for those elements matching the condition.
To do the decimation, ask for or otherwise decide on the increment needed and convert that number to the number of two-second intervals. 1-min would obviously be every 30 values. So then the decision is whether you just select the points or averages or max or what...points are easy, simply use the above determined interval as the number in the colon expression and
wantedData=wantedData(1:Interval:end,:);
to select those points by row, all columns.
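For instance, converting a requested increment into the number of two-second samples; dtRequested here is a placeholder for whatever increment the user supplies, in seconds:

```matlab
dtRequested = 60;                            % e.g., user asks for 1-min resolution
Interval = dtRequested/2;                    % raw data every 2 sec --> 30 samples
wantedData = wantedData(1:Interval:end,:);   % pointwise decimation, all columns
```

This is plain point selection; averaging over each interval instead is the reshape trick that follows.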
A neat "trick" in Matlab to do things like average over N terms is, since Matlab stores arrays in column-major order, you can reshape an array as
wantedDataAvg=reshape(mean(reshape(wantedData,N,[]),1), ...
              [],size(wantedData,2));
This works by turning the data array into one with N rows, where each column is a block of N consecutive samples; the mean down each column does the averaging, and the outer reshape rearranges that row vector back into the original number of columns, each row now being the average over the desired length. (The number of rows must be an integer multiple of N for the reshape to succeed.)
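A tiny worked case of the reshape trick, averaging a 6-element column in blocks of N = 3:

```matlab
x = (1:6).';                                   % sample column vector
N = 3;
% reshape(x,N,[]) gives [1 4; 2 5; 3 6]; mean down each column averages each block
avg = reshape(mean(reshape(x,N,[]),1), [], size(x,2))
% avg = [2; 5]   (mean of 1:3 and of 4:6)
```

The same expression works unchanged on a multi-column array, since column-major storage keeps each original column's blocks contiguous.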
Hopefully those are some clues...unfortunately, we're going to be leaving town tomorrow for a few days, so it'll be what I can do today until next week sometime for more...maybe somebody else can pick up, or if you get stuck on another specific issue, post that as another question.
Good luck; it seems complicated but you'll get the hang of it here...
ccs on 2 Mar 2015
OK, my misunderstanding; I thought your intent was to plot the various days on the single plot as multiple lines overlaid to see the typical daily pattern, not as a single time series.
Wow...I think I panicked already, or English is also a problem. That is exactly what I intend to do. I don't want to concatenate them.
dpb on 2 Mar 2015
Well, in that case we're back to the previous idea of "every day's the same" excepting for the base date. So the selection is only on the data files, basically, and there the simple way would be to process all the dates once, building a date-number series that spans the overall time of the available data, and use that to look up the actual data.
We're heading out; I think if you just take the basic pieces we've talked about and start, you'll get there..