How to average time data every 1/4 of hour?
Jorge Rodriguez on 10 Aug 2017
Commented: Peter Perkins on 15 Aug 2017
Hello Community, I have a problem averaging data every quarter of an hour. I have two arrays of the same length. Vector "t" holds the times the data was recorded and vector "x" holds the data value at each t. The data in "x" is not recorded at a fixed period; generally it is every 1 min, but the gap between points can jump randomly, for instance to 4, 8, 50, or 140 minutes. I'm trying to create a loop that averages the data at specific intervals within an hour.
x = [42, 15, 4, 1, 5, 9, 84, 7, 45, 55, 77, 5, 15, ...]
t = ['01-Jan-2016 22:24:00', '01-Jan-2016 22:25:00', '01-Jan-2016 22:26:00', '01-Jan-2016 23:00:00', '01-Jan-2016 23:04:00', '01-Jan-2016 23:04:00', ...]
So I'm looking to write a loop that checks whether each time falls in the 0-15, 15-30, 30-45, or 45-60 minute interval of its hour, and then averages the corresponding values of x.
Is there a better way to proceed?
Thanks for all the help
Accepted Answer
Walter Roberson on 10 Aug 2017
The easiest way is probably to use a timetable() object and call retime(). This requires R2016b or later.
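As a minimal sketch of that approach, assuming the t (datetime vector) and x from the question:

```matlab
% Build a timetable keyed on the datetime vector t (R2016b or later)
TT = timetable(t, x);
% Average x over regular 15-minute bins; empty bins come back as NaN
TT15 = retime(TT, 'regular', 'mean', 'TimeStep', minutes(15));
```

Each row of TT15 is then one quarter-hour bin, labeled with the bin's start time.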
More Answers (1)
Peter Perkins on 14 Aug 2017
Walter is right that retime on a timetable makes this almost a one-liner.
But even pre-R2016b without timetables, you don't want a loop. Use discretize to bin the times, and then use something like splitapply to compute averages in each bin. If you have t and x in a table, you can add the bins to the table, and use varfun with them as a grouping variable.
>> x = [42;15;4;1;5;9];
>> t = {'01-Jan-2016 22:24:00';'01-Jan-2016 22:25:00';'01-Jan-2016 22:26:00';'01-Jan-2016 23:00:00';'01-Jan-2016 23:04:00';'01-Jan-2016 23:04:00'};
>> t = datetime(t);
>> edges = datetime(2016,1,1,22,0:15:120,0)';
>> bin = discretize(t,edges,edges(1:end-1))
bin = 
  6×1 datetime array
   01-Jan-2016 22:15:00
   01-Jan-2016 22:15:00
   01-Jan-2016 22:15:00
   01-Jan-2016 23:00:00
   01-Jan-2016 23:00:00
   01-Jan-2016 23:00:00
>> data = table(t,x,bin)
data =
  6×3 table
             t              x             bin         
    ────────────────────    ──    ────────────────────
    01-Jan-2016 22:24:00    42    01-Jan-2016 22:15:00
    01-Jan-2016 22:25:00    15    01-Jan-2016 22:15:00
    01-Jan-2016 22:26:00     4    01-Jan-2016 22:15:00
    01-Jan-2016 23:00:00     1    01-Jan-2016 23:00:00
    01-Jan-2016 23:04:00     5    01-Jan-2016 23:00:00
    01-Jan-2016 23:04:00     9    01-Jan-2016 23:00:00
>> varfun(@mean,data,'GroupingVariables','bin','InputVariables','x')
ans =
  2×3 table
            bin             GroupCount    mean_x
    ────────────────────    ──────────    ──────
    01-Jan-2016 22:15:00        3         20.333
    01-Jan-2016 23:00:00        3              5
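If the data are plain vectors rather than a table, the same grouped means can be computed with discretize plus splitapply directly. A sketch, assuming the t, x, and edges defined above (findgroups renumbers the bins contiguously, as splitapply requires, and skips times outside the edges):

```matlab
% Which quarter-hour bin each time falls in (index into edges)
binIdx = discretize(t, edges);
% Contiguous group numbers 1..N for the occupied bins
g = findgroups(binIdx);
% Mean of x within each occupied bin
means = splitapply(@mean, x, g);
```

For the six sample points this gives the same two values as the varfun result, one per occupied quarter hour.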
2 Comments
Peter Perkins on 15 Aug 2017
It would presumably use a lot less memory than you are already using for your raw data. I can't tell you whether it will work or not, because I don't know how much RAM you have or how big your data are.