Downsampling strain measurement data to save memory space
I currently have a few hundred gigabytes of strain measurement data stored in .mat files, each containing around a hundred million datapoints. The data was collected at a sampling rate of 60 Hz. Since my area of interest lies below 10 Hz, I would like to downsample the data to 20 Hz efficiently to save limited storage space. I have noticed that people use functions such as the following:
% Example: downsampling by a factor of 3 (60 Hz -> 20 Hz)
original = 1:102;
%%
% interp1: linear interpolation onto a coarser grid (no anti-aliasing filter)
downsampled1 = interp1(1:length(original), original, linspace(1, length(original), length(original)/3))
%%
% downsample: keeps every 3rd sample (no anti-aliasing filter)
downsampled2 = downsample(original, 3)
%%
% resample: lowpass FIR filter, then rate change by 1/3
downsampled3 = resample(original, 1, 3)
%%
% spline: cubic spline interpolation onto a coarser grid (no anti-aliasing filter)
downsampled4 = spline(1:length(original), original, linspace(1, length(original), length(original)/3))
%%
% decimate: lowpass filter, then keeps every 3rd sample
downsampled5 = decimate(original, 3)
However, I am at a loss when it comes to understanding the downsides of each of them.
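To make the main downside concrete, here is a minimal sketch (my own toy example, not from my data) showing why the choice matters: with a 60 Hz to 20 Hz reduction, the new Nyquist frequency is 10 Hz, so any content above 10 Hz must be removed before taking every 3rd sample, or it will alias into the band of interest.

```matlab
% Sketch: aliasing when going 60 Hz -> 20 Hz without a lowpass filter.
% A 15 Hz tone lies above the new 10 Hz Nyquist; plain downsampling
% folds it down to 5 Hz, while decimate lowpass-filters it out first.
fs = 60;                               % original sampling rate [Hz]
t  = (0:fs*10-1)/fs;                   % 10 s of data
x  = sin(2*pi*4*t) + sin(2*pi*15*t);   % in-band 4 Hz + out-of-band 15 Hz

aliased  = downsample(x, 3);   % no filtering: the 15 Hz tone aliases to 5 Hz
filtered = decimate(x, 3);     % Chebyshev lowpass filter, then every 3rd sample
% resample(x, 1, 3) behaves similarly but uses an FIR lowpass filter.
```

The interpolation-based approaches (interp1, spline) have the same problem as downsample: they change the sample grid without bandlimiting the signal first.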
What would be a good and efficient way to downsample such big data? Would it be wiser to design my own lowpass filter to replicate what the resample and decimate functions essentially do?
I would be grateful for any advice or lines of code for tackling the matter.
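For scale, the kind of per-file workflow I have in mind might look like the sketch below. It assumes each .mat file holds one vector in a variable called strain and that the files live in a folder called data; those names are placeholders for my actual layout.

```matlab
% Sketch of a per-file 60 Hz -> 20 Hz workflow (variable and folder
% names "strain", "data", and "data_20Hz" are placeholders).
files = dir(fullfile('data', '*.mat'));
for k = 1:numel(files)
    src = fullfile(files(k).folder, files(k).name);
    S   = load(src, 'strain');            % load only the needed variable
    y   = decimate(double(S.strain), 3);  % decimate requires double input;
                                          % lowpass-filters, then keeps every 3rd sample
    strain20 = single(y);                 % halve storage if the precision suffices
    save(fullfile('data_20Hz', files(k).name), 'strain20', '-v7.3');
end
```

If a single variable is too large to load at once, matfile would allow reading it in chunks, though the lowpass filtering would then need care at chunk boundaries.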
Thanks
Accepted Answer