I am running a model that produces thousands of CSV files, which I need to read into MATLAB. This particular run generated 27,178 files.
After 18,839 files, MATLAB gave me an 'out of memory' error. Could anybody suggest a solution, or a more effective way of coding this so that all the files can be included?
Error using readtable (line 216)
Out of memory. Type "help memory" for your options.
filelist = dir('*.csv');
num_files = length(filelist);
[~, index] = natsort({filelist.name});   % natsort: natural-order sort (File Exchange function)
filelist = filelist(index);
particledata = cell(num_files, 1);       % every table is held in memory at once
for a = 1:num_files
    particledata{a} = readtable(filelist(a).name);
end
ymax = -0.13;
grains = zeros(num_files, 1);            % preallocate instead of growing the array in the loop
for b = 1:num_files
    y = particledata{b}(:,6);            % column 6 holds the y values
    y_array = table2array(y);
    grains(b) = sum(y_array < ymax);
end
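
One possible fix (a sketch, assuming column 6 holds the y values as in the code above): the out-of-memory error comes from storing all 27,178 tables in the `particledata` cell array at once. Since only the per-file count is ultimately needed, the two loops can be merged so each table is read, reduced to a single number, and then discarded:

```matlab
filelist = dir('*.csv');
[~, index] = natsort({filelist.name});   % natsort: File Exchange function, as in the original
filelist = filelist(index);
num_files = numel(filelist);

ymax = -0.13;
grains = zeros(num_files, 1);            % preallocate the result vector
for a = 1:num_files
    t = readtable(filelist(a).name);     % read one file at a time
    grains(a) = sum(t{:,6} < ymax);      % keep only the count of rows below ymax
    clear t                              % the table is released each iteration
end
```

This keeps peak memory at roughly the size of one file rather than all of them. If the files are purely numeric, `readmatrix` (R2019a and later) may also be lighter than `readtable`; and for very large collections, MATLAB's `tabularTextDatastore` is designed to iterate over many files without loading them all.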