Speeding up data processing for a large data set
Hello,
Would anybody be able to help me speed up this code? It simply processes data by looking for a change in the table values (initTab.Val) from a 1 to a 0, then places the corresponding time value (initTab.Time) in a different table.
The data types for both columns are double.
The table size could be up to 40e6-by-2; the table I'm currently working on is 6e6-by-2.
I ran a small experiment to see whether the execution time scales linearly with the initial table height: 50,000 rows took 7 seconds, whereas 500,000 rows took nearly 2 minutes, so not quite linear.
My computer has plenty of CPU and RAM left over; while the code is running, only about 30% of each is in use.
StartLine = 20;
initTabHeight = height(initTab);
Tab2Height = initTabHeight/2;                                   % upper bound on the output table height
Tab2SZ = [Tab2Height, 2];                                       % table dimensions: height by width
Tab2VarTypes = ["double", "double"];
Tab2 = table('Size', Tab2SZ, 'VariableTypes', Tab2VarTypes);    % preallocate the output table
PreVal = initTab.Val(StartLine);                                % first value to compare against in the loop below
Tab2LineNum = 1;
for i = StartLine+1:initTabHeight                               % compare each row with the previous one
    CurrentVal = initTab.Val(i);
    if PreVal > CurrentVal                                      % 1 -> 0 transition
        Tab2{Tab2LineNum, 1} = initTab.Time(i);
        Tab2LineNum = Tab2LineNum + 1;
    end
    PreVal = CurrentVal;
    i                                                           % unsuppressed: prints the loop index every iteration
end
Tab2(Tab2LineNum:Tab2Height, :) = [];                           % trim the unused rows
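Two things in the loop above are typical MATLAB slowdowns: the unsuppressed i prints the loop index to the Command Window on every iteration, and every pass through the loop indexes into a table (initTab.Val(i), Tab2{...}), which is much slower than indexing into plain double arrays. A minimal sketch of the same logic working on extracted vectors instead of tables (the names Val, Time, edgeTimes and k are illustrative, not from the original code):
Val  = initTab.Val;                    % extract the table variables once;
Time = initTab.Time;                   % indexing plain vectors in the loop is far cheaper
n = numel(Val);
edgeTimes = zeros(n, 1);               % preallocate an upper bound
k = 0;
for i = StartLine+1:n
    if Val(i-1) > Val(i)               % 1 -> 0 transition
        k = k + 1;
        edgeTimes(k) = Time(i);
    end
end
edgeTimes = edgeTimes(1:k);            % trim the unused rows
Tab2 = table(edgeTimes, zeros(k, 1));  % rebuild the two-column table once at the end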
1 Comment
Mathieu NOE
on 2 Dec 2020
Hello,
for loops are usually a bottleneck; you can check that with the code profiler.
I would try to remove the for loop.
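Following that suggestion, the falling edges can be found without a loop using diff and logical indexing; a minimal sketch, assuming the same StartLine offset as the original code (the names v, t, isFall and edgeTimes are illustrative):
v = initTab.Val(StartLine:end);
t = initTab.Time(StartLine:end);
isFall = diff(v) < 0;                            % true where the value drops (1 -> 0)
edgeTimes = t([false; isFall]);                  % shift by one row: the time is taken at the row after the drop
Tab2 = table(edgeTimes, zeros(size(edgeTimes)));
The diff(v) < 0 test reproduces the PreVal > CurrentVal comparison from the question, so the result should match the loop row for row.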
Answers (0)