Reading a large Excel file using the "read" command takes ~5 minutes. Is this expected performance?

I am reading simulation/test output data from a .xlsx file into a MATLAB table through a datastore. The test data contains 450+ variables, each with 20,000+ samples (i.e., 450+ columns and 20,000+ rows), all numeric. I created a datastore on the Excel file, modified the selected variable and variable type properties, and used the read command to read the file into a MATLAB table; it took about 5 minutes. When I tried the readtable command on the Excel file directly, it took about the same time. However, when I imported the file interactively using the MATLAB import dialog, it took less than 30 seconds, so I am wondering if there is any way to achieve the same level of efficiency programmatically.
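Roughly what I am running (the file name below is just a placeholder, and the type assignment is illustrative):

numVars = 450;                                  % ~450 columns, 20,000+ rows, all numeric
ds = spreadsheetDatastore('test_output.xlsx');  % datastore on the Excel file
ds.SelectedVariableNames = ds.VariableNames;    % adjust the selected variables
ds.SelectedVariableTypes(:) = {'double'};       % force all columns to numeric
T = read(ds);                                   % takes ~5 minutes

% Reading the file directly is just as slow:
T = readtable('test_output.xlsx');              % also ~5 minutes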

Accepted Answer

J. Alex Lee on 6 Sep 2020
Try manually creating the import options with spreadsheetImportOptions().
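Something like this, for example (the sheet name, ranges, and column count below are guesses from your description, so adjust them to match your file):

numVars = 450;                                        % number of data columns
opts = spreadsheetImportOptions('NumVariables', numVars);
opts.Sheet = 'Sheet1';                                % guess; set to your actual sheet
opts.VariableNamesRange = 'A1';                       % row holding the column headers
opts.DataRange = 'A2';                                % data begins below the headers
opts.VariableTypes = repmat({'double'}, 1, numVars);  % all columns are numeric
T = readtable('test_output.xlsx', opts);              % no auto-detection pass

With everything specified up front, readtable does not need to scan the file to detect variable names and types, which is typically where most of the time goes.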
  2 Comments
Ajay Kumar on 7 Sep 2020
Thanks. I will read up on this function and try it out. The format of the test output sheet will not change that often, but if it does, will I have to update the options object that I create?
J. Alex Lee on 7 Sep 2020
Yes, the idea is to fully specify the import parameters so that they don't have to be auto-detected.
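If the layout does change now and then, one option is to re-detect once and cache the resulting options object (file names here are placeholders):

% Run once whenever the sheet layout changes (slow: performs auto-detection)
opts = detectImportOptions('test_output.xlsx');
save('importOpts.mat', 'opts');

% Every later import (fast: no detection)
load('importOpts.mat', 'opts');
T = readtable('test_output.xlsx', opts);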



Release

R2017b
