Working with data files of 1 TB or larger
I would like to know the best way to work with very large data files ranging from 1 TB to 30 TB. The files contain in-phase and quadrature (IQ) data captured from a real-time signal analyzer. I am running MATLAB R2014b on a Windows 8 64-bit computer with 64 GB of RAM.
I would like to read in the data files to conduct basic signal processing and analysis such as FFTs, as well as more RF-specific routines such as error vector magnitude, adjacent channel power ratio, etc.
I am not familiar with MATLAB's parallel computing capabilities or with its 'big data' capabilities such as mapreduce, memmapfile, or datastore.
Any information, feedback, or suggestions on recommended practices would be most welcome.
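For reference, the only approach I can think of is reading the file in blocks with fread and processing each block in turn, something like the rough sketch below (the file name and the interleaved single-precision I/Q format are just placeholders for this example). I don't know whether this scales sensibly to tens of terabytes or whether memmapfile, datastore, or mapreduce would be a better fit:
% Rough sketch of block-by-block processing with fread (base MATLAB only).
% Placeholder assumptions, not from my actual capture: 'capture.bin' is the
% file name and the data is stored as interleaved single-precision I,Q samples.
fname   = 'capture.bin';
nfft    = 2^20;                                   % samples per block
w       = 0.5 - 0.5*cos(2*pi*(0:nfft-1).'/nfft);  % Hann window (built by hand)
psd     = zeros(nfft, 1);                         % running spectrum average
nBlocks = 0;

fid = fopen(fname, 'r');
while true
    raw = fread(fid, 2*nfft, 'single=>single');   % one block of interleaved I,Q
    if numel(raw) < 2*nfft, break; end            % skip the partial last block
    iq  = complex(raw(1:2:end), raw(2:2:end));    % de-interleave into complex
    X   = fft(double(iq) .* w);                   % windowed FFT of this block
    psd = psd + abs(X).^2;                        % accumulate power spectrum
    nBlocks = nBlocks + 1;
end
fclose(fid);
psd = psd / max(nBlocks, 1);                      % averaged periodogram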
thanks, JimB
3 Comments
yashwanth annapureddy
on 19 Nov 2014
Yes, it would be good to know what type of files you are dealing with. datastore and mapreduce work with tabular text files and with MAT-files of a specific format.
Please refer to the documentation for datastore and mapreduce, and let us know if you have any questions about using them.
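For example, if your IQ samples were exported to delimited text, a chunked read with datastore might look roughly like this (the file pattern and column names are placeholders, not something from your capture):
% Rough sketch: chunked processing with a tabular text datastore (R2014b).
% Placeholder assumptions: CSV files named 'capture*.csv' with columns I and Q.
ds = datastore('capture*.csv');
ds.SelectedVariableNames = {'I', 'Q'};
ds.ReadSize = 1e5;                      % rows per chunk

while hasdata(ds)
    t  = read(ds);                      % next chunk as a table
    iq = complex(t.I, t.Q);             % build complex samples
    X  = fft(iq);                       % process this chunk (e.g., FFT)
    % ... accumulate or analyze X here ...
end
If the captures are flat binary files instead, memmapfile (which you also mention) may be the closer fit, since it lets you index into the file without loading it all into memory.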
Answers (1)
Darek
on 14 Nov 2014
Don't use MATLAB. It's a waste of your time. Use AWS Kinesis with Redshift.