MATLAB program out of memory on 64 GB RAM Linux but not on 8 GB RAM Windows
I get unexpected out-of-memory errors when running MATLAB R2013a (64-bit) on a supercomputer. My program uses no more than 5 GB of memory, far less than what is available on a node of this cluster in either RAM (64 GB per node) or swap space. The same program runs fine on a personal Windows computer with only 8 GB of RAM. I cannot check how much memory MATLAB is allowed to use, because the 'memory' command is unavailable on Unix platforms. My stack size is set to unlimited, although I am not sure whether that has any effect on MATLAB. Could you offer any assistance? Is there a way to check how much memory MATLAB can use, and if it is limited, a way to raise the limit?
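For reference, here is how I have been inspecting memory from within MATLAB instead (a minimal sketch, assuming a Linux node where 'free' and the /proc filesystem are available, since 'memory' is Windows-only):

    % Inspect system memory and per-process limits from within MATLAB on Linux
    system('free -g');                     % total/used/free RAM and swap, in GB
    system('ulimit -a');                   % resource limits inherited by MATLAB's shell
    disp(fileread('/proc/self/limits'));   % kernel-reported limits for the MATLAB process

I mention this because, as I understand it, cluster job schedulers often impose a per-job virtual memory limit that is lower than the node's physical RAM, which could explain an out-of-memory error well below 64 GB.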
To be more specific, I get the out-of-memory errors when calling the "train" function of the Neural Network Toolbox. The error's stack trace points to a function with "mex" in its name; MEX stands for MATLAB Executable, i.e., compiled C/C++ code callable from MATLAB. I wonder whether MATLAB hands part of the computation off to compiled C code and that code runs out of memory. I had thought an unlimited stack size would prevent such a scenario, but perhaps the relevant limit is on heap or virtual memory instead. If anyone has experience running MATLAB on Linux from a terminal, I would appreciate any advice.
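One workaround I am considering is the toolbox's memoryReduction setting, which, as I understand it, trades training speed for lower peak memory use. A sketch of what I would try (assuming a feedforward network and my own inputs x and targets t; I have not confirmed this helps with the MEX code path):

    net = feedforwardnet(10);              % example network with 10 hidden neurons
    net.efficiency.memoryReduction = 4;    % split gradient computations into 4 chunks
    [net, tr] = train(net, x, t);          % x: inputs, t: targets

I would welcome confirmation on whether this setting affects the MEX-based training routines at all.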