Solution of large sparse matrix systems using GPU MLDIVIDE
I have a sparse 1-million-by-1-million matrix system that I wish to solve repeatedly in a loop (using MLDIVIDE, i.e. the '\' operator). My CPU takes about 250 s to solve it once, and RAM usage rises to 40 GB during the process. This has made me wonder: is it possible to solve this system on a 4 GB GPU using the GPU MLDIVIDE? Would solving it on the GPU be faster, or does that not make sense? I have read that GPUs are good for highly parallel operations. My GPU is a GeForce GTX 1050 Ti with 4 GB of memory; my CPU is an i7-9700 at 3 GHz with 8 cores and 64 GB RAM.
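For reference, a minimal sketch of the setup described in the question. The variable names and the matrix construction are illustrative assumptions, not the asker's actual data; `sprandsym` plus a shifted identity just produces a well-conditioned sparse symmetric example:

```matlab
% Illustrative setup: a large sparse system solved repeatedly in a loop.
% (n, A, b are hypothetical stand-ins for the asker's actual problem.)
n = 1e6;
A = sprandsym(n, 5/n) + 10*speye(n);   % example sparse symmetric matrix
b = rand(n, 1);

for k = 1:10
    x = A \ b;    % each call re-factorizes A from scratch --
                  % this repeated factorization is the ~250 s cost
end
```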
Accepted Answer
Edric Ellis
on 16 Jan 2020
A couple of suggestions:
- On the CPU, if you're repeatedly solving the same system, you might be able to benefit from the recently-introduced decomposition object.
- On the GPU, it's hard to say without knowing the exact details whether or not the GPU will be of benefit in this case, so perhaps it would be best to get a Parallel Computing Toolbox trial licence to enable you to experiment.
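The two suggestions above can be sketched as follows. This assumes `A` and `b` as in the question; `decomposition` (introduced in R2017b) and `gpuArray` are real MATLAB functions, but whether the GPU path works for a matrix this size on 4 GB of memory is exactly the open question, so treat the second part as an experiment rather than a recipe:

```matlab
% 1) CPU: factor once with a decomposition object, reuse it in the loop.
dA = decomposition(A);      % factorization computed once, up front
for k = 1:10
    x = dA \ b;             % each solve reuses the stored factors
end

% 2) GPU (requires Parallel Computing Toolbox): move the data over and
% try mldivide. Caveat: the factors of a 1e6-by-1e6 sparse matrix are
% unlikely to fit in 4 GB of GPU memory, so this may simply error out.
Ag = gpuArray(A);
bg = gpuArray(b);
xg = Ag \ bg;
x  = gather(xg);            % bring the result back to host memory
```

A decomposition object pays the factorization cost once, so if the loop solves against the same matrix with different right-hand sides, each subsequent solve is much cheaper than a fresh `A \ b`.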