
How to check if the parallel server GPU can handle a particular network

I am running a hyperparameter sweep over different network parameters: the number of LSTM layers, the number of hidden units in each layer, and what I call a shape parameter (whether the number of hidden units increases, decreases, or stays the same in subsequent layers).
I fully expect some of the settings to be too large for the GPU, and I see this in the GPU out-of-memory errors that are returned.
What I would like to do is the following: have a function estimate how much GPU memory a network will take by looking at its layers, and then update the training options to use the CPU if the network is too large. See the pseudo-code below, where 'i' represents one set of hyperparameters.
How can I do this on my machine? Also, how can I do this on MATLAB Parallel Server?
if GPUmemory < NETmemory(layers{i})
    trainingoptions{i} = trainingOptions( ... , 'ExecutionEnvironment', 'cpu', ... );
else
    trainingoptions{i} = trainingOptions( ... , 'ExecutionEnvironment', 'gpu', ... );
end
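One way the pseudo-code above might be filled in is sketched below. Here "gpuDevice" and "trainingOptions" are real MATLAB functions; "estimateNetMemory" is a hypothetical helper you would still have to write (for example, from the learnable parameter count), and 'adam' is just a placeholder solver:

```matlab
% Sketch, not a tested implementation.
% gpuDevice returns a device object whose AvailableMemory and
% TotalMemory properties are reported in bytes.
g = gpuDevice;

% estimateNetMemory is a hypothetical helper: estimate the bytes
% needed to train the network defined by layers{i}.
if estimateNetMemory(layers{i}) > g.AvailableMemory
    opts{i} = trainingOptions('adam', 'ExecutionEnvironment', 'cpu');
else
    opts{i} = trainingOptions('adam', 'ExecutionEnvironment', 'gpu');
end
```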

Accepted Answer

Vatsal on 6 Oct 2023
Hi Miles Brim,
I understand that you would like a function that estimates a network's memory usage from its layers and updates the training options to use the CPU when the requirement is too high. To check the size of a network object, you can use the "whos" function, which reports the memory a workspace variable occupies in bytes. To query the available GPU devices and their memory, you can use the "gpuDevice" function.
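As a rough sketch of how "whos" and "gpuDevice" could be combined (assuming a network object named net already exists in the workspace):

```matlab
% Sketch: compare a network object's size against free GPU memory.
s = whos('net');        % struct array; s.bytes is the variable's size in bytes
netBytes = s.bytes;

g = gpuDevice;          % current GPU device object
fprintf('Network object: %.1f MB, GPU free: %.1f MB\n', ...
    netBytes/2^20, g.AvailableMemory/2^20);
```

Note that "whos" only measures the network object itself; training additionally needs memory for activations, gradients, and mini-batches, so treat this as a lower bound rather than a full estimate.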
You can also refer to the MATLAB documentation for "gpuDevice" for more information on its usage and syntax.

