MATLAB Answers

How can one utilize a dropout layer in a neural network during prediction?

46 views (last 30 days)
Dino Bellugi
Dino Bellugi on 5 Feb 2020
Answered: Sourav Bairagya on 10 Feb 2020
I was hoping to use dropout layers at prediction time with an LSTM network in order to get confidence intervals.
Apparently, dropout layers only randomly set connections to 0 during training time.
From the dropout reference:
"A dropout layer randomly sets input elements to zero with a given probability. At training time, the layer randomly sets input elements to zero given by the dropout mask rand(size(X))<Probability, where X is the layer input and then scales the remaining elements by 1/(1-Probability). This operation effectively changes the underlying network architecture between iterations and helps prevent the network from overfitting. A higher number results in more elements being dropped during training. At prediction time, the output of the layer is equal to its input."
This explains why repeated calls to predict with the same input yield the same output.
Has anyone come up with a workaround?
Thank you for your help,
-Dino


Answers (1)

Sourav Bairagya
Sourav Bairagya on 10 Feb 2020
Usually, dropout layers are used during training to avoid overfitting of the neural network. Currently, the 'dropoutLayer' of Deep Learning Toolbox does not perform dropout during prediction. If you want to use dropout during prediction, you can write a custom dropout layer which performs dropout in both the 'forward' and 'predict' methods.
You can refer to this link to get an idea about writing custom layers:
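A minimal sketch of such a custom layer, assuming a recent release of Deep Learning Toolbox that supports custom layers (class and property names here are illustrative). Because a custom layer's 'forward' method falls back to 'predict' when it is not defined, implementing the dropout mask in 'predict' applies it at both training and prediction time:

```matlab
% mcDropoutLayer.m - sketch of a dropout layer active at prediction time,
% enabling Monte Carlo dropout. Not an official toolbox layer.
classdef mcDropoutLayer < nnet.layer.Layer
    properties
        Probability % dropout probability, e.g. 0.5
    end
    methods
        function layer = mcDropoutLayer(probability, name)
            layer.Name = name;
            layer.Probability = probability;
            layer.Description = "MC dropout, p = " + probability;
        end
        function Z = predict(layer, X)
            % Same inverted dropout as at training time: zero elements
            % with the given probability, rescale the survivors.
            mask = rand(size(X), 'like', X) >= layer.Probability;
            Z = X .* mask ./ (1 - layer.Probability);
        end
    end
end
```

With this layer in the network, repeated calls to predict on the same input produce different outputs; the spread of those outputs (e.g. mean and standard deviation over many calls) can then be used to form the confidence intervals asked about above.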

