MATLAB Answers

Why are the results of forward and predict very different in deep learning?

cui
cui on 20 Jun 2020
Answered: cui on 12 Jul 2020
When I use a "dlnetwork" deep neural network model to make predictions, the results of the two functions predict and forward are very different. As I understand it, the only difference should be that predict freezes the batchNormalizationLayer and dropout layers, while forward does not freeze them, because forward is the forward pass used during the training phase.
As the two attached screenshots show, the first 10 outputs differ by orders of magnitude. Where is the problem? A minimal sketch of the comparison is shown below.
All my data is here.
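For reference, here is a minimal sketch of the comparison I am making (the layers and sizes here are placeholders, not my actual network):

layers = [
    imageInputLayer([28 28 1],'Normalization','none','Name','in')
    convolution2dLayer(3,8,'Padding','same','Name','conv')
    batchNormalizationLayer('Name','bn')
    reluLayer('Name','relu')
    fullyConnectedLayer(10,'Name','fc')];
dlnet = dlnetwork(layerGraph(layers));

dlX = dlarray(rand(28,28,1,16,'single'),'SSCB');

% Inference path: batch norm normalizes with the stored
% TrainedMean/TrainedVariance (and dropout layers, if any, are disabled).
dlYpredict = predict(dlnet,dlX);

% Training path: batch norm normalizes with this mini-batch's statistics
% and an updated state is returned.
[dlYforward,state] = forward(dlnet,dlX);

% The two outputs differ whenever the stored statistics do not match the
% statistics of the current mini-batch.
max(abs(extractdata(dlYpredict) - extractdata(dlYforward)),[],'all')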


Answers (2)

vaibhav mishra
vaibhav mishra on 30 Jun 2020
Hi there,
In my opinion, you are using BatchNorm during training but not during testing, so you cannot expect the same results from both. You need to apply batch normalization at test time as well, with the same parameters as in training.
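Roughly like this, assuming a custom training loop (a sketch only; modelGradients and the variable names are placeholders for your own code):

% In each iteration, forward (called inside modelGradients) returns the
% updated batch normalization statistics as "state".
[gradients,state,loss] = dlfeval(@modelGradients,dlnet,dlX,dlT);

% Write the running statistics back into the network. If this step is
% skipped, predict keeps normalizing with the network's initial statistics.
dlnet.State = state;

% ... update the learnables with adamupdate/sgdmupdate ...

% At test time, predict then normalizes with the statistics accumulated
% during training.
dlYtest = predict(dlnet,dlXTest);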

  1 Comment

cui
cui on 7 Jul 2020
Thank you for your reply! But doesn't the predict method of dlnetwork freeze the BatchNorm mean and variance during inference?
1. If the BN statistics are frozen, why is the second output, state, returned by predict empty?
2. If I want the inference model to use the BatchNorm statistics from the training phase, how should the code be modified?
I sincerely hope to get your reply, thank you!
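To make question 1 concrete, this is the call I mean (dlnet and dlX as in my question):

[dlY,state] = predict(dlnet,dlX);
% "state" comes back empty here, even though I expected the frozen batch
% norm mean/variance.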



cui
cui on 12 Jul 2020
I wrote an analysis blog post on this issue; see the attached link. The question that still bothers me is how batchnorm() behaves in forward versus predict.
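For what it is worth, my current reading of the two batchnorm() syntaxes, written as a sketch with placeholder variable names (please correct me if this is wrong):

dlX    = dlarray(randn(8,8,4,16,'single'),'SSCB');
offset = zeros(4,1,'single');
scale  = ones(4,1,'single');
runningMu  = zeros(4,1,'single');
runningVar = ones(4,1,'single');

% "forward"-style call: normalize with the statistics of this mini-batch
% and also return updated running statistics (what forward() stores in the
% network state).
[dlYtrain,newMu,newVar] = batchnorm(dlX,offset,scale,runningMu,runningVar);

% "predict"-style call: normalize with the supplied trained statistics;
% nothing is updated.
dlYinfer = batchnorm(dlX,offset,scale,runningMu,runningVar);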


Release

R2020a
