Deep Learning in custom training loop with output layer replacement during training

Hi there!
I'm working on an SGAN network that can classify an unlabeled data set, and I found an excellent summary article here.
One of the SGAN examples assumes that I use a lambda function to add an extra output layer during the training loop at each iteration, something like this (in the article, see the "Stacked Discriminator Models With Shared Weights" description):
  • Train the supervised network (multiple classes, cross-entropy output)
  • Add a sigmoid output to the supervised network
  • Train the unsupervised network (with the new additional output layer)
In the article and the original paper they use Keras and a lambda function for the output layer. How can I make that change to the network in MATLAB during a custom training loop? Replacing a layer during training (with the replaceLayer function) reduces performance dramatically, so I'm looking for a lambda analog for this type of training; a rough sketch of what I mean is below.
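To clarify what I am hoping for, here is roughly the pattern I would like to end up with (only a sketch of my idea, untested; the function and variable names are mine): keep one shared discriminator dlnetwork that returns raw logits, and form both "heads" inside the model loss function instead of calling replaceLayer.

function [loss, gradients] = modelLoss(netD, XLabeled, TLabels, XUnlabeled, isReal)
    % Supervised head: softmax over the shared logits + cross-entropy
    % (TLabels are one-hot targets; netD itself has no softmax/sigmoid layer)
    logitsSup = forward(netD, XLabeled);
    lossSup = crossentropy(softmax(logitsSup), TLabels);

    % Unsupervised head: the article's custom activation applied to the same
    % logits, D(x) = Z(x)/(Z(x)+1) with Z(x) = sum_k exp(logit_k)
    logitsUnsup = forward(netD, XUnlabeled);
    Z = sum(exp(logitsUnsup), 1);    % assumes 'CB' logits, classes along dim 1
    DReal = Z ./ (Z + 1);
    lossUnsup = -mean(isReal .* log(DReal + eps) ...
        + (1 - isReal) .* log(1 - DReal + eps), 'all');

    % One backward pass through the shared weights covers both objectives
    loss = lossSup + lossUnsup;
    gradients = dlgradient(loss, netD.Learnables);
end

% called inside the training loop as:
% [loss, gradients] = dlfeval(@modelLoss, netD, XLabeled, TLabels, XUnlabeled, isReal);

That way the sigmoid-like output never exists as a layer at all, so nothing has to be replaced between iterations.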
Thank you!
P.S. Attached is a screenshot of the SGAN training procedure from the book "GANs in Action".
  1 Comment
Kirill on 29 Mar 2022
Edited: Kirill on 29 Mar 2022
There is an example of a network with two outputs; I think it can be used, but it would be a different approach than the one explained in the book and article: documentation. A rough sketch of what I mean is below.
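Something along these lines, based on that multiple-output example (just a sketch, untested; layer sizes and names are placeholders): one shared trunk, a softmax branch for the supervised classes, and a separate sigmoid branch for the real/fake score.

layers = [
    imageInputLayer([28 28 1], 'Normalization', 'none', 'Name', 'in')
    convolution2dLayer(3, 64, 'Padding', 'same', 'Name', 'conv1')
    leakyReluLayer(0.2, 'Name', 'lrelu1')
    fullyConnectedLayer(10, 'Name', 'fc_shared')];   % 10 = number of classes
lgraph = layerGraph(layers);

% Supervised branch: class probabilities
lgraph = addLayers(lgraph, softmaxLayer('Name', 'sm_supervised'));
lgraph = connectLayers(lgraph, 'fc_shared', 'sm_supervised');

% Unsupervised branch: single real/fake probability
lgraph = addLayers(lgraph, [
    fullyConnectedLayer(1, 'Name', 'fc_unsupervised')
    sigmoidLayer('Name', 'sig_unsupervised')]);
lgraph = connectLayers(lgraph, 'fc_shared', 'fc_unsupervised');

netD = dlnetwork(lgraph);

% In the custom training loop both outputs come from one forward call:
% [YSup, YUnsup] = forward(netD, dlX, 'Outputs', {'sm_supervised', 'sig_unsupervised'});

The difference from the book and article is that the real/fake score gets its own fully connected weights here instead of being computed from the supervised logits.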


Answers (0)
