MATLAB Answers

How to add a label input to a custom intermediate layer in deep learning?

cui on 22 Nov 2020
Edited: cui on 22 Nov 2020
classdef myLayer < nnet.layer.Layer

    properties
        % (Optional) Layer properties.
        % Layer properties go here.
    end

    properties (Learnable)
        % (Optional) Layer learnable parameters.
        % Layer learnable parameters go here.
    end

    methods
        function layer = myLayer()
            % (Optional) Create a myLayer.
            % This function must have the same name as the class.
            % Layer constructor function goes here.
        end

        function [Z1, ..., Zm] = predict(layer, X1, ..., Xn)
            % Forward input data through the layer at prediction time and
            % output the result.
            %
            % Inputs:
            %   layer       - Layer to forward propagate through
            %   X1, ..., Xn - Input data
            % Outputs:
            %   Z1, ..., Zm - Outputs of layer forward function

            % Layer forward function for prediction goes here.
        end

        function [Z1, ..., Zm, memory] = forward(layer, X1, ..., Xn)
            % (Optional) Forward input data through the layer at training
            % time and output the result and a memory value.
            %
            % Inputs:
            %   layer       - Layer to forward propagate through
            %   X1, ..., Xn - Input data
            % Outputs:
            %   Z1, ..., Zm - Outputs of layer forward function
            %   memory      - Memory value for custom backward propagation

            % Layer forward function for training goes here.
        end

        function [dLdX1, ..., dLdXn, dLdW1, ..., dLdWk] = ...
                backward(layer, X1, ..., Xn, Z1, ..., Zm, dLdZ1, ..., dLdZm, memory)
            % (Optional) Backward propagate the derivative of the loss
            % function through the layer.
            %
            % Inputs:
            %   layer             - Layer to backward propagate through
            %   X1, ..., Xn       - Input data
            %   Z1, ..., Zm       - Outputs of layer forward function
            %   dLdZ1, ..., dLdZm - Gradients propagated from the next layers
            %   memory            - Memory value from forward function
            % Outputs:
            %   dLdX1, ..., dLdXn - Derivatives of the loss with respect to the
            %                       inputs
            %   dLdW1, ..., dLdWk - Derivatives of the loss with respect to each
            %                       learnable parameter

            % Layer backward function goes here.
        end
    end
end
Issue:
In the signature function [Z1, ..., Zm] = predict(layer, X1, ..., Xn), how can Xn be defined as the input label rather than the usual convolution activation feature map?
Even passing a one-hot label does not work: because the one-hot encoding contains zeros, the checkLayer test reports an error, which is very painful...
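One possible direction, assuming the R2020b multiple-input custom layer API: give the layer two input ports by setting NumInputs in the constructor, then treat the second input as the label tensor. The class name labelAwareLayer and the element-wise use of the labels below are illustrative assumptions, not a documented MathWorks answer to this exact question:

```matlab
classdef labelAwareLayer < nnet.layer.Layer
    % Sketch of a custom layer that receives activations (X1) and a
    % label tensor (X2) through two separate input ports.
    methods
        function layer = labelAwareLayer(name)
            layer.Name = name;
            layer.NumInputs = 2;   % port 1 = features, port 2 = labels
        end

        function Z = predict(layer, X1, X2)
            % Illustrative operation only: mask the activations with the
            % one-hot labels. Replace with whatever the layer should do.
            Z = X1 .* X2;
        end
    end
end
```

The label tensor then enters the network through its own input layer and is wired to the layer's second port in the layerGraph. When validating, pass one size per input and specify the observation dimension, e.g. checkLayer(labelAwareLayer("lab"), {[10 1],[10 1]}, 'ObservationDimension', 2). If the zeros in the one-hot input still trip the numeric gradient checks, a custom training loop with a dlnetwork, where the labels go into the loss function instead of a layer, may be the more robust route.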

Answers (0)

Release

R2020b
