After groupedConvolution2dLayer in network branches, a corresponding depthConcatenationLayer is needed
Hi, All!
I use groupedConvolution2dLayer in my network to process 5 image fragments in parallel, so I am using 5 groups of channels.
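For concreteness, the grouped convolution in one branch could be created as follows (the filter size, filter count, and layer name below are illustrative placeholders, not the values from my actual network):

    % Illustrative only: 3x3 filters, 8 filters per group, 5 channel groups
    gconv = groupedConvolution2dLayer(3, 8, 5, 'Name', 'gconv_branch1');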
My network has three branches, and I need to concatenate their outputs. MATLAB provides a depthConcatenationLayer for this purpose. However, I need the channels concatenated in a particular order: first the channels of group #1 of all three branches, then the channels of group #2 of all three branches, ..., and finally the channels of group #5 of all three branches. How can I do this? MATLAB's depthConcatenationLayer does not allow specifying the order of the channels.
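To make the desired ordering explicit, the following sketch computes the channel permutation that would turn the output of a plain depth concatenation into the group-wise interleaved order described above (it assumes every branch contributes the same number of channels per group; the counts are illustrative):

    % Illustrative counts: 3 branches, 5 groups, 4 channels per group per branch
    numBranches = 3; numGroups = 5; chPerGroup = 4;
    chPerBranch = numGroups*chPerGroup;          % 20 channels from each branch
    perm = zeros(1, numBranches*chPerBranch);
    k = 0;
    for g = 1:numGroups
        for b = 1:numBranches
            % channels of group g coming from branch b in the plain concatenation
            src = (b-1)*chPerBranch + (g-1)*chPerGroup + (1:chPerGroup);
            perm(k+1:k+chPerGroup) = src;
            k = k + chPerGroup;
        end
    end
    % Zdesired = Zconcat(:,:,perm,:);  % reorder channels of the plain concatenation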
I tried to create a custom grouped depth concatenation layer:
classdef myGroupedDepthConcatenationLayer < nnet.layer.Layer
    properties
        GroupNumber   % number of channel groups in each input
    end
    methods
        function layer = myGroupedDepthConcatenationLayer(groups, numinputs, name)
            layer.Name = name;
            layer.NumInputs = numinputs;
            layer.Description = 'Custom grouped depth concatenation layer';
            layer.GroupNumber = groups;
        end
        function Z = predict(layer, varargin)
            X = varargin;
            % Number of channels in each input
            c = zeros(1, layer.NumInputs);
            for i = 1:layer.NumInputs
                c(i) = size(X{i}, 3);
            end
            s = size(X{1});
            if numel(s) < 4
                n = 1;        % single observation
            else
                n = s(4);     % batch size
            end
            % Preallocate the output with the same type as the inputs
            Z = zeros(s(1), s(2), sum(c), n, 'like', X{1});
            offset = 0;
            % Interleave group-wise: group #1 of all inputs, then group #2, ...
            for j = 1:layer.GroupNumber
                for i = 1:layer.NumInputs
                    len = c(i)/layer.GroupNumber;   % channels per group in input i
                    Z(:, :, offset+1:offset+len, :) = X{i}(:, :, (j-1)*len+1:j*len, :);
                    offset = offset + len;
                end
            end
        end
    end
end
Is this correct? How will the channel reordering in the custom layer affect backpropagation during network training?
How should the gradients be distributed between the inputs correctly?
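My current thinking (unverified, just a sketch): since the layer only permutes channels, the backward pass should simply apply the inverse permutation to dLdZ and split it back among the inputs. A backward method along these lines might look like the following, assuming the standard nnet.layer.Layer backward signature for a layer with multiple inputs, one output, and no learnable parameters:

    function varargout = backward(layer, varargin)
        % varargin is assumed to be {X1,...,Xn, Z, dLdZ, memory}
        n = layer.NumInputs;
        X    = varargin(1:n);
        dLdZ = varargin{n+2};
        varargout = cell(1, n);
        for i = 1:n
            varargout{i} = zeros(size(X{i}), 'like', dLdZ);  % gradient w.r.t. input i
        end
        offset = 0;
        % Undo the group-wise interleaving performed in predict
        for j = 1:layer.GroupNumber
            for i = 1:n
                len = size(X{i}, 3)/layer.GroupNumber;
                varargout{i}(:, :, (j-1)*len+1:j*len, :) = dLdZ(:, :, offset+1:offset+len, :);
                offset = offset + len;
            end
        end
    end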