How can I make a deep learning custom layer that takes multiple inputs, like addition/concatenation?

4 views (last 30 days)
I am implementing an attention mechanism for image segmentation. One step in attention is to take the dot product of the attention coefficients and the input feature map; however, the MATLAB Neural Network Toolbox doesn't support such an operation yet. What I tried was to modify the "Addition layer" and relevant functions into a new class/functions that handle the dot product of two inputs, and I added those functions to the path. It kind of works: it does generate an "attention layer" that can take two inputs. However, when I train the network, I get an error: Error using nnet.internal.cnn.analyzer.NetworkAnalyzer/propagateSizes (line 223): Index exceeds array bounds.
Markus Wagner on 2 Jan 2019
I have the same problem. I want to implement a custom hidden layer and a custom regression layer with 2 inputs, like the addition/concatenation layer, to build up a VAE network.
After adapting the Addition layer class and using it in a layerGraph, it failed with an error:
Error using nnet.internal.cnn.util.validateLayersForLayerGraph (line 20)
Expected input to be one of these types:
Instead its type was latentLayer.
Hanxx Sakura on 17 Jan 2019
Edited: Hanxx Sakura on 17 Jan 2019
I think there is no official way to implement a module with multiple inputs, but I finally accomplished it, inspired by Addition.m and AdditionLayer.m, as follows:
  1. implement an internal layer (e.g., myAdd) like the "Addition" class: define the variables and the forward/backward functions;
  2. since the internal layer cannot be used in a layerGraph directly, wrap it in an external class (e.g., myAddLayer) as in AdditionLayer.m;
  3. create an object of your module by calling myAddLayer();
Hope this helps~
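For comparison, newer releases document a simpler route than wrapping internal classes: subclass nnet.layer.Layer and set NumInputs in the constructor. The sketch below is a hedged illustration of that documented approach, not the poster's code; the class name myProductLayer is made up for this example:

```matlab
% Hedged sketch: a two-input custom layer via the documented
% nnet.layer.Layer interface (class name is illustrative).
classdef myProductLayer < nnet.layer.Layer
    methods
        function layer = myProductLayer(name)
            layer.Name = name;
            layer.NumInputs = 2;   % accept two inputs, like additionLayer
        end

        function Z = predict(layer, X1, X2)
            % Element-wise product of attention coefficients and the
            % input feature map.
            Z = X1 .* X2;
        end
    end
end
```

With automatic differentiation in recent releases, no backward method is needed for an element-wise operation like this.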

Answers (1)

Maksym Tymchenko
Maksym Tymchenko on 10 Mar 2023
Since release R2022b, MATLAB directly supports the attention mechanism that you are trying to implement.
Check the documentation of the attention function in the Deep Learning Toolbox.
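A minimal call might look like the following; the array sizes and the "CBT" (channel, batch, time) data format are illustrative assumptions, not values taken from this thread:

```matlab
% Hedged example of the built-in attention function (R2022b+).
% Sizes and format are illustrative; numChannels must be divisible
% by numHeads.
numChannels = 10;  numObs = 1;  seqLen = 16;
numHeads = 2;

queries = dlarray(rand(numChannels, numObs, seqLen), "CBT");
keys    = dlarray(rand(numChannels, numObs, seqLen), "CBT");
values  = dlarray(rand(numChannels, numObs, seqLen), "CBT");

% Scaled dot-product attention over the two inputs.
Y = attention(queries, keys, values, numHeads);
```

This replaces the hand-rolled dot-product layer from the original question with a single built-in call.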
