Connecting concatenation layer error

Fatih on 2 Feb 2025
Commented: Fatih on 5 Feb 2025
Hello everyone. I have an issue. In the following code, I can't connect the concatenationLayer ('concat') to featureAttention and temporalAttention. Would you please help?
Error Message
Caused by:
Layer 'concat': Unconnected input. Each layer input must be connected to the output of another layer.
numFeatures = size(XTrain, 2);
numClasses = numel(categories(YTrain));

% Feature-Level Attention
featureAttention = [
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
];

% Temporal Attention (not used for Iris dataset, but included for completeness)
temporalAttention = [
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
];

% Combine into Hierarchical Attention
hierarchicalAttention = [
    featureInputLayer(numFeatures, 'Name', 'input_features') % Input layer for features
    featureAttention
    temporalAttention
    concatenationLayer(1, 2, 'Name', 'concat') % Concatenate feature and temporal attention outputs
];
  1 Comment
Matt J on 2 Feb 2025
Edited: Matt J on 2 Feb 2025
We cannot demonstrate a solution because you do not provide numFeatures or numClasses; they are derived from XTrain and YTrain, which we do not have.


Accepted Answer

Matt J on 2 Feb 2025
Edited: Matt J on 2 Feb 2025
Use connectLayers to make the connections programmatically, or make them manually in the Deep Network Designer app (deepNetworkDesigner).
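For example, with the layer arrays from your question, the programmatic route could look roughly like this (an untested sketch that reuses your layer names):
% Add each piece to a layer graph without connecting the pieces to each other
hierarchicalAttention = layerGraph(featureInputLayer(numFeatures, 'Name', 'input_features'));
hierarchicalAttention = addLayers(hierarchicalAttention, featureAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, temporalAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, concatenationLayer(1, 2, 'Name', 'concat'));
% Feed the input into both attention branches
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'fc_feature_attention');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'input_sequence');
% Feed each branch into one input of the concatenation layer
hierarchicalAttention = connectLayers(hierarchicalAttention, 'feature_attention_weights', 'concat/in1');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'temporal_attention_weights', 'concat/in2');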
  2 Comments
Matt J
Matt J on 2 Feb 2025
Edited: Matt J on 3 Feb 2025
If you have R2024a or newer, you can simplify things a bit with networkLayer:
% Feature-Level Attention Block (Encapsulated)
featureAttention = networkLayer([
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
], 'Name', 'feature_attention_block');

% Temporal Attention Block (Encapsulated)
temporalAttention = networkLayer([
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
], 'Name', 'temporal_attention_block');
% Build a layer graph, adding each piece with addLayers so nothing is auto-connected yet
hierarchicalAttention = layerGraph(featureInputLayer(numFeatures, 'Name', 'input_features'));
hierarchicalAttention = addLayers(hierarchicalAttention, featureAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, temporalAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, concatenationLayer(1, 2, 'Name', 'concat_attention'));

% Connect the layers: the input feeds both blocks, and each block feeds one input of the concatenation layer
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'feature_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'temporal_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'feature_attention_block', 'concat_attention/in1');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'temporal_attention_block', 'concat_attention/in2');
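To double-check that every input is now connected, you can, for example, plot or analyze the layer graph:
plot(hierarchicalAttention)            % quick visual check of the connections
analyzeNetwork(hierarchicalAttention)  % detailed layer-by-layer check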
Fatih
Fatih on 5 Feb 2025
Thanks a lot. It worked; it seems I was just going a bit blind. It is solved now that I have checked it thoroughly.


More Answers (0)
