Connecting concatenation layer error

Fatih on 2 Feb 2025
Commented: Fatih on 5 Feb 2025
Hello everyone. I have an issue. In the following code, I can't connect the concatenationLayer 'concat' to featureAttention and temporalAttention. Would you please help?
Error Message
Caused by:
Layer 'concat': Unconnected input. Each layer input must be connected to the output of another layer.
Code:
numFeatures = size(XTrain, 2);
numClasses = numel(categories(YTrain));

% Feature-Level Attention
featureAttention = [
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
    ];

% Temporal Attention (not used for the Iris dataset, but included for completeness)
temporalAttention = [
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
    ];

% Combine into Hierarchical Attention
hierarchicalAttention = [
    featureInputLayer(numFeatures, 'Name', 'input_features') % Input layer for features
    featureAttention
    temporalAttention
    concatenationLayer(1, 2, 'Name', 'concat') % Concatenate feature and temporal attention outputs
    ];
1 comment
Matt J on 2 Feb 2025
Edited: Matt J on 2 Feb 2025
We cannot demonstrate a solution because you do not provide numFeatures or numClasses; they are generated from XTrain and YTrain, which we do not have.


Accepted Answer

Matt J on 2 Feb 2025
Edited: Matt J on 2 Feb 2025
Use connectLayers to make the connections programmatically, or make them manually in deepNetworkDesigner.
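The root cause: stacking everything in one plain layer array builds a single serial chain, so the second input 'concat/in2' never receives a connection, which is exactly what the error reports. A minimal sketch of the connectLayers approach, reusing the layer names and arrays from the question (this only fixes the wiring; whether each branch's data format is valid for training is a separate question):
% Add each piece separately so no serial connections are made across branches
lgraph = layerGraph();
lgraph = addLayers(lgraph, featureInputLayer(numFeatures, 'Name', 'input_features'));
lgraph = addLayers(lgraph, featureAttention);   % layer array from the question
lgraph = addLayers(lgraph, temporalAttention);  % layer array from the question
lgraph = addLayers(lgraph, concatenationLayer(1, 2, 'Name', 'concat'));

% Wire the input into both branches, then both branch outputs into 'concat'
lgraph = connectLayers(lgraph, 'input_features', 'fc_feature_attention');
lgraph = connectLayers(lgraph, 'input_features', 'input_sequence');
lgraph = connectLayers(lgraph, 'feature_attention_weights', 'concat/in1');
lgraph = connectLayers(lgraph, 'temporal_attention_weights', 'concat/in2');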
2 comments
Matt J on 2 Feb 2025
Edited: Matt J on 3 Feb 2025
If you have R2024a or later, you can simplify things a bit with networkLayer:
% Feature-Level Attention Block (encapsulated)
featureAttention = networkLayer([
    fullyConnectedLayer(64, 'Name', 'fc_feature_attention')
    reluLayer('Name', 'relu_feature_attention')
    fullyConnectedLayer(1, 'Name', 'fc_feature_weights')
    softmaxLayer('Name', 'feature_attention_weights')
    ], 'Name', 'feature_attention_block');

% Temporal Attention Block (encapsulated)
temporalAttention = networkLayer([
    fullyConnectedLayer(numFeatures, 'Name', 'input_sequence')
    lstmLayer(64, 'OutputMode', 'sequence', 'Name', 'lstm_temporal_attention')
    fullyConnectedLayer(1, 'Name', 'fc_temporal_weights')
    softmaxLayer('Name', 'temporal_attention_weights')
    ], 'Name', 'temporal_attention_block');
% Build a layerGraph with NO connections yet. Note: layerGraph(layers)
% would connect the array in series, so add each layer separately.
hierarchicalAttention = layerGraph();
hierarchicalAttention = addLayers(hierarchicalAttention, featureInputLayer(numFeatures, 'Name', 'input_features'));
hierarchicalAttention = addLayers(hierarchicalAttention, featureAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, temporalAttention);
hierarchicalAttention = addLayers(hierarchicalAttention, concatenationLayer(1, 2, 'Name', 'concat_attention'));

% Connect the layers: branch the input into both attention blocks,
% then feed each block's output into one concatenation input
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'feature_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'input_features', 'temporal_attention_block');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'feature_attention_block', 'concat_attention/in1');
hierarchicalAttention = connectLayers(hierarchicalAttention, 'temporal_attention_block', 'concat_attention/in2');
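As a quick sanity check (a minimal sketch; plot accepts a layerGraph directly), you can visualize the graph and confirm that both inputs of 'concat_attention' now have incoming edges:
% Visualize the wiring; 'concat_attention/in1' and '/in2' should both be connected
figure
plot(hierarchicalAttention)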
Fatih on 5 Feb 2025
Thanks a lot. It worked; it seems I was just going a bit blind. It's solved now that I've checked it thoroughly.

