Neural network for regression task: Circuit modeling

SIMONE PASQUARIELLO on 19 May 2024
Hi everyone, I'm experimenting with a DNN for modeling an electrical circuit. I have run many trials and so far I have always used one sample as input ( sequenceInputLayer(1) ) and one as output (since it is a regression task). The circuit I am modeling has memory, so now I would like to extend the input length to take several input samples at once while keeping the output as one single sample. I am running into problems because my input_data vector is 1167x1, i.e. one column, so sequenceInputLayer always expects a dimension of 1 and I am limited to that.
On the other hand, I could transpose my vectors and use 1167 as the input and output dimension, but I would like more freedom: for example, a network with an input layer of size 300, then two hidden layers of size 50/100 (using tanh, etc.), and a final output layer of size 1 (again, since this is a regression task). I have already read the MATLAB documentation but didn't find what I am looking for. This is the code I am using now; if you have suggestions I would be really happy!
Maybe I have to use mini-batches? What changes do I have to make in order to build the network described above?
(Yes, I know that for dynamical systems with memory CNN and RNN networks are better suited, but there are also papers in the literature that use a DNN for this type of modeling, i.e. feeding several past samples into a DNN to account for memory effects.)
%load Vout.mat % LTspice data
t = (0:5.145e-5:60e-3)'; % creates 1167 samples
input_data = 10*sin(200*2*pi*t);
output_data = Vout;
% plot(input_data); hold on
% plot(output_data);
%B = reshape(input_data,[],4); % get a specific number of columns (4)
% Create the deep neural network
layers = [ ...
    sequenceInputLayer(1)
    fullyConnectedLayer(10)
    reluLayer
    %lstmLayer(100)
    fullyConnectedLayer(10)
    sigmoidLayer
    fullyConnectedLayer(10)
    tanhLayer
    fullyConnectedLayer(1) % output layer with a single output
    %tanhLayer
    ];
% Create the dlnetwork
net = dlnetwork(layers);
% Set the training options
options = trainingOptions('adam', ...
    'MaxEpochs', 1000, ...
    'InitialLearnRate', 1e-2, ...
    'Verbose', false, ...
    'Plots', 'training-progress');
% Train the neural network
net = trainnet(input_data, output_data, net, "huber", options);
% Predict
output_test = predict(net, input_data);

Answers (1)

Yash Sharma on 27 May 2024
Hi Simone,
To adapt your current model to account for the "memory" effect of your electrical circuit by using a sequence of inputs while still predicting a single output value, you can modify both the network architecture and the way you prepare your input data. Since you want to use a DNN and your input data is a single column vector, you'll need to reshape your data into sequences of inputs.
Here's a step-by-step approach to achieve this:
1. Reshape Your Input Data
Create sequences of 300 past samples as inputs for predicting the current output.
sequenceLength = 300;
numSamples = length(input_data) - sequenceLength;
X = zeros(sequenceLength, numSamples);
Y = zeros(1, numSamples);
for i = 1:numSamples
    X(:, i) = input_data(i:i+sequenceLength-1);   % one 300-sample window per column
    Y(:, i) = output_data(i+sequenceLength);      % target: the sample following the window
end
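One detail worth checking if you train with trainnet as in the original post: for numeric feature data, trainnet expects observations along the rows (numObservations-by-numFeatures), so the matrices built above would be transposed before training. A minimal sketch, assuming that layout:
% trainnet expects numeric feature data as numObservations-by-numFeatures,
% so flip the windowed data built above (assumption: trainnet + featureInputLayer)
XTrain = X';   % numSamples-by-300, one 300-sample window per row
YTrain = Y';   % numSamples-by-1, one target per window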
2. Modify the Network Architecture
Change sequenceInputLayer to featureInputLayer with input size equal to your sequence length (300).
layers = [ ...
    featureInputLayer(sequenceLength)
    fullyConnectedLayer(50)
    tanhLayer
    fullyConnectedLayer(100)
    tanhLayer
    fullyConnectedLayer(1)
    regressionLayer
    ];
3. Training the Network
Adjust your training process to handle the reshaped data. If using custom training loops, process the data as sequences of 300 samples.
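As a concrete illustration of step 3, here is a minimal, untested sketch of the training call reusing the trainnet setup from the original post (this assumes the network is built with dlnetwork from a layer array ending in fullyConnectedLayer(1), with the loss passed to trainnet rather than encoded in an output layer):
% Sketch: reuse the training options from the question with the windowed data
options = trainingOptions('adam', ...
    'MaxEpochs', 1000, ...
    'InitialLearnRate', 1e-2, ...
    'Verbose', false, ...
    'Plots', 'training-progress');
% XTrain is numSamples-by-300 and YTrain is numSamples-by-1 (see step 1)
net = trainnet(XTrain, YTrain, net, "huber", options);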
4. Prediction
Ensure the input data for prediction is reshaped into sequences of 300 samples as well.
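For step 4, a small sketch of windowing a test signal the same way before calling predict (input_test is a placeholder name for your own test data, not something defined above):
% Sketch: build 300-sample windows from a hypothetical test signal input_test
numTest = length(input_test) - sequenceLength;
XTest = zeros(numTest, sequenceLength);
for i = 1:numTest
    XTest(i, :) = input_test(i:i+sequenceLength-1);   % one window per row
end
output_test = predict(net, XTest);   % one predicted sample per window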
See the MATLAB documentation for featureInputLayer for more details.
Hope this helps!
1 comment
SIMONE PASQUARIELLO on 27 May 2024
Hi Yash, and thanks for answering me! Thank you for this idea; while I was waiting for an answer I thought about it and tried it! Anyway, I think there may be an error in your answer, because I know that regressionLayer is no longer supported (and I tried it), so when I call dlnetwork(layers) to create the actual neural network I get an error.
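For reference, dropping the regressionLayer and passing the loss to trainnet avoids that error; a minimal, untested sketch of the same architecture along those lines:
% dlnetwork-compatible version: no output layer, the loss (e.g. "huber") is passed to trainnet instead
layers = [ ...
    featureInputLayer(sequenceLength)    % 300 past input samples per observation
    fullyConnectedLayer(50)
    tanhLayer
    fullyConnectedLayer(100)
    tanhLayer
    fullyConnectedLayer(1)];             % single regression output
net = dlnetwork(layers);                 % no error once the output layer is removed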
