Neural Networks - Function approximation

Aydin Ahmadli on 8 Dec 2018
The problem is to suggest weights for a multi-layer neural network computing the function f(x1, x2) = 3 − x1 − x2, where x1 and x2 are input bits (each with value 0 or 1). The neurons of the network should use the sigmoid transfer function with slope 1, and they have biases. The topology of the network must be the following:
(a) two input neurons – inputs are bits (with value 0 or 1),
(b) two neurons in a single hidden layer, and
(c) two neurons in the output layer.
The outputs of the network will be interpreted as a two-bit binary number in the following way:
• an output greater than or equal to 0.5 will be considered logical 1,
• an output less than 0.5 will be considered logical 0.
List the weights and biases of all neurons, and also give a table with the actual outputs of the network (before rounding) for all four combinations of the input bits...
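For reference, the target truth table follows directly from the definition: f(0,0) = 3, f(0,1) = 2, f(1,0) = 2 and f(1,1) = 1, which in two-bit binary is 11, 10, 10 and 01. A minimal MATLAB sketch that tabulates this (the variable names x1, x2, bit1, bit0 are just illustrative, not part of the assignment):

% Tabulate f(x1,x2) = 3 - x1 - x2 and its two-bit binary encoding
X   = [0 0; 0 1; 1 0; 1 1];        % all four combinations of the input bits
f   = 3 - X(:,1) - X(:,2);         % function values: 3, 2, 2, 1
msb = bitget(f, 2);                % most significant output bit
lsb = bitget(f, 1);                % least significant output bit
disp(table(X(:,1), X(:,2), f, msb, lsb, ...
     'VariableNames', {'x1','x2','f','bit1','bit0'}))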
What is wrong with this code?
clc;
% Input columns: (x1,x2) = (1,0), (0,1), (1,1), (0,0)
in  = [1 0 1 0
       0 1 1 0];
% Targets: two-bit binary encoding of f(x1,x2) = 3 - x1 - x2
% (row 1 = most significant bit, row 2 = least significant bit),
% with 0.85 standing for logical 1 and 0.2 for logical 0
out = [0.85 0.85 0.2  0.85
       0.2  0.2  0.85 0.85];
net = feedforwardnet(2);                 % one hidden layer with 2 neurons
net = configure(net, in, out);
net.layers{1}.transferFcn = 'logsig';    % sigmoid in the hidden layer
net.layers{2}.transferFcn = 'logsig';    % sigmoid in the output layer
net.divideFcn = 'dividetrain';           % use all 4 samples for training;
                                         % the default random split leaves
                                         % too few samples to train on
net.trainParam.epochs = 100;
net.trainParam.goal = 1e-6;
net = init(net);
[net, tr] = train(net, in, out)
weights = getwb(net)                     % all weights and biases as one vector
weight  = net.IW{1,1}                    % input-to-hidden weights
weight1 = net.LW{2,1}                    % hidden-to-output weights
b1 = net.b{1}                            % hidden-layer biases
b2 = net.b{2}                            % output-layer biases
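
As a quick check (a sketch, assuming the training above converges), one can evaluate the trained network on all four input columns and apply the 0.5 threshold from the problem statement to decode the two output bits:

y = net(in)                        % raw network outputs before rounding
bits = y >= 0.5                    % threshold: >= 0.5 counts as logical 1
values = 2*bits(1,:) + bits(2,:)   % decoded numbers; should be 2 2 1 3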

Answers (0)
