Feed Forward ANN Training using Back Propagation
Find the new weights, using MATLAB code, when the net shown below is presented the input pattern (1 2 3) and the target output is 1. Use a learning rate of 0.2 and the bipolar sigmoid activation function; the bias is set to 1.
Find the new weights using back-propagation when the net shown below is presented the input pattern (1 -1 1) and the desired output is 1. Use a learning rate of 0.3 and the unipolar continuous activation function; the bias is set to 1 and λ = 0.6.
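For reference, a single back-propagation step under these settings follows the standard delta rule. A sketch of the update equations, assuming one hidden layer and the unipolar continuous activation $f(\mathrm{net}) = 1/(1 + e^{-\lambda\,\mathrm{net}})$, whose derivative is $f'(\mathrm{net}) = \lambda\, f(\mathrm{net})\,(1 - f(\mathrm{net}))$:
\begin{align*}
\delta_o &= (t - y_o)\, f'(\mathrm{net}_o) \\
\delta_h &= f'(\mathrm{net}_h)\, \delta_o\, w_{ho} \\
w_{ho} &\leftarrow w_{ho} + \eta\, \delta_o\, y_h \\
w_{ih} &\leftarrow w_{ih} + \eta\, \delta_h\, x_i
\end{align*}
where $\eta$ is the learning rate and $t$ the desired output. For the bipolar-sigmoid variant, $f(\mathrm{net}) = \tanh(\mathrm{net}/2)$ with $f'(\mathrm{net}) = \tfrac{1}{2}\bigl(1 - f(\mathrm{net})^2\bigr)$ is commonly used.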
Answers (1)
Shaik
on 12 May 2023
Hi Warren,
Can you check whether this code meets your needs:
% Define the input and output patterns
P = [1 -1 1];
T = 1;
% Define the network architecture and parameters
nInputs = length(P);
nHidden = 1;
nOutputs = 1;
learningRate = 0.3;
bias = 1;
lambda = 0.6;
% Initialize the weights randomly
w1 = randn(nInputs,nHidden);
w2 = randn(nHidden,nOutputs);
% Define the activation functions
sigmoid = @(x) 1 ./ (1 + exp(-x));                          % standard logistic sigmoid (not used below)
dsigmoid = @(x) sigmoid(x) .* (1 - sigmoid(x));
unipolar = @(x) 1 ./ (1 + exp(-lambda * x));                % unipolar continuous: f(net) = 1/(1 + e^(-lambda*net))
dunipolar = @(x) lambda * unipolar(x) .* (1 - unipolar(x)); % f'(net) = lambda*f*(1 - f)
% Initialize the delta arrays
delta2 = zeros(nOutputs,1);
delta1 = zeros(nHidden,1);
% Loop through the iterations of the backpropagation algorithm
for i = 1:100
    % Forward pass
    net1 = P * w1 + bias;
    y1 = unipolar(net1);
    net2 = y1 * w2 + bias;
    y2 = unipolar(net2);
    % Backward pass
    delta2 = dunipolar(net2) * (T - y2);
    delta1 = dunipolar(net1) .* (delta2 * w2');
    % Update the weights
    w2 = w2 + learningRate * y1' * delta2;
    w1 = w1 + learningRate * P' * delta1;
end
% Test the network with the input pattern (1 -1 1)
net1 = P * w1 + bias;
y1 = unipolar(net1);
net2 = y1 * w2 + bias;
y2 = unipolar(net2);
% Print the output
fprintf('The predicted output is: %f\n', y2);
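Note that the exercise asks for a single weight update, so to reproduce the hand calculation you can change for i = 1:100 to for i = 1:1. Also, because the initial weights of the net in the question are not shown, the code above initialises them randomly.
For the first variant of the exercise (input pattern (1 2 3), target 1, learning rate 0.2, bipolar sigmoid), a minimal sketch along the same lines is below. The 3-1-1 layout, the random initial weights, the single update step, and the bipolar sigmoid form f(net) = tanh(net/2) (no lambda is given for this part) are assumptions, since the original net diagram is not available:
% Sketch for the bipolar-sigmoid variant (assumed 3-1-1 layout, random
% initial weights, single update step; the original net diagram is not shown).
P = [1 2 3];
T = 1;
learningRate = 0.2;
bias = 1;
w1 = randn(3,1);                             % input-to-hidden weights (assumed)
w2 = randn(1,1);                             % hidden-to-output weight (assumed)
bipolar  = @(x) tanh(x/2);                   % f(net) = (1 - e^(-net))/(1 + e^(-net))
dbipolar = @(x) 0.5 * (1 - bipolar(x).^2);   % f'(net) = (1 - f(net)^2)/2
% Single back-propagation step (same delta rule as above)
net1 = P * w1 + bias;    y1 = bipolar(net1);
net2 = y1 * w2 + bias;   y2 = bipolar(net2);
delta2 = dbipolar(net2) * (T - y2);
delta1 = dbipolar(net1) .* (delta2 * w2');
w2 = w2 + learningRate * y1' * delta2;
w1 = w1 + learningRate * P' * delta1;
fprintf('Updated w1: %s, w2: %f\n', mat2str(w1, 4), w2);
The structure is identical to the code above; only the activation function, its derivative, the input pattern, and the learning rate change.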
Hope it helps!
John D'Errico
on 31 May 2023
Please don't do obvious homework assignments for students who can't be bothered to show any effort of their own. This does not help the student, as it teaches them nothing except that there is always someone willing to do their work for them. Why you want to do that is up to you, but it is a bad idea for the student.
It does not help the site, because it teaches this student that they can now post additional questions and get their homework assignments done for them.
It hurts the site because it also convinces other students that they too can spam the site with their homework assignments.
This was the third question this student posted with no effort shown. Do you expect them to learn anything more than to keep on posting their homework?