Update weights and bias in a neural network by SGDM

How do I update the weights and biases in a neural network using stochastic gradient descent with momentum (SGDM)? What are the update equations?

Answers (1)

Balaji
Balaji on 27 Sep 2023
Hello Ahmed,
I understand you want to train a neural network using stochastic gradient descent with momentum (SGDM).
To do so, you need to:
  1. Initialize the weights, biases, learning rate, momentum coefficient, and other hyperparameters.
  2. Loop over the dataset (or until convergence) and, for each training sample:
     a. Forward pass: compute the node activations and the network output.
     b. Compute the loss, backpropagate to obtain the gradients, and update the momentum (velocity) terms.
     c. Update the weights and biases.
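In equation form, one common formulation of the SGDM update for the weights (with momentum coefficient mu and learning rate eta) is:

    v(t+1) = mu * v(t) + eta * dL/dw(t)
    w(t+1) = w(t) - v(t+1)

with the analogous update for the biases. The velocity v accumulates an exponentially decaying average of past gradients, which smooths the updates and can speed up convergence.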
Here is example code:
% Initialize network parameters
learning_rate = 0.01;
momentum = 0.9;
num_epochs = 100;
% Initialize weights and biases
weights = randn(2, 1); % Example: 2 input neurons, 1 output
biases = randn(1);
% Initialize momentum (velocity) terms
prev_delta_weights = zeros(size(weights));
prev_delta_biases = zeros(size(biases));
% Iterate through the training data
for epoch = 1:num_epochs
    % Sample input and target output (one random sample per epoch here;
    % in practice you would loop over your training set)
    input = randn(2, 1);
    target_output = 0.5;
    % Forward pass (single linear neuron)
    output = weights' * input + biases;
    % Error (derivative of the squared-error loss w.r.t. the output);
    % avoid naming this "error", which shadows the built-in function
    err = output - target_output;
    % Backpropagation: gradients of the loss w.r.t. weights and biases
    gradient_weights = input * err;
    gradient_biases = err;
    % Combine the current gradient with the accumulated momentum
    delta_weights = learning_rate * gradient_weights + momentum * prev_delta_weights;
    delta_biases = learning_rate * gradient_biases + momentum * prev_delta_biases;
    % Update weights and biases
    weights = weights - delta_weights;
    biases = biases - delta_biases;
    % Store the momentum terms for the next iteration
    prev_delta_weights = delta_weights;
    prev_delta_biases = delta_biases;
end
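The example above processes a single random sample per update. In practice, SGDM is usually applied per mini-batch, averaging the gradient over a few samples before each update. Here is a minimal sketch of that variant; the data X, y and the batch size are illustrative, not part of any particular API:

% Mini-batch SGDM sketch (illustrative data and batch size)
X = randn(2, 20);  % 20 training samples, 2 features each
y = randn(1, 20);  % corresponding targets
weights = randn(2, 1);
biases = randn(1);
prev_delta_weights = zeros(size(weights));
prev_delta_biases = zeros(size(biases));
learning_rate = 0.01;
momentum = 0.9;
batch_size = 4;
for epoch = 1:100
    idx = randperm(size(X, 2), batch_size);   % pick a random mini-batch
    outputs = weights' * X(:, idx) + biases;  % forward pass for the batch
    errs = outputs - y(idx);                  % per-sample errors
    % Gradients averaged over the mini-batch
    gradient_weights = X(:, idx) * errs' / batch_size;
    gradient_biases = mean(errs);
    % Same momentum update as before
    delta_weights = learning_rate * gradient_weights + momentum * prev_delta_weights;
    delta_biases = learning_rate * gradient_biases + momentum * prev_delta_biases;
    weights = weights - delta_weights;
    biases = biases - delta_biases;
    prev_delta_weights = delta_weights;
    prev_delta_biases = delta_biases;
end

Averaging over the batch reduces the variance of each update, which usually allows a larger learning rate than pure single-sample SGD.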
