Assignment 4

This document outlines an assignment focused on forward and backward propagation calculations in a simple neural network with 2 input neurons, 2 hidden neurons, and 1 output neuron. It provides specific tasks including calculations for forward propagation, error calculation, backward propagation, and weight updates, along with initial parameters and learning rate. Submission requirements emphasize detailed calculations, rounding, explanations, and a computational graph.

Uploaded by

Shaharyar Asif
Copyright © All Rights Reserved

Neural Network Assignment

Forward and Backward Propagation Calculations


Problem Description
In this assignment, you will work with a simple neural network with:
• 2 input neurons (x₁, x₂)
• 2 hidden neurons (h₁, h₂)
• 1 output neuron (y)
• Sigmoid activation function: σ(x) = 1/(1 + e⁻ˣ)

Given Parameters
Initial weights:
• Input to Hidden Layer:
• w₁₁ = 0.15
• w₁₂ = 0.20
• w₂₁ = 0.25
• w₂₂ = 0.30
• Hidden to Output Layer:
• v₁ = 0.40
• v₂ = 0.45
Input values:
• x₁ = 0.05
• x₂ = 0.10
Target output:
• t = 0.01
Learning rate:
• η = 0.5

Task 1: Forward Propagation


Calculate the following steps:
1. Calculate the input to hidden layer neurons:
• net_h₁ = w₁₁x₁ + w₂₁x₂
• net_h₂ = w₁₂x₁ + w₂₂x₂
2. Apply activation function to get hidden layer outputs:
• h₁ = σ(net_h₁)
• h₂ = σ(net_h₂)
3. Calculate the input to output layer neuron:
• net_y = v₁h₁ + v₂h₂
4. Calculate final output:
• y = σ(net_y)
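The four steps above can be sketched as a short Python script (plain floats, nothing beyond the standard `math` module; the variable names mirror the notation in this handout):

```python
import math

def sigmoid(x):
    """Sigmoid activation: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

# Given parameters
w11, w12, w21, w22 = 0.15, 0.20, 0.25, 0.30   # input -> hidden weights
v1, v2 = 0.40, 0.45                            # hidden -> output weights
x1, x2 = 0.05, 0.10                            # input values

# Step 1: net input to the hidden-layer neurons
net_h1 = w11 * x1 + w21 * x2   # = 0.0325
net_h2 = w12 * x1 + w22 * x2   # = 0.0400

# Step 2: hidden-layer outputs
h1 = sigmoid(net_h1)
h2 = sigmoid(net_h2)

# Step 3: net input to the output neuron
net_y = v1 * h1 + v2 * h2

# Step 4: final output
y = sigmoid(net_y)
```

When you work the numbers by hand, your rounded intermediate values should match what this script prints to 4 decimal places.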

Task 2: Error Calculation


Calculate the Mean Squared Error:
• E = ½(t - y)²
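The error term is one line once the forward pass is done; the snippet below recomputes `y` from Task 1 so it stands alone:

```python
import math

sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

# Forward pass (Task 1), repeated so this snippet is self-contained
h1 = sigmoid(0.15 * 0.05 + 0.25 * 0.10)
h2 = sigmoid(0.20 * 0.05 + 0.30 * 0.10)
y = sigmoid(0.40 * h1 + 0.45 * h2)

# Squared error against the target, with the conventional 1/2 factor
t = 0.01
E = 0.5 * (t - y) ** 2
```

The ½ factor exists purely for convenience: it cancels the 2 that appears when differentiating the square in Task 3.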

Task 3: Backward Propagation


Calculate the following derivatives:
1. Output Layer:
• ∂E/∂y = -(t - y)
• ∂y/∂net_y = y(1 - y)
• ∂net_y/∂v₁ = h₁
• ∂net_y/∂v₂ = h₂
2. Hidden Layer:
• ∂net_y/∂h₁ = v₁
• ∂net_y/∂h₂ = v₂
• ∂E/∂h₁ = ∂E/∂y × ∂y/∂net_y × ∂net_y/∂h₁
• ∂E/∂h₂ = ∂E/∂y × ∂y/∂net_y × ∂net_y/∂h₂
• ∂h₁/∂net_h₁ = h₁(1 - h₁)
• ∂h₂/∂net_h₂ = h₂(1 - h₂)
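Chained together, the derivatives above give one gradient per weight. A minimal sketch (the name `delta_y` for the shared factor ∂E/∂y × ∂y/∂net_y is our own shorthand, not part of the assignment's notation):

```python
import math

sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

# Forward pass (Task 1), repeated so this snippet is self-contained
w11, w12, w21, w22 = 0.15, 0.20, 0.25, 0.30
v1, v2 = 0.40, 0.45
x1, x2, t = 0.05, 0.10, 0.01
h1 = sigmoid(w11 * x1 + w21 * x2)
h2 = sigmoid(w12 * x1 + w22 * x2)
y = sigmoid(v1 * h1 + v2 * h2)

# Output layer: dE/dy * dy/dnet_y, shared by every downstream gradient
delta_y = -(t - y) * y * (1.0 - y)

# Gradients for the hidden-to-output weights (dnet_y/dv_i = h_i)
dE_dv1 = delta_y * h1
dE_dv2 = delta_y * h2

# Propagate back through net_y (dnet_y/dh_i = v_i)
dE_dh1 = delta_y * v1
dE_dh2 = delta_y * v2

# Gradients for the input-to-hidden weights
# (dh/dnet_h is the sigmoid derivative; dnet_h/dw is the input it multiplies)
dE_dw11 = dE_dh1 * h1 * (1.0 - h1) * x1
dE_dw21 = dE_dh1 * h1 * (1.0 - h1) * x2
dE_dw12 = dE_dh2 * h2 * (1.0 - h2) * x1
dE_dw22 = dE_dh2 * h2 * (1.0 - h2) * x2
```

A useful sanity check when doing this by hand: since y is larger than the target t, every gradient here should come out positive.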

Task 4: Weight Updates


Calculate the new weights:
1. Output Layer:
• v₁_new = v₁ - η × ∂E/∂v₁
• v₂_new = v₂ - η × ∂E/∂v₂
2. Hidden Layer:
• w₁₁_new = w₁₁ - η × ∂E/∂w₁₁
• w₁₂_new = w₁₂ - η × ∂E/∂w₁₂
• w₂₁_new = w₂₁ - η × ∂E/∂w₂₁
• w₂₂_new = w₂₂ - η × ∂E/∂w₂₂
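The update rule is the same gradient-descent step for every weight. The sketch below ties Tasks 1–4 together and ends with a check that one step actually reduces the error (the `forward` helper is our own packaging, not part of the assignment):

```python
import math

sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

# Given parameters
w11, w12, w21, w22 = 0.15, 0.20, 0.25, 0.30
v1, v2 = 0.40, 0.45
x1, x2, t, eta = 0.05, 0.10, 0.01, 0.5

def forward(w11, w12, w21, w22, v1, v2):
    """One forward pass; returns hidden activations and output."""
    h1 = sigmoid(w11 * x1 + w21 * x2)
    h2 = sigmoid(w12 * x1 + w22 * x2)
    y = sigmoid(v1 * h1 + v2 * h2)
    return h1, h2, y

h1, h2, y = forward(w11, w12, w21, w22, v1, v2)

# Gradients from Task 3
delta_y = -(t - y) * y * (1.0 - y)
dE_dv1, dE_dv2 = delta_y * h1, delta_y * h2
dE_dw11 = delta_y * v1 * h1 * (1.0 - h1) * x1
dE_dw21 = delta_y * v1 * h1 * (1.0 - h1) * x2
dE_dw12 = delta_y * v2 * h2 * (1.0 - h2) * x1
dE_dw22 = delta_y * v2 * h2 * (1.0 - h2) * x2

# Gradient-descent update: new = old - eta * gradient
v1_new, v2_new = v1 - eta * dE_dv1, v2 - eta * dE_dv2
w11_new = w11 - eta * dE_dw11
w12_new = w12 - eta * dE_dw12
w21_new = w21 - eta * dE_dw21
w22_new = w22 - eta * dE_dw22

# One more forward pass with the updated weights
_, _, y_new = forward(w11_new, w12_new, w21_new, w22_new, v1_new, v2_new)
```

Because the initial output overshoots the target, every update should shrink its weight slightly, and the new output should sit closer to t = 0.01 than before.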

Submission Requirements
1. Show all calculations step by step
2. Round all intermediate calculations to 4 decimal places
3. Include a clear explanation for each step
4. Draw the computational graph showing the forward and backward passes
