
Unit-2

(Vector Calculus): Differentiation of a univariate function,
Partial Differentiation and Gradients, Gradients of
vector-valued functions and matrices, Higher-order
derivatives, Multivariate Taylor's Theorem. Backpropagation
and automatic differentiation for neural networks and deep
learning.
Partial Differentiation and Gradients (see the worked example below)
Gradients of Vector-Valued Functions
Gradient-Based Problems
Higher-Order Partial Derivatives
Problems Based on Taylor's Theorem
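As a quick worked illustration of the partial-derivative, gradient, and Jacobian notions listed above (an added example, not taken from the slides):

\[
f(x,y) = x^2 y + y^3, \qquad
\nabla f(x,y) = \begin{bmatrix} \frac{\partial f}{\partial x} & \frac{\partial f}{\partial y} \end{bmatrix}
= \begin{bmatrix} 2xy & x^2 + 3y^2 \end{bmatrix}.
\]

For a vector-valued function \(f : \mathbb{R}^n \to \mathbb{R}^m\), stacking the \(m\) gradients row-wise gives the \(m \times n\) Jacobian with entries \(J_{ij} = \partial f_i / \partial x_j\).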
Backpropagation and Gradients
1. Identify intermediate functions (forward prop)
2. Compute local gradients
3. Combine with the upstream error signal to get the full gradient (a sketch of these three steps follows below)
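A minimal Python sketch of the three steps for a single sigmoid neuron with squared-error loss. The input, weights, and variable names are illustrative assumptions; only the target 0.5 and learning rate 1 come from the example below:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Step 1: forward prop -- evaluate and record the intermediate functions.
x = np.array([0.1, 0.4])         # illustrative input (assumption)
w = np.array([0.3, -0.2])        # illustrative weights (assumption)
z = w @ x                        # pre-activation
y = sigmoid(z)                   # activation

# Step 2: local gradients of each intermediate function.
dy_dz = y * (1.0 - y)            # sigmoid'(z) = y * (1 - y)
dz_dw = x                        # d(w . x)/dw = x

# Step 3: chain the local gradients with the upstream error signal.
t = 0.5                          # target output (from the slides)
dL_dy = y - t                    # upstream gradient of L = 0.5 * (y - t)^2
dL_dw = dL_dy * dy_dz * dz_dw    # full gradient w.r.t. the weights

lr = 1.0                         # learning rate (from the slides)
w = w - lr * dL_dw               # gradient-descent weight update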
Assume the neurons use the sigmoid activation function for the forward
and backward pass. The target output is 0.5, and the learning rate is 1.
After updating the weights, the forward pass is repeated, yielding:
• y3 = 0.57
• y4 = 0.56
• y5 = 0.61
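One way to realize this kind of example is a small 2-2-1 sigmoid network, with hidden neurons 3 and 4 feeding output neuron 5. The sketch below assumes that architecture, and its inputs and initial weights are placeholders (the slides do not list them), so it will not reproduce the exact y3, y4, y5 values above:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.35, 0.9])                # placeholder inputs (assumption)
W1 = np.array([[0.1, 0.8],               # placeholder input->hidden weights (assumption)
               [0.4, 0.6]])
w2 = np.array([0.3, 0.9])                # placeholder hidden->output weights (assumption)
t, lr = 0.5, 1.0                         # target and learning rate (from the slides)

# Forward pass: hidden activations y3, y4 and output y5.
h = sigmoid(W1 @ x)                      # h = [y3, y4]
y5 = sigmoid(w2 @ h)

# Backward pass for the squared error L = 0.5 * (y5 - t)^2.
delta5 = (y5 - t) * y5 * (1.0 - y5)      # output-layer error signal
delta_h = (w2 * delta5) * h * (1.0 - h)  # hidden-layer error signals

# Weight updates, then the forward pass is repeated.
w2 = w2 - lr * delta5 * h
W1 = W1 - lr * np.outer(delta_h, x)
h = sigmoid(W1 @ x)
y5 = sigmoid(w2 @ h)
print(h, y5)                             # updated y3, y4 and y5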
• Dimension balancing is the "cheap" but effective approach to gradient calculations in
most practical settings: the gradient of a scalar loss with respect to a matrix must have
the same shape as that matrix, which pins down how the chain-rule factors combine.
• Read the gradient computation notes to understand how to derive matrix expressions for
gradients from first principles.
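A small sketch of dimension balancing (an added example; the loss and shapes are assumptions): for L = ||Wx - t||^2 with W of shape (m, n), dL/dW must also be (m, n), which forces the outer-product arrangement below.

import numpy as np

m, n = 3, 4
W = np.random.randn(m, n)
x = np.random.randn(n)
t = np.random.randn(m)

r = W @ x - t                        # residual, shape (m,)
# dL/dW must match W's shape (m, n); the only way to combine the
# (m,)-shaped upstream gradient 2r with the (n,)-shaped input x
# into an (m, n) array is the outer product:
dL_dW = 2 * np.outer(r, x)

# Sanity check against a finite-difference estimate of one entry.
eps = 1e-6
W2 = W.copy(); W2[0, 0] += eps
L  = np.sum((W  @ x - t) ** 2)
L2 = np.sum((W2 @ x - t) ** 2)
print(dL_dW[0, 0], (L2 - L) / eps)   # the two values should agree closely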
Thank you
