Dama50 Vector Calculus-8

DAMA50 - MATHEMATICS FOR MACHINE LEARNING

● Partial Differentiation and Gradients


Gradient: ∇x f or grad f is the generalization of the derivative to functions of
several variables, f(x) = f(x1, x2, …, xn). The gradient is a row vector.

Partial Derivative: The derivative of a function of several variables with respect to one of its variables, keeping the others constant.
∂f(x)/∂x1 = lim(h→0) [ f(x1 + h, x2, …, xn) − f(x) ] / h
  ⋮
∂f(x)/∂xn = lim(h→0) [ f(x1, x2, …, xn + h) − f(x) ] / h
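The limit definition above can be checked numerically with a small step h, using a concrete function chosen here purely for illustration (the function f and the point are assumptions, not from the notes):

```python
# Numerical check of the partial-derivative limit definition:
# perturb only one coordinate by h, keep the others constant.

def f(x1, x2):
    # Example function: f(x1, x2) = x1^2 * x2,
    # with exact partials ∂f/∂x1 = 2*x1*x2 and ∂f/∂x2 = x1^2.
    return x1**2 * x2

def partial(f, x, i, h=1e-6):
    """Approximate ∂f/∂xi at point x via the forward-difference quotient."""
    bumped = list(x)
    bumped[i] += h          # only coordinate i changes
    return (f(*bumped) - f(*x)) / h

x = (1.0, 2.0)
print(partial(f, x, 0))     # ≈ 2*x1*x2 = 4.0
print(partial(f, x, 1))     # ≈ x1^2   = 1.0
```

For small h the quotient approaches the exact partial derivative, which is exactly the limit the definition describes.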

∇x f = grad f = df/dx = [ ∂f(x)/∂x1   ∂f(x)/∂x2   …   ∂f(x)/∂xn ]
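Collecting all n partial derivatives into one row vector gives the gradient. A minimal sketch (the example function is an assumption for illustration):

```python
# Build the gradient as the row vector of all partial derivatives,
# each approximated by a forward-difference quotient.

def gradient(f, x, h=1e-6):
    """Approximate ∇f at x: one finite-difference partial per coordinate."""
    grads = []
    for i in range(len(x)):
        bumped = list(x)
        bumped[i] += h      # perturb coordinate i, keep the rest fixed
        grads.append((f(*bumped) - f(*x)) / h)
    return grads            # a 1×n row of partials

def f(x1, x2):
    return x1**2 * x2       # exact gradient: [2*x1*x2, x1^2]

print(gradient(f, (1.0, 2.0)))   # ≈ [4.0, 1.0]
```

Note the shape convention from the text: the gradient of a scalar-valued f(x1, …, xn) is a row vector with one entry per input variable.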
