1. Scalar Function of a Vector
Description:
You have a scalar function $f : \mathbb{R}^n \to \mathbb{R}$, where $\mathbf{x} \in \mathbb{R}^n$. The derivative $\frac{\partial f}{\partial \mathbf{x}}$ is called the gradient, and it is a column vector.
Formula:
$$\nabla f(\mathbf{x}) = \frac{\partial f}{\partial \mathbf{x}} = \begin{bmatrix} \dfrac{\partial f}{\partial x_1} \\ \dfrac{\partial f}{\partial x_2} \\ \vdots \\ \dfrac{\partial f}{\partial x_n} \end{bmatrix}$$
Example:
Let
$$f(\mathbf{x}) = \mathbf{x}^\top \mathbf{x}$$
This is just the dot product of a vector with itself, so it gives a scalar.
Let's say $n = 3$, so:
$$f(\mathbf{x}) = x_1^2 + x_2^2 + x_3^2$$
Now we compute the gradient $\nabla f(\mathbf{x})$, which is a vector of partial derivatives:
Step-by-step partials:
$$\frac{\partial f}{\partial x_1} = 2x_1, \qquad \frac{\partial f}{\partial x_2} = 2x_2, \qquad \frac{\partial f}{\partial x_3} = 2x_3$$
Now collect into a column vector:
$$\nabla f(\mathbf{x}) = \begin{bmatrix} 2x_1 \\ 2x_2 \\ 2x_3 \end{bmatrix} = 2\mathbf{x}$$
So we've explicitly calculated that the gradient of $f(\mathbf{x}) = \mathbf{x}^\top \mathbf{x}$ is $\nabla f(\mathbf{x}) = 2\mathbf{x}$.
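To double-check this result numerically, here is a minimal NumPy sketch (the test vector and helper names are just illustrative choices) comparing the analytic gradient $2\mathbf{x}$ with a finite-difference approximation:

```python
import numpy as np

def f(x):
    # f(x) = x^T x, a scalar
    return x @ x

def finite_difference_grad(f, x, eps=1e-6):
    # approximate each partial derivative with a central difference
    grad = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        grad[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

x = np.array([1.0, -2.0, 3.0])
print(2 * x)                         # analytic gradient: [ 2. -4.  6.]
print(finite_difference_grad(f, x))  # numerical check, approximately equal
```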
2. Vector Function of a Vector (Jacobian)
Description:
You have a function $\mathbf{f} : \mathbb{R}^n \to \mathbb{R}^m$. Its derivative is the Jacobian matrix, an $m \times n$ matrix which contains all partial derivatives $\dfrac{\partial f_i}{\partial x_j}$.
Formula:
$$\mathbf{J} = \frac{\partial \mathbf{f}}{\partial \mathbf{x}} = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{bmatrix}$$
Example:
Let
Then the Jacobian is:
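As an illustration (this particular function is a hypothetical choice for demonstration), consider $\mathbf{f}(x_1, x_2) = \big(x_1^2\,x_2,\; 5x_1 + \sin x_2\big)$, whose Jacobian is $\begin{bmatrix} 2x_1 x_2 & x_1^2 \\ 5 & \cos x_2 \end{bmatrix}$. The NumPy sketch below checks this by finite differences:

```python
import numpy as np

def f(x):
    # hypothetical vector-valued function R^2 -> R^2 (illustrative only)
    return np.array([x[0] ** 2 * x[1], 5 * x[0] + np.sin(x[1])])

def finite_difference_jacobian(f, x, eps=1e-6):
    # build the m x n Jacobian column by column with central differences
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        J[:, j] = (f(x + step) - f(x - step)) / (2 * eps)
    return J

x = np.array([1.0, 2.0])
analytic_J = np.array([[2 * x[0] * x[1], x[0] ** 2],
                       [5.0, np.cos(x[1])]])
print(analytic_J)
print(finite_difference_jacobian(f, x))  # approximately equal
```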
3. Matrix Formulas in Vector Calculus
Example:
Let’s say you have:
$$f(\mathbf{x}) = \mathbf{x}^\top \mathbf{A} \mathbf{x}$$
where $\mathbf{A}$ is a constant symmetric matrix.
Then:
$$\nabla f(\mathbf{x}) = \frac{\partial}{\partial \mathbf{x}}\left(\mathbf{x}^\top \mathbf{A} \mathbf{x}\right) = 2\mathbf{A}\mathbf{x}$$
If $\mathbf{A}$ is not symmetric, the gradient is $(\mathbf{A} + \mathbf{A}^\top)\mathbf{x}$ instead.
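A quick NumPy check of this rule (the matrix and vector below are arbitrary and purely illustrative): symmetrize a random matrix, then compare $2\mathbf{A}\mathbf{x}$ with a finite-difference gradient of $\mathbf{x}^\top \mathbf{A} \mathbf{x}$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
A = (A + A.T) / 2          # make A symmetric so the gradient is 2Ax
x = rng.standard_normal(3)

def f(x):
    # quadratic form f(x) = x^T A x
    return x @ A @ x

def finite_difference_grad(f, x, eps=1e-6):
    grad = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        grad[i] = (f(x + step) - f(x - step)) / (2 * eps)
    return grad

print(2 * A @ x)                     # analytic gradient for symmetric A
print(finite_difference_grad(f, x))  # approximately equal
```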
Linear Regression: Vector Derivative Example
Goal:
Given data $\mathbf{X} \in \mathbb{R}^{n \times d}$ (input features), and target $\mathbf{y} \in \mathbb{R}^{n}$, find weights $\mathbf{w} \in \mathbb{R}^{d}$ that minimize:
$$L(\mathbf{w}) = \|\mathbf{y} - \mathbf{X}\mathbf{w}\|^2 = (\mathbf{y} - \mathbf{X}\mathbf{w})^\top (\mathbf{y} - \mathbf{X}\mathbf{w})$$
This is the least squares loss function.
Step 1: Expand the function
$$L(\mathbf{w}) = (\mathbf{y} - \mathbf{X}\mathbf{w})^\top (\mathbf{y} - \mathbf{X}\mathbf{w}) = \mathbf{y}^\top \mathbf{y} - 2\mathbf{w}^\top \mathbf{X}^\top \mathbf{y} + \mathbf{w}^\top \mathbf{X}^\top \mathbf{X}\mathbf{w}$$
Step 2: Take the gradient with respect to $\mathbf{w}$
Use the identities:
$$\frac{\partial}{\partial \mathbf{w}}\left(\mathbf{w}^\top \mathbf{A} \mathbf{w}\right) = 2\mathbf{A}\mathbf{w} \;\;(\mathbf{A}\ \text{symmetric}), \qquad \frac{\partial}{\partial \mathbf{w}}\left(\mathbf{b}^\top \mathbf{w}\right) = \mathbf{b}$$
So, with $\mathbf{A} = \mathbf{X}^\top \mathbf{X}$ (which is symmetric) and $\mathbf{b} = -2\mathbf{X}^\top \mathbf{y}$:
$$\nabla L(\mathbf{w}) = -2\mathbf{X}^\top \mathbf{y} + 2\mathbf{X}^\top \mathbf{X}\mathbf{w}$$
This gives the gradient (vector of partials) with respect to the weights $\mathbf{w}$.
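Here is a small sketch (random made-up data, illustrative sizes only) that checks this gradient formula against finite differences at an arbitrary $\mathbf{w}$:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 3))   # 20 samples, 3 features (illustrative)
y = rng.standard_normal(20)
w = rng.standard_normal(3)

def loss(w):
    # least squares loss L(w) = ||y - Xw||^2
    r = y - X @ w
    return r @ r

analytic_grad = -2 * X.T @ y + 2 * X.T @ X @ w

numerical_grad = np.zeros_like(w)
eps = 1e-6
for i in range(w.size):
    step = np.zeros_like(w)
    step[i] = eps
    numerical_grad[i] = (loss(w + step) - loss(w - step)) / (2 * eps)

print(analytic_grad)
print(numerical_grad)  # approximately equal
```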
Step 3: Solve for optimal weights (closed-form solution)
Set the gradient to zero:
$$\nabla L(\mathbf{w}) = \mathbf{0} \;\Longrightarrow\; \mathbf{X}^\top \mathbf{X}\mathbf{w} = \mathbf{X}^\top \mathbf{y} \;\Longrightarrow\; \mathbf{w}^* = (\mathbf{X}^\top \mathbf{X})^{-1}\mathbf{X}^\top \mathbf{y}$$
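Assuming $\mathbf{X}^\top \mathbf{X}$ is invertible, the closed-form solution can be computed and cross-checked against NumPy's least squares solver (again with made-up data):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((20, 3))
y = rng.standard_normal(20)

# closed-form solution: solve (X^T X) w = X^T y (more stable than an explicit inverse)
w_closed_form = np.linalg.solve(X.T @ X, X.T @ y)

# reference solution from NumPy's least squares routine
w_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(w_closed_form)
print(w_lstsq)  # the two should agree up to floating-point error

# the gradient at the optimum should be numerically zero
print(-2 * X.T @ y + 2 * X.T @ X @ w_closed_form)
```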