Matrix Calculus - Linear Algebra
Card 1 of 312
Q: True or false: the Constrained Extremum Theorem only applies to skew-symmetric matrices.

A: False. It applies only to symmetric matrices, not skew-symmetric ones. The Constrained Extremum Theorem concerns the maximum and minimum values of the quadratic form $\mathbf{x}^T A \mathbf{x}$ when $\|\mathbf{x}\| = 1$.
Q: The maximum value of a quadratic form $\mathbf{x}^T A \mathbf{x}$ (where $A$ is an $n \times n$ symmetric matrix and $\|\mathbf{x}\| = 1$) corresponds to which eigenvalue of $A$?

A: The largest eigenvalue. This is the statement of the Constrained Extremum Theorem. Likewise, the minimum value of the quadratic form corresponds to the smallest eigenvalue of $A$.
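As a quick numerical sketch of the Constrained Extremum Theorem (using NumPy and a hypothetical $2 \times 2$ symmetric matrix chosen for illustration): the quadratic form of every unit vector lies between the smallest and largest eigenvalues, and the maximum is attained at the corresponding eigenvector.

```python
import numpy as np

# Hypothetical symmetric matrix, for illustration only.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# eigh is for symmetric matrices; eigenvalues come back in ascending order.
eigvals, eigvecs = np.linalg.eigh(A)
lam_min, lam_max = eigvals[0], eigvals[-1]

# Sample random unit vectors; x^T A x must lie in [lam_min, lam_max].
rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.standard_normal(2)
    x /= np.linalg.norm(x)
    q = x @ A @ x
    assert lam_min - 1e-12 <= q <= lam_max + 1e-12

# The maximum is attained at the unit eigenvector for the largest eigenvalue.
v_max = eigvecs[:, -1]
print(np.isclose(v_max @ A @ v_max, lam_max))  # True
```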
Q: What is an expression for the gradient of the determinant of an $n \times n$ matrix $A$?

A: The cofactor expansion of the determinant of $A$ (along any row $i$) is
$$\det(A) = \sum_{j=1}^{n} a_{ij} C_{ij},$$
where $C_{ij}$ is the $(i, j)$ cofactor of $A$. To find the gradient of the determinant, take the partial derivative of this expression with respect to an entry $a_{ij}$ of the matrix, yielding
$$\frac{\partial \det(A)}{\partial a_{ij}} = C_{ij}.$$
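This identity can be checked numerically (a minimal NumPy sketch with a hypothetical invertible $3 \times 3$ matrix): for invertible $A$, the cofactor matrix equals $\det(A)\,A^{-T}$, and a central finite difference of $\det$ with respect to each entry should match it.

```python
import numpy as np

# Hypothetical invertible matrix, for illustration only.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Closed form: the gradient of det(A) is the cofactor matrix C = det(A) * inv(A)^T.
C = np.linalg.det(A) * np.linalg.inv(A).T

# Central finite-difference estimate of d det(A) / d a_ij for every entry.
h = 1e-6
grad_fd = np.zeros_like(A)
for i in range(3):
    for j in range(3):
        E = np.zeros_like(A)
        E[i, j] = h
        grad_fd[i, j] = (np.linalg.det(A + E) - np.linalg.det(A - E)) / (2 * h)

print(np.allclose(grad_fd, C, atol=1e-5))  # True
```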
Q: Let $\mathbf{x} = \dots$ and $\mathbf{y} = \dots$; find the least squares line through the data.

A: The least squares linear fit has the form $y = c_0 + c_1 x$. Recall the formula for the method of least squares:
$$\hat{\mathbf{c}} = (A^T A)^{-1} A^T \mathbf{y}.$$
Remember, when setting up the $A$ matrix, that one column must be filled with ones (the intercept column):
$$A = \begin{bmatrix} 1 & x_1 \\ \vdots & \vdots \\ 1 & x_n \end{bmatrix}.$$
To simplify, let $M = A^T A$ and $\mathbf{v} = A^T \mathbf{y}$, so that $\hat{\mathbf{c}} = M^{-1}\mathbf{v}$. Since $M$ is $2 \times 2$, we can invert it directly: swap the entries on the main diagonal, flip the sign of the off-diagonal entries, then multiply by $\frac{1}{\det(M)}$:
$$M^{-1} = \frac{1}{\det(M)} \begin{bmatrix} m_{22} & -m_{12} \\ -m_{21} & m_{11} \end{bmatrix}.$$
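The steps above can be sketched in NumPy (with hypothetical data, since the card's specific values are not shown here): build the design matrix with a column of ones, form the normal equations, invert the $2 \times 2$ matrix by hand, and compare against NumPy's built-in least squares solver.

```python
import numpy as np

# Hypothetical data points, for illustration only.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 4.0, 4.0])

# Design matrix with a column of ones for the intercept term.
A = np.column_stack([np.ones_like(x), x])

# Normal equations: (A^T A) c = A^T y, with M = A^T A and v = A^T y.
M = A.T @ A
v = A.T @ y

# 2x2 inverse by hand: swap the main diagonal, negate the off-diagonal,
# then divide by the determinant.
a, b = M[0, 0], M[0, 1]
c, d = M[1, 0], M[1, 1]
M_inv = np.array([[d, -b], [-c, a]]) / (a * d - b * c)

coeffs = M_inv @ v  # [intercept, slope]
print(np.allclose(coeffs, np.linalg.lstsq(A, y, rcond=None)[0]))  # True
```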