Operations and Properties - Linear Algebra
Card 1 of 2140
True or False: If $A$, $B$ are square and invertible matrices, then $AB$ is also invertible.
Tap to reveal answer
To prove $AB$ is invertible, we need to find another square matrix $C$ such that
$$(AB)C = C(AB) = I.$$
Since $A^{-1}$ and $B^{-1}$ exist, take $C = B^{-1}A^{-1}$; then we have
$$(AB)(B^{-1}A^{-1}) = A(BB^{-1})A^{-1} = AA^{-1} = I,$$
and
$$(B^{-1}A^{-1})(AB) = B^{-1}(A^{-1}A)B = B^{-1}B = I.$$
Hence $AB$ is invertible.
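As a side note, the claim that a product of invertible matrices is invertible, with inverse $B^{-1}A^{-1}$, can be spot-checked numerically. A minimal NumPy sketch (the sizes and random matrices are illustrative, not from the card):

```python
import numpy as np

# Random square matrices are invertible with probability 1,
# so this is a reasonable spot check (illustrative 3x3 size).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

lhs = np.linalg.inv(A @ B)                  # (AB)^(-1)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)   # B^(-1) A^(-1)
print(np.allclose(lhs, rhs))  # True
```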
← Didn't Know|Knew It →
Suppose that
is an invertible matrix. Simplify
.
Tap to reveal answer
To simplify

we used the identities:


so we get

← Didn't Know|Knew It →
Suppose that $A$, $B$, $C$ are all invertible. What is the inverse of $ABC$?
Tap to reveal answer
The inverse of $ABC$ is $C^{-1}B^{-1}A^{-1}$, since we can multiply it by $ABC$ to get:
$$(ABC)(C^{-1}B^{-1}A^{-1}) = AB(CC^{-1})B^{-1}A^{-1} = A(BB^{-1})A^{-1} = AA^{-1} = I$$
$$(C^{-1}B^{-1}A^{-1})(ABC) = C^{-1}B^{-1}(A^{-1}A)BC = C^{-1}(B^{-1}B)C = C^{-1}C = I$$
Therefore $C^{-1}B^{-1}A^{-1}$ is the inverse of $ABC$.
← Didn't Know|Knew It →
Determine the inverse of matrix A where

Tap to reveal answer
To determine the inverse of a matrix, you must first verify that the matrix is square. Next calculate the determinant. The determinant for this matrix is 0, so it does not have an inverse.
← Didn't Know|Knew It →
$A$ and $B$ are both singular two-by-two matrices.
True or false: $A + B$ must also be singular.
Tap to reveal answer
To prove a statement false, it suffices to find one case in which the statement does not hold. We show that
$$A = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \quad \text{and} \quad B = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}$$
provide a counterexample.
A matrix is singular (that is, without an inverse) if and only if its determinant is equal to zero. The determinant of a two-by-two matrix is equal to the product of its upper-left to lower-right entries minus that of its upper-right to lower-left entries, so:
$$\det A = (1)(0) - (0)(0) = 0$$
$$\det B = (0)(1) - (0)(0) = 0$$
Both $A$ and $B$ are singular.
Now add the matrices by adding them term by term:
$$A + B = \begin{bmatrix} 1+0 & 0+0 \\ 0+0 & 0+1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$
This is simply the two-by-two identity, which has an inverse, namely itself.
The statement has been proved false by counterexample.
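As a side note, a counterexample of this kind is easy to confirm numerically. A short NumPy sketch, using one standard pair of singular diagonal matrices (an assumed choice, not necessarily the card's):

```python
import numpy as np

# Two singular 2x2 matrices: each has determinant 0.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])
B = np.array([[0.0, 0.0],
              [0.0, 1.0]])

print(np.linalg.det(A), np.linalg.det(B))  # both 0 -> both singular
S = A + B                                  # the 2x2 identity
print(np.linalg.det(S))                    # 1.0 -> the sum is invertible
```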
← Didn't Know|Knew It →
$A$ and $B$ are both two-by-two matrices. $AB$ has an inverse.
True or false: Both $A$ and $B$ have inverses.
Tap to reveal answer
A matrix is nonsingular (that is, it has an inverse) if and only if its determinant is nonzero. Also, the determinant of the product of two matrices is equal to the product of their individual determinants. Combining these ideas:
$$\det(AB) = \det A \cdot \det B$$
If either $\det A = 0$ or $\det B = 0$, then it must hold that $\det(AB) = 0$. Equivalently, if either $A$ or $B$ has no inverse, then $AB$ has no inverse. Contrapositively, if $AB$ has an inverse, it must hold that each of $A$ and $B$ has an inverse.
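As a side note, the multiplicative property of determinants used above can be spot-checked numerically. A minimal NumPy sketch (the random matrices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((2, 2))

# det(AB) = det(A) * det(B): the multiplicative property of determinants.
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))  # True
```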
← Didn't Know|Knew It →
$A$ is an involutory matrix.
True, false, or indeterminate: 0 is an eigenvalue of $A$.
Tap to reveal answer
An eigenvalue of an involutory matrix must be either 1 or $-1$. This can be seen as follows:
Let $\lambda$ be an eigenvalue of involutory matrix $A$. Then for some eigenvector $x$,
$$Ax = \lambda x$$
Premultiply both sides by $A$:
$$A(Ax) = A(\lambda x)$$
$$A^2 x = \lambda (Ax) = \lambda(\lambda x) = \lambda^2 x$$
By definition, an involutory matrix has $I$ as its square, so
$$A^2 x = Ix = x$$
By transitivity,
$$x = \lambda^2 x$$
Thus, $\lambda^2 = 1$, or $\lambda = \pm 1$.
It follows that $\lambda \neq 0$. The statement is false.
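As a side note, this can be checked on a concrete involutory matrix. A NumPy sketch using a swap (permutation) matrix, one simple example of an involutory matrix:

```python
import numpy as np

# A swap (permutation) matrix is involutory: A @ A = I.
A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
assert np.allclose(A @ A, np.eye(2))

# Its eigenvalues are +1 and -1, so 0 is not an eigenvalue.
eigenvalues = np.linalg.eigvals(A)
print(sorted(np.round(eigenvalues.real, 6)))  # [-1.0, 1.0]
```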
← Didn't Know|Knew It →
$A$ and $B$ are both nonsingular two-by-two matrices.
True or false: $A + B$ must also be nonsingular.
Tap to reveal answer
We can prove that the sum of two nonsingular matrices need not be nonsingular by counterexample. Let
$$A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \quad B = \begin{bmatrix} -1 & 0 \\ 0 & -1 \end{bmatrix}.$$
A matrix is nonsingular (that is, with an inverse) if and only if its determinant is nonzero. The determinant of a two-by-two matrix is equal to the product of its upper-left to lower-right entries minus that of its upper-right to lower-left entries, so:
$$\det A = (1)(1) - (0)(0) = 1$$
$$\det B = (-1)(-1) - (0)(0) = 1$$
Both $A$ and $B$ are nonsingular.
Now add the matrices by adding them term by term:
$$A + B = \begin{bmatrix} 1-1 & 0+0 \\ 0+0 & 1-1 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix},$$
the zero matrix, whose determinant is 0 and which is therefore not nonsingular.
← Didn't Know|Knew It →
What is the dimension of the space spanned by the following vectors:



Tap to reveal answer
Since there are three linearly independent vectors, they span a 3-dimensional space.
Notice that each of the vectors has 5 coordinates. Therefore they actually span a 3-dimensional subspace of a 5-dimensional space.
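As a side note, the dimension of a span can be computed as a matrix rank. A NumPy sketch with three illustrative 5-coordinate vectors (stand-ins, not the card's actual vectors):

```python
import numpy as np

# Three linearly independent vectors with 5 coordinates each
# (illustrative stand-ins for the card's vectors).
v1 = np.array([1.0, 0.0, 0.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0, 0.0, 3.0])
v3 = np.array([0.0, 0.0, 1.0, 1.0, 1.0])

# Stack the vectors as rows; the rank is the dimension of the span.
M = np.vstack([v1, v2, v3])
print(np.linalg.matrix_rank(M))  # 3 -> a 3-dimensional subspace of R^5
```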
← Didn't Know|Knew It →
is a singular four-by-four matrix. True or false:
must also be a singular matrix.
Tap to reveal answer
A matrix is singular (that is, it has no inverse) if and only if its determinant is equal to 0. $A$ is singular, so $\det A = 0$.
The determinant of the scalar product of a scalar $c$ and an $n \times n$ matrix $A$ is $\det(cA) = c^n \det A$; setting $n = 4$ and $\det A = 0$:
$$\det(cA) = c^4 \det A = c^4 \cdot 0 = 0$$
Therefore, $cA$, having determinant 0, is also singular.
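As a side note, this scaling rule is easy to confirm numerically. A NumPy sketch with an illustrative singular 4x4 matrix and an arbitrary scalar (both assumed, not from the card):

```python
import numpy as np

# A singular 4x4 matrix: its last row is a copy of its first, so det = 0.
A = np.array([[1.0, 2.0, 0.0, 1.0],
              [0.0, 1.0, 3.0, 0.0],
              [2.0, 0.0, 1.0, 1.0],
              [1.0, 2.0, 0.0, 1.0]])
c = 3.0  # an arbitrary scalar (illustrative choice)

# det(cA) = c**4 * det(A) = c**4 * 0 = 0, so cA is also singular.
print(np.isclose(np.linalg.det(A), 0.0))      # True
print(np.isclose(np.linalg.det(c * A), 0.0))  # True
```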
← Didn't Know|Knew It →
In a 5 dimensional vector space, what is the maximum number of vectors you can have in a linearly dependent set?
Tap to reveal answer
There is no limit to the number of vectors a linearly dependent set can contain. In fact, any set of more than 5 vectors in a 5-dimensional space is automatically linearly dependent, and adding vectors to a dependent set keeps it dependent.
← Didn't Know|Knew It →
Consider the following set of three vectors:

where 
Is the set linearly independent?
Tap to reveal answer
Since
can be written as a linear combination of
and
the set cannot be linearly independent.
← Didn't Know|Knew It →
Does the following row reduced echelon form of a matrix represent a linearly independent set?

Tap to reveal answer
The set is linearly dependent, because there is a row of all zeros.
Notice that having columns of all zeros does not tell us whether the set is linearly independent or not.
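As a side note, the zero-row criterion matches a rank test: a zero row in the reduced form means the rank is less than the number of rows, i.e. the row vectors are dependent. A NumPy sketch with an illustrative matrix (not the card's):

```python
import numpy as np

# Rows of M are the vectors in the set (illustrative example).
# A zero row in the RREF means rank < number of rows,
# i.e. the rows are linearly dependent.
M = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # 2 * row 1 -> forces dependence
              [0.0, 1.0, 1.0]])

rank = np.linalg.matrix_rank(M)
print(rank < M.shape[0])  # True -> the set is linearly dependent
```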
← Didn't Know|Knew It →
Does the following row reduced echelon form of a matrix represent a linearly independent set?

Tap to reveal answer
The set must be linearly independent, because there are no rows of all zeros. There are columns of all zeros, but columns do not tell us whether the set is linearly independent or not.
← Didn't Know|Knew It →
is a nonsingular matrix.
True or false: the inverse of the matrix
is
.
Tap to reveal answer
By definition, $AA^{-1} = I$ and $A^{-1}A = I$.
Multiply:

Similarly,

Therefore,
is the inverse of
.
← Didn't Know|Knew It →

Find $A^{-1}$.
Tap to reveal answer
To find the inverse of a matrix $A$, set up an augmented matrix $[A \mid I\,]$, as shown below:

Perform row operations on this matrix until it is in reduced row-echelon form. The following operations are arguably the easiest:

The augmented matrix is now in reduced row-echelon form, $[I \mid A^{-1}]$. The inverse is therefore
.
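As a side note, the augmented-matrix procedure can be sketched in code. Below is a small Gauss-Jordan routine over $[A \mid I\,]$ with an illustrative 2x2 matrix (an assumed example, not the card's matrix):

```python
import numpy as np

def inverse_via_rref(A):
    """Invert A by row-reducing the augmented matrix [A | I]."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])  # form [A | I]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest pivot entry.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                 # scale the pivot row to 1
        for row in range(n):
            if row != col:                    # clear the rest of the column
                M[row] -= M[row, col] * M[col]
    return M[:, n:]                           # left half is now I

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])  # illustrative matrix
A_inv = inverse_via_rref(A)
print(np.allclose(A @ A_inv, np.eye(2)))  # True
```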
← Didn't Know|Knew It →
Consider the mapping
. Can f be an isomorphism?
(Hint: Think about dimension's role in isomorphism)
Tap to reveal answer
No, $f$ cannot be an isomorphism. This is because
and
have different dimensions. Isomorphisms cannot exist between vector spaces of different dimensions.
← Didn't Know|Knew It →
The last question showed us isomorphisms must be between vector spaces of the same dimension. This question now asks about homomorphisms.
Consider the mapping
. Can f be a homomorphism?
Tap to reveal answer
The answer is yes. There is no restriction on dimension for homomorphisms as there is for isomorphisms. Therefore $f$ could be a homomorphism, but it is not guaranteed.
← Didn't Know|Knew It →
In the previous question, we said an isomorphism cannot be between vector spaces of different dimension. But are all homomorphisms between vector spaces of the same dimension an isomorphism?
Consider the homomorphism
. Is f an isomorphism?
Tap to reveal answer
The answer is: not enough information. $f$ could be an isomorphism, since it is between vector spaces of the same dimension, but that alone does not guarantee it.
For example, consider the zero mapping $f(x, y) = (0, 0)$. This mapping is neither onto nor one-to-one, because all elements go to the zero vector. Therefore it is not an isomorphism, even though it is a mapping between spaces of the same dimension.
On the other hand, consider the identity mapping $f(x, y) = (x, y)$. This is an isomorphism: it clearly preserves structure and is both onto and one-to-one.
Thus $f$ could be an isomorphism (e.g., the identity map), or it could fail to be one (e.g., the zero mapping).
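As a side note, for linear maps between spaces of the same finite dimension, "isomorphism" amounts to an invertible (full-rank) matrix. A NumPy sketch contrasting the two examples above, representing each map by its matrix:

```python
import numpy as np

# A linear map R^2 -> R^2 is an isomorphism iff its matrix is invertible
# (full rank), i.e. iff the map is one-to-one and onto.
zero_map = np.zeros((2, 2))   # f(x, y) = (0, 0)
identity_map = np.eye(2)      # f(x, y) = (x, y)

print(np.linalg.matrix_rank(zero_map))      # 0 -> not an isomorphism
print(np.linalg.matrix_rank(identity_map))  # 2 -> an isomorphism
```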
← Didn't Know|Knew It →