MATH 310: Applied Linear Algebra

Glossary of Linear Algebra Terms

Thanks to Gene Herman for compiling this Glossary as part of his Math 215 Homepage at Grinnell College.

algebraic multiplicity of an eigenvalue:

The algebraic multiplicity of an eigenvalue c of a matrix A is the number of times the factor (t-c) occurs in the characteristic polynomial of A.

basis for a subspace:

A basis for a subspace W is a set of vectors {v1, ..., vk} in W such that:

1. {v1, ..., vk} is linearly independent; and
2. {v1, ..., vk} spans W.
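
For a quick numerical check (a NumPy sketch, not part of the original glossary): a set of k vectors in Rk is a basis exactly when the matrix having those vectors as columns has rank k, since full rank gives linear independence, and k independent vectors in Rk automatically span Rk.

```python
import numpy as np

# Candidate basis for R2: the columns (1,0) and (1,1).
# Rank 2 means the columns are linearly independent, and two
# independent vectors in R2 also span R2, so this is a basis.
V = np.array([[1.0, 1.0],
              [0.0, 1.0]])
rank = np.linalg.matrix_rank(V)
```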

characteristic polynomial of a matrix:

The characteristic polynomial of an n by n matrix A is the polynomial in t given by the formula det(A - tI).
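
As an illustration (not part of the original glossary), NumPy can produce the coefficients of a matrix's characteristic polynomial. Note that NumPy's convention is det(tI - A), which agrees with det(A - tI) up to a sign of (-1)^n, so the roots are the same.

```python
import numpy as np

# np.poly of a square matrix returns the coefficients of its
# characteristic polynomial in t, highest degree first.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
coeffs = np.poly(A)  # t^2 - 4t + 4, i.e. (t - 2)^2
```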

column space of a matrix:

The column space of a matrix is the subspace spanned by the columns of the matrix considered as a set of vectors. See also: row space.

consistent linear system:

A system of linear equations is consistent if it has at least one solution. See also: inconsistent.
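
For concreteness, a small NumPy sketch (illustration only): the system x + y = 3, y = 1 is consistent because it has the solution x = 2, y = 1.

```python
import numpy as np

# Coefficient matrix and right-hand side of a consistent system.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
b = np.array([3.0, 1.0])
x = np.linalg.solve(A, b)  # the unique solution
```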

defective matrix:

A matrix A is defective if A has an eigenvalue whose geometric multiplicity is less than its algebraic multiplicity.

diagonalizable matrix:

A matrix is diagonalizable if it is similar to a diagonal matrix.

dimension of a subspace:

The dimension of a subspace W is the number of vectors in any basis of W. (If W is the subspace {0}, we say that its dimension is 0.)

echelon form of a matrix:

A matrix is in row echelon form if:

1. all rows that consist entirely of zeros are grouped together at the bottom of the matrix; and
2. the first (counting left to right) nonzero entry in each nonzero row appears in a column to the right of the first nonzero entry in the preceding row (if there is a preceding row).

eigenspace of a matrix:

The eigenspace associated with the eigenvalue c of a matrix A is the null space of A - cI.

eigenvalue of a matrix:

An eigenvalue of a square matrix A is a scalar c such that Ax = cx holds for some nonzero vector x. See also: eigenvector.

eigenvector of a matrix:

An eigenvector of a square matrix A is a nonzero vector x such that Ax = cx holds for some scalar c. See also: eigenvalue.
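
A quick numerical check of these two definitions (a NumPy sketch, not from the original glossary):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)
# Each column v of eigvecs is an eigenvector: Av = cv for the
# matching eigenvalue c in eigvals.
v, c = eigvecs[:, 0], eigvals[0]
```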

elementary matrix:

An elementary matrix is a matrix that is obtained by performing an elementary row operation on an identity matrix.

equivalent linear systems:

Two systems of linear equations in n unknowns are equivalent if they have the same set of solutions.

geometric multiplicity of an eigenvalue:

The geometric multiplicity of an eigenvalue c of a matrix A is the dimension of the eigenspace of c.
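
The two multiplicities can differ; when they do, the matrix is defective. A NumPy sketch (illustration only): the matrix below has characteristic polynomial (t - 2)^2, so the eigenvalue 2 has algebraic multiplicity 2, yet its eigenspace is only 1-dimensional.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
# geometric multiplicity = dimension of the null space of A - 2I
#                        = n - rank(A - 2I)
geo_mult = A.shape[0] - np.linalg.matrix_rank(A - 2.0 * np.eye(2))
# geo_mult (1) < algebraic multiplicity (2), so A is defective.
```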

homogeneous linear system:

A system of linear equations Ax = b is homogeneous if b = 0.

inconsistent linear system:

A system of linear equations is inconsistent if it has no solutions. See also: consistent.

inverse of a matrix:

The matrix B is an inverse for the matrix A if AB = BA = I.
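
A short NumPy check (not part of the original glossary) that a computed inverse satisfies both defining products:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.linalg.inv(A)  # B is the inverse of A: AB = BA = I
```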

invertible matrix:

A matrix is invertible if it has an inverse.

least-squares solution of a linear system:

A least-squares solution to a system of linear equations Ax = b is a vector x that minimizes the length of the vector Ax - b.

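
A NumPy sketch (illustration only): an overdetermined system usually has no exact solution, so we take the x minimizing the length of Ax - b instead.

```python
import numpy as np

# Three equations, two unknowns: no exact solution exists, so
# lstsq returns the x that minimizes ||Ax - b||.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
```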
linear combination of vectors:

A vector v is a linear combination of the vectors v1, ..., vk if there exist scalars a1, ..., ak such that v = a1v1 + ... + akvk.

linear dependence relation for a set of vectors:

A linear dependence relation for the set of vectors {v1, ..., vk} is an equation of the form a1v1 + ... + akvk = 0, where not all the scalars a1, ..., ak are zero.

linearly dependent set of vectors:

The set of vectors {v1, ..., vk} is linearly dependent if the equation a1v1 + ... + akvk = 0 has a solution where not all the scalars a1, ..., ak are zero (i.e., if {v1, ..., vk} satisfies a linear dependence relation).

linearly independent set of vectors:

The set of vectors {v1, ..., vk} is linearly independent if the only solution to the equation a1v1 + ... + akvk = 0 is the solution where all the scalars a1, ..., ak are zero (i.e., if {v1, ..., vk} does not satisfy any linear dependence relation).
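
A NumPy sketch of the dependent case (illustration only): since (2, 4) = 2(1, 2), the relation 2(1, 2) + (-1)(2, 4) = 0 holds, and the rank of the matrix with these vectors as columns drops below the number of vectors.

```python
import numpy as np

# Columns (1,2) and (2,4): the second is twice the first,
# so rank 1 < 2 columns and the set is linearly dependent.
V = np.array([[1.0, 2.0],
              [2.0, 4.0]])
dependent = np.linalg.matrix_rank(V) < V.shape[1]
```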

linear transformation:

A linear transformation from V to W is a function T from V to W such that:

1. T(u+v) = T(u) + T(v) for all vectors u and v in V; and
2. T(av) = aT(v) for all vectors v in V and all scalars a.

nonsingular matrix:

A square matrix A is nonsingular if the only solution to the equation Ax = 0 is x = 0. See also: singular.

null space of a matrix:

The null space of an m by n matrix A is the set of all vectors x in Rn such that Ax = 0.
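
A NumPy sketch (not from the original glossary): the null space of A = [[1, 2], [2, 4]] is the line spanned by (2, -1), since A sends that vector to 0. One way to extract a basis with plain NumPy uses the singular value decomposition: rows of Vt whose singular values are (near) zero.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
U, s, Vt = np.linalg.svd(A)
rank = np.sum(s > 1e-12)        # number of nonzero singular values
nullity = A.shape[1] - rank
null_basis = Vt[rank:].T        # columns span the null space of A
```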

null space of a linear transformation:

The null space of a linear transformation T is the set of vectors v in its domain such that T(v) = 0.

nullity of a matrix:

The nullity of a matrix is the dimension of its null space.

nullity of a linear transformation:

The nullity of a linear transformation is the dimension of its null space.

orthogonal complement of a subspace:

The orthogonal complement of a subspace S of Rn is the set of all vectors v in Rn such that v is orthogonal to every vector in S.

orthogonal set of vectors:

A set of vectors in Rn is orthogonal if the dot product of any two distinct vectors in the set is 0.
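
A NumPy sketch tying together the next three definitions (illustration only): the columns of a rotation matrix are orthonormal, so the matrix Q satisfies Q^T Q = I and is an orthogonal matrix with inverse equal to its transpose.

```python
import numpy as np

# A 2x2 rotation matrix: its columns are mutually orthogonal
# and each has length 1, so they form an orthonormal set.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
```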

orthogonal matrix:

A matrix A is orthogonal if A is invertible and its inverse equals its transpose; i.e., A^(-1) = A^T.

orthogonal linear transformation:

A linear transformation T from V to W is orthogonal if T(v) has the same length as v for all vectors v in V.

orthonormal set of vectors:

A set of vectors in Rn is orthonormal if it is an orthogonal set and each vector has length 1.

range of a linear transformation:

The range of a linear transformation T is the set of all vectors T(v), where v is any vector in its domain.

rank of a matrix:

The rank of a matrix A is the number of nonzero rows in the reduced row echelon form of A; i.e., the dimension of the row space of A.

rank of a linear transformation:

The rank of a linear transformation (and hence of any matrix regarded as a linear transformation) is the dimension of its range. Note: A theorem tells us that the two definitions of rank of a matrix are equivalent.

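
A NumPy sketch (not from the original glossary): the rank and the nullity of an m by n matrix always satisfy rank + nullity = n (the rank-nullity theorem).

```python
import numpy as np

# The second row is twice the first, so the row space is
# 1-dimensional and the rank is 1.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
rank = np.linalg.matrix_rank(A)
nullity = A.shape[1] - rank   # rank-nullity: 1 + 2 = 3 columns
```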
reduced row echelon form of a matrix:

A matrix is in reduced row echelon form if:

1. the matrix is in row echelon form;
2. the first nonzero entry in each nonzero row is the number 1; and
3. the first nonzero entry in each nonzero row is the only nonzero entry in its column.

row equivalent matrices:

Two matrices are row equivalent if one can be obtained from the other by a sequence of elementary row operations.

row operations:

The elementary row operations which can be performed on a matrix are:

1. interchange two rows;
2. multiply a row by a nonzero scalar; and
3. add a constant multiple of one row to another.
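
A NumPy sketch connecting row operations to elementary matrices (illustration only): performing a row operation on the identity gives an elementary matrix, and left-multiplying by that matrix performs the same operation on A.

```python
import numpy as np

# Elementary matrix for interchanging rows 1 and 2:
E = np.eye(2)
E[[0, 1]] = E[[1, 0]]        # swap the rows of the identity

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
swapped = E @ A              # same as swapping the rows of A
```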

row space of a matrix:

The row space of a matrix is the subspace spanned by the rows of the matrix considered as a set of vectors. See also: column space.

similar matrices:

Matrices A and B are similar if there is a square invertible matrix S such that S^(-1)AS = B.
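
A NumPy sketch (not from the original glossary): a diagonalizable matrix is similar to the diagonal matrix of its eigenvalues, with S the matrix whose columns are eigenvectors.

```python
import numpy as np

# A has distinct eigenvalues 4 and 2, so it is diagonalizable.
A = np.array([[4.0, 1.0],
              [0.0, 2.0]])
eigvals, S = np.linalg.eig(A)       # columns of S are eigenvectors
D = np.linalg.inv(S) @ A @ S        # S^(-1) A S is diagonal
```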

singular matrix:

A square matrix A is singular if the equation Ax = 0 has a nonzero solution for x. See also: nonsingular.

span of a set of vectors:

The span of the set of vectors {v1, ..., vk} is the subspace V consisting of all linear combinations of v1, ..., vk. One also says that the subspace V is spanned by the set of vectors {v1, ..., vk} and that this set of vectors spans V.

subspace:

A subset W of Rn is a subspace of Rn if:

1. the zero vector is in W;
2. x+y is in W whenever x and y are in W; and
3. ax is in W whenever x is in W and a is any scalar.

symmetric matrix:

A matrix A is symmetric if it equals its transpose; i.e., A = A^T.