A field $F$ has two operations, $+$ and $\cdot$. They must satisfy the following properties for all $a, b, c \in F$
Commutativity, $a + b = b + a$ and $a \cdot b = b \cdot a$
Associativity, $(a + b) + c = a + (b + c)$ and $(a \cdot b) \cdot c = a \cdot (b \cdot c)$
Identity, there are elements $0, 1 \in F$ with $a + 0 = a$ and $a \cdot 1 = a$
Inverses, for each $a$ there is $-a$ with $a + (-a) = 0$, and for each $a \neq 0$ there is $a^{-1}$ with $a \cdot a^{-1} = 1$
Distributivity, $a \cdot (b + c) = a \cdot b + a \cdot c$
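As a concrete check, here is a small Python sketch (not from the original notes; the helper names `add` and `mul` are illustrative) that brute-forces these axioms for the finite field $\mathbb{Z}_5$:

```python
# Brute-force verification of the field axioms for Z_5 (integers mod 5).
p = 5
F = range(p)

def add(a, b): return (a + b) % p
def mul(a, b): return (a * b) % p

assert all(add(a, b) == add(b, a) and mul(a, b) == mul(b, a)
           for a in F for b in F)                         # commutativity
assert all(add(add(a, b), c) == add(a, add(b, c)) and
           mul(mul(a, b), c) == mul(a, mul(b, c))
           for a in F for b in F for c in F)              # associativity
assert all(add(a, 0) == a and mul(a, 1) == a for a in F)  # identities
assert all(any(add(a, b) == 0 for b in F) for a in F)     # additive inverses
assert all(any(mul(a, b) == 1 for b in F)
           for a in F if a != 0)                          # mult. inverses
assert all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
           for a in F for b in F for c in F)              # distributivity
print("Z_5 satisfies the field axioms")
```

Note that $\mathbb{Z}_n$ for composite $n$ fails the multiplicative-inverse check, which is why the modulus must be prime.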
Vector Spaces
A vector space $V$ over a field $F$ must satisfy the following properties for all $x, y, z \in V$ and $a, b \in F$
Commutativity of Addition, $x + y = y + x$
Associativity of Addition, $(x + y) + z = x + (y + z)$
Additive Identity, there is $0 \in V$ with $x + 0 = x$
Additive Inverse, for each $x$ there is $-x$ with $x + (-x) = 0$
Multiplicative Identity, $1x = x$
Associativity of Multiplication, $(ab)x = a(bx)$
Distributivity of Scalars, $(a + b)x = ax + bx$
Distributivity of Vectors, $a(x + y) = ax + ay$
Subspaces
A subspace $W \subseteq V$ must satisfy the following properties
Additive Closure, $x, y \in W \implies x + y \in W$
Multiplicative Closure, $a \in F, x \in W \implies ax \in W$
Zero, $0 \in W$
Inverse, $x \in W \implies -x \in W$
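As a numerical illustration (a spot-check, not a proof; assumes numpy), the following sketch tests the four criteria for $W = \{(x, y, 0)\} \subseteq \mathbb{R}^3$ on random samples:

```python
# Spot-check the subspace criteria for W = {(x, y, 0)} in R^3.
import numpy as np

rng = np.random.default_rng(0)

def in_W(v):
    return np.isclose(v[2], 0.0)   # membership test for W

for _ in range(1000):
    x = np.array([*rng.normal(size=2), 0.0])
    y = np.array([*rng.normal(size=2), 0.0])
    a = rng.normal()
    assert in_W(x + y)             # additive closure
    assert in_W(a * x)             # multiplicative closure
    assert in_W(-x)                # additive inverses
assert in_W(np.zeros(3))           # contains the zero vector
print("W passed the subspace spot-checks")
```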
Linear Dependence
For a vector space $V$ over field $F$, a linear combination of vectors $u_1, \dots, u_n \in V$ and scalars $a_1, \dots, a_n \in F$ is $a_1 u_1 + a_2 u_2 + \dots + a_n u_n$.
The span of $S \subseteq V$ is the set of all linear combinations of vectors in $S$, denoted $\operatorname{span}(S)$. By definition, $\operatorname{span}(\emptyset) = \{0\}$.
A subset $S$ generates $V$ if $\operatorname{span}(S) = V$.
A subset $S$ is linearly dependent if there are vectors $u_1, \dots, u_n \in S$ and scalars $a_1, \dots, a_n$, not all zero, such that $a_1 u_1 + \dots + a_n u_n = 0$. Otherwise, $S$ is linearly independent.
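A practical way to test dependence, sketched below with sympy (an added tool, not part of the notes): the columns of a matrix are linearly dependent exactly when $Ax = 0$ has a nonzero solution, i.e. when the nullspace is nontrivial.

```python
# Linear dependence test: columns are dependent iff nullspace(A) != {0}.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6],
            [7, 8, 9]])            # columns are the vectors to test

null = A.nullspace()
print("dependent" if null else "independent")
print("dependence relation:", null[0].T)   # here (1, -2, 1): c1 - 2c2 + c3 = 0
```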
Bases and Dimension
A basis $\beta$ for $V$ is a linearly independent subset of $V$ that generates $V$.
Replacement Theorem
For $V$ generated by $G$ containing $n$ vectors, take a linearly independent $L \subseteq V$ with $m$ vectors. Then $m \leq n$ and there is $H \subseteq G$ with $n - m$ vectors such that $L \cup H$ generates $V$.
The dimension of $V$ is the size of any basis, denoted $\dim(V)$. A vector space is finite-dimensional if there is a basis with a finite number of vectors.
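To make this concrete, a short sympy sketch (illustrative, not from the notes): `columnspace` extracts a maximal linearly independent subset of the columns, and `rank` gives the dimension of their span.

```python
# A basis for span(S) and its dimension, where S is the set of columns.
from sympy import Matrix

S = Matrix([[1, 2, 3],
            [0, 1, 1],
            [1, 3, 4]])            # third column = first + second

basis = S.columnspace()            # a basis for span of the columns
print("dim span(S) =", S.rank())   # 2
for b in basis:
    print(b.T)
```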
Linear Transformations
A linear transformation from $V$ to $W$ is denoted $T : V \to W$ if the following are satisfied: $T(x + y) = T(x) + T(y)$ and $T(cx) = cT(x)$ for all $x, y \in V$ and $c \in F$.
Some additional properties of $T$ are $T(0) = 0$, $T(x - y) = T(x) - T(y)$, and that $T$ is linear if and only if $T(cx + y) = cT(x) + T(y)$.
The identity transformation over $V$ is $I_V(x) = x$, and the zero transformation is $T_0(x) = 0$.
For a given linear transformation $T : V \to W$, the null space $N(T)$ is the set of all vectors $x \in V$ such that $T(x) = 0$, and the range $R(T)$ is the set of all possible output vectors $\{T(x) : x \in V\}$.
From linearity, the null space and range are subspaces.
Dimension Theorem
For linear $T : V \to W$ with $V$ finite-dimensional, $\operatorname{nullity}(T) + \operatorname{rank}(T) = \dim(V)$, where $\operatorname{nullity}(T) = \dim(N(T))$ and $\operatorname{rank}(T) = \dim(R(T))$.
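A quick numerical confirmation (sympy, added for illustration) for the left-multiplication operator $L_A : F^4 \to F^3$:

```python
# Verify nullity(L_A) + rank(L_A) = dim(F^4) for a sample matrix.
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [0, 1, 1, 0],
            [1, 3, 1, 1]])         # third row = first + second

nullity = len(A.nullspace())       # dim N(L_A)
rank = A.rank()                    # dim R(L_A)
assert nullity + rank == A.cols    # dim of the domain = number of columns
print(f"nullity={nullity}, rank={rank}, dim V={A.cols}")
```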
For linear transformations $T : V \to W$ and $U : W \to Z$, the composition can be denoted $U \circ T$ or $UT$.
For linear $T$ and $U$, both the sum $T + U$ (with $T, U : V \to W$) and the composition $UT$ are linear.
$\mathcal{L}(V, W)$ is the set of all linear transformations from $V$ to $W$, abbreviated $\mathcal{L}(V)$ when $W = V$.
Matrix Representations
An ordered basis $\beta$ of $V$ is a basis with a specific order, or a finite sequence of linearly independent vectors that generates $V$.
For basis $\beta = \{u_1, \dots, u_n\}$ and $x = \sum_{i=1}^{n} a_i u_i$, the coordinate vector of $x$ relative to $\beta$ is $[x]_\beta = (a_1, \dots, a_n)^T$.
For a linear transformation $T : V \to W$ with ordered basis $\beta = \{v_1, \dots, v_n\}$ of $V$ and $\gamma = \{w_1, \dots, w_m\}$ of $W$, write $T(v_j) = \sum_{i=1}^{m} a_{ij} w_i$; the matrix representation is $[T]_\beta^\gamma = (a_{ij})$, or $[T]_\beta$ if $\beta = \gamma$.
$M_{m \times n}(F)$ is the set of all $m \times n$ matrices over the field $F$.
Note that $[T + U]_\beta^\gamma = [T]_\beta^\gamma + [U]_\beta^\gamma$, $[aT]_\beta^\gamma = a[T]_\beta^\gamma$, and $[T(x)]_\gamma = [T]_\beta^\gamma [x]_\beta$.
For a matrix $A \in M_{m \times n}(F)$, $L_A : F^n \to F^m$ with $L_A(x) = Ax$ is the left-multiplication transformation. In this sense, matrix multiplication is analogous to composition of transformations: $L_{AB} = L_A L_B$.
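The analogy can be checked directly (numpy, illustrative): multiplying the matrices first and composing the transformations give the same result on any input.

```python
# L_{AB}(x) == L_A(L_B(x)): matrix product corresponds to composition.
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-3, 4, size=(2, 3))
B = rng.integers(-3, 4, size=(3, 4))
x = rng.integers(-3, 4, size=4)

assert np.array_equal((A @ B) @ x, A @ (B @ x))
print("L_{AB}(x) == L_A(L_B(x)) for this sample")
```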
Invertibility
For linear $T : V \to W$, $U : W \to V$ is the inverse if $TU = I_W$ and $UT = I_V$. If $T$ has an inverse, it is invertible, and the inverse $T^{-1}$ is unique. For invertible functions $T$ and $U$, the following properties hold: $(TU)^{-1} = U^{-1}T^{-1}$ and $(T^{-1})^{-1} = T$.
For vector spaces $V$ and $W$, if there is an invertible $T : V \to W$, then $V$ and $W$ are isomorphic and $T$ is an isomorphism.
For vector spaces $V$ and $W$ over $F$ with dimension $n$ and $m$ respectively, take $\beta$ and $\gamma$ to be their ordered bases. Then $\Phi : \mathcal{L}(V, W) \to M_{m \times n}(F)$ given by $\Phi(T) = [T]_\beta^\gamma$ is an isomorphism.
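As a worked instance of $\Phi(T) = [T]_\beta^\gamma$ (a sketch with sympy; the helper `coords` is illustrative), here is the matrix of the derivative operator on $P_3(\mathbb{R})$ relative to the ordered basis $\{1, x, x^2, x^3\}$, built column by column from the coordinate vectors of $T(v_j)$:

```python
# [T]_beta for T = d/dx on P_3(R), beta = {1, x, x^2, x^3}.
from sympy import Matrix, Poly, symbols, diff

x = symbols("x")
basis = [x**0, x**1, x**2, x**3]

def coords(p):
    """Coordinate vector [p]_beta, low-degree coefficients first."""
    c = Poly(p, x).all_coeffs()[::-1]
    return [c[i] if i < len(c) else 0 for i in range(4)]

cols = [coords(diff(b, x)) for b in basis]   # [T(v_j)]_beta for each j
T = Matrix(cols).T                           # columns are the coordinate vectors
print(T)   # Matrix([[0,1,0,0],[0,0,2,0],[0,0,0,3],[0,0,0,0]])
```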
Systems
Elementary Matrix Operations
For an $m \times n$ matrix $A$, the following are elementary row operations (these also exist for columns)
interchanging two rows
multiplying a row by a nonzero scalar
adding a scalar multiple of one row to another
Each elementary operation has a corresponding elementary matrix, obtained by performing that operation on the identity matrix; performing the row operation on $A$ is equivalent to left multiplication by the elementary matrix, and the column operation to right multiplication.
Elementary matrices are invertible, and any invertible matrix can be written as a finite product of elementary matrices.
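A small numpy illustration (not from the notes): build an elementary matrix by applying a row operation to the identity, and confirm that left multiplication performs the same operation on $A$.

```python
# Elementary matrix = identity with one row op; E @ A applies that op.
import numpy as np

A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])

E = np.eye(3)
E[2] += -5 * E[0]            # row op on I: add -5 * (row 0) to row 2

expected = A.copy()
expected[2] += -5 * expected[0]
assert np.allclose(E @ A, expected)
print(np.linalg.inv(E))      # also elementary: adds +5 * (row 0) to row 2
```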
Systems
A system of $m$ linear equations in $n$ unknowns can be represented as a matrix equation $Ax = b$. This system is homogeneous if $b = 0$, and nonhomogeneous otherwise.
Let $K$ be the solution set of $Ax = b$ and $K_H$ the solution set of the homogeneous system $Ax = 0$. Then $K = \{s\} + K_H$ for any solution $s$ to $Ax = b$.
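A concrete sketch of this structure with sympy (the particular solution chosen here is just one convenient example):

```python
# K = {s} + K_H: general solution = particular solution + nullspace.
from sympy import Matrix, symbols

A = Matrix([[1, 2, 1],
            [2, 4, 0]])
b = Matrix([3, 4])

s = Matrix([2, 0, 1])          # one particular solution of Ax = b
assert A * s == b
K_H = A.nullspace()            # here spanned by (-2, 1, 0)^T
t = symbols("t")
general = s + t * K_H[0]       # every solution has this form
assert A * general == b        # holds identically in t
print(general.T)
```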
Determinants
$\tilde{A}_{ij}$ is the $(n-1) \times (n-1)$ matrix obtained by removing the $i$th row and $j$th column from the $n \times n$ matrix $A$.
The determinant $\det(A)$ for $A \in M_{n \times n}(F)$ is given as follows. If $n = 1$, then $\det(A) = A_{11}$. For $n \geq 2$, cofactor expansion along row $i$ gives $\det(A) = \sum_{j=1}^{n} (-1)^{i+j} A_{ij} \det(\tilde{A}_{ij})$.
The scalar $(-1)^{i+j} \det(\tilde{A}_{ij})$ is the cofactor of $A$ in row $i$, column $j$.
When applying elementary row operations to $A$, the determinant changes in predictable ways
If $B$ is obtained by interchanging two rows of $A$, then $\det(B) = -\det(A)$
If $B$ is obtained by scaling a row of $A$ by $k$, then $\det(B) = k\det(A)$
If $B$ is obtained by adding a multiple of one row to another in $A$, then $\det(B) = \det(A)$
For $A, B \in M_{n \times n}(F)$, $\det(AB) = \det(A)\det(B)$. If $A$ is invertible, then $\det(A) \neq 0$ and $\det(A^{-1}) = \frac{1}{\det(A)}$.
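These identities are easy to spot-check numerically (numpy, random matrices; a sanity check rather than a proof):

```python
# Spot-check the determinant rules on a random 4x4 matrix.
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(4, 4))
B = rng.normal(size=(4, 4))

swap = A[[1, 0, 2, 3], :]                  # interchange rows 0 and 1
assert np.isclose(np.linalg.det(swap), -np.linalg.det(A))

scaled = A.copy(); scaled[2] *= 7.0        # scale a row by k = 7
assert np.isclose(np.linalg.det(scaled), 7.0 * np.linalg.det(A))

added = A.copy(); added[3] += 2.5 * A[0]   # add a multiple of one row
assert np.isclose(np.linalg.det(added), np.linalg.det(A))

assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))
assert np.isclose(np.linalg.det(np.linalg.inv(A)),
                  1.0 / np.linalg.det(A))
print("all determinant identities hold on this sample")
```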
Cramer's Rule
For the system $Ax = b$ of $n$ linear equations in $n$ unknowns, if $\det(A) \neq 0$, then the system has a unique solution, where for each $k = 1, \dots, n$, $x_k = \frac{\det(M_k)}{\det(A)}$
where $M_k$ is the matrix obtained by replacing column $k$ of $A$ by $b$.
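A direct implementation of the rule (illustrative; the function name `cramer` is chosen here), checked against numpy's solver:

```python
# Cramer's rule: x_k = det(M_k) / det(A), M_k = A with column k replaced by b.
import numpy as np

def cramer(A, b):
    d = np.linalg.det(A)
    if np.isclose(d, 0.0):
        raise ValueError("det(A) = 0: Cramer's rule does not apply")
    x = np.empty(len(b))
    for k in range(len(b)):
        M = A.copy()
        M[:, k] = b              # replace column k of A by b
        x[k] = np.linalg.det(M) / d
    return x

A = np.array([[2., 1., 1.],
              [1., 3., 2.],
              [1., 0., 0.]])
b = np.array([4., 5., 6.])
assert np.allclose(cramer(A, b), np.linalg.solve(A, b))
print(cramer(A, b))
```

Cramer's rule is mainly of theoretical interest; Gaussian elimination is far cheaper for actually solving systems.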
Diagonalization
Eigenvalues and Eigenvectors
A linear operator $T : V \to V$ is diagonalizable if there is some ordered basis $\beta$ such that $[T]_\beta$ is a diagonal matrix.
A nonzero vector $v \in V$ is an eigenvector of $T$ if there is a scalar eigenvalue $\lambda$ such that $T(v) = \lambda v$. The set of all eigenvectors for an eigenvalue $\lambda$, together with $0$, is the eigenspace $E_\lambda = N(T - \lambda I)$.
If a matrix $A$ is diagonalizable, then taking a basis $\{v_1, \dots, v_n\}$ of eigenvectors and then constructing the matrix $Q = [v_1 \mid \cdots \mid v_n]$, $D = Q^{-1}AQ$ will be a diagonal matrix, made up of the eigenvalues of $A$.
If $v$ is an eigenvector of $T$ for the eigenvalue $\lambda$, then $[v]_\beta$ is also an eigenvector of $[T]_\beta$ for the eigenvalue $\lambda$.
The characteristic polynomial of $A$ is given by $f(t) = \det(A - tI)$. This polynomial splits if there are scalars $c, a_1, \dots, a_n$ such that $f(t) = c(t - a_1)(t - a_2) \cdots (t - a_n)$. If the corresponding linear operator is diagonalizable, then its characteristic polynomial splits (the converse does not hold in general).
The algebraic multiplicity of $\lambda$ is the largest $k$ such that $(t - \lambda)^k$ divides $f(t)$. The geometric multiplicity is the dimension of the corresponding eigenspace, $\dim(E_\lambda)$.
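Both multiplicities can be read off with sympy (illustrative): `eigenvects()` returns each eigenvalue with its algebraic multiplicity and a basis of its eigenspace, whose length is the geometric multiplicity.

```python
# Characteristic polynomial and both multiplicities for a sample matrix.
from sympy import Matrix, symbols

t = symbols("t")
A = Matrix([[2, 1, 0],
            [0, 2, 0],
            [0, 0, 3]])

print(A.charpoly(t).as_expr())   # t**3 - 7*t**2 + 16*t - 12 = (t-2)**2 (t-3)
for lam, alg, basis in A.eigenvects():
    print(f"lambda={lam}: algebraic={alg}, geometric={len(basis)}")
# lambda=2 has algebraic multiplicity 2 but geometric multiplicity 1,
# so this A is not diagonalizable.
```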
Invariant Subspaces
For a linear operator $T$ on $V$, $W \subseteq V$ is a $T$-invariant subspace if $T(W) \subseteq W$. For some $x \in V$, the $T$-cyclic subspace of $V$ generated by $x$ is $\operatorname{span}(\{x, T(x), T^2(x), \dots\})$.
The characteristic polynomial of $T_W$ (the restriction of $T$ to a $T$-invariant subspace $W$) divides the characteristic polynomial of $T$.
The Cayley-Hamilton theorem states that applying the characteristic polynomial $f$ to its linear operator returns the zero transformation, $f(T) = T_0$, and the same result holds for the respective matrix, $f(A) = O$.
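An exact verification for a small example (sympy; the coefficients below are those of this particular matrix's characteristic polynomial):

```python
# Cayley-Hamilton: substituting A into f(t) = det(tI - A) gives 0.
from sympy import Matrix, symbols, eye, zeros

t = symbols("t")
A = Matrix([[1, 2],
            [3, 4]])

f = A.charpoly(t).as_expr()
print("characteristic polynomial:", f)   # t**2 - 5*t - 2
fA = A**2 - 5*A - 2*eye(2)               # f evaluated at the matrix A
assert fA == zeros(2, 2)
print("f(A) = 0, as Cayley-Hamilton predicts")
```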
Jordan Canonical Form
A Jordan block is a square matrix with a constant $\lambda$ on the diagonal and 1s on the superdiagonal.
A Jordan canonical form is a block-diagonal matrix whose diagonal blocks are Jordan blocks, with every other entry 0. A Jordan canonical basis for $T$ is a basis $\beta$ such that $[T]_\beta$ is a Jordan canonical form.
A generalized eigenvector $x$ of $T$ corresponding to $\lambda$ satisfies $(T - \lambda I)^p(x) = 0$ for some positive integer $p$. The generalized eigenspace is denoted $K_\lambda$.
For some generalized eigenvector $x$, the cycle of generalized eigenvectors is $\{(T - \lambda I)^{p-1}(x), (T - \lambda I)^{p-2}(x), \dots, (T - \lambda I)(x), x\}$, where $p$ is the minimal integer such that $(T - \lambda I)^p(x) = 0$. The initial vector is $(T - \lambda I)^{p-1}(x)$, and the end vector is $x$. This is a cycle of length $p$.
Two matrices $A$ and $B$ are similar if $B = Q^{-1}AQ$ for some invertible $Q$; equivalently, when their characteristic polynomials split, they are similar exactly when a choice of basis brings them to the same Jordan canonical form (up to the order of the blocks).
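sympy can compute the Jordan canonical form directly (an illustrative example): `jordan_form()` returns $(P, J)$ with $A = PJP^{-1}$, so comparing the $J$'s, up to the order of the blocks, tests similarity.

```python
# Jordan canonical form of a non-diagonalizable 2x2 matrix.
from sympy import Matrix

A = Matrix([[3, 1],
            [-1, 1]])          # single eigenvalue 2, one eigenvector

P, J = A.jordan_form()
print(J)                       # Matrix([[2, 1], [0, 2]]): one 2x2 Jordan block
assert A == P * J * P.inv()
```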