Matrix function
In mathematics, a matrix function is a function that maps a matrix to another matrix.
Elementary matrix
In mathematics, an elementary matrix is a matrix which differs from the identity matrix by a single elementary row operation.
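As a small illustration (plain Python lists; the helper names are ours, not from any library), the elementary matrix for the row operation R1 ← R1 + 2·R0 is obtained by applying that operation to the identity, and left-multiplying by it performs the same operation on any matrix:

```python
def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

# Elementary matrix for the row operation R1 <- R1 + 2*R0 on 3x3 matrices.
E = identity(3)
E[1][0] = 2

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 9]]

EA = matmul(E, A)
# Row 1 of E*A equals row 1 of A plus twice row 0 of A.
```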
Strassen algorithm
In linear algebra, the Strassen algorithm, named after Volker Strassen, is an algorithm for matrix multiplication that is asymptotically faster than the standard schoolbook method.
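At the 2 × 2 base case the algorithm replaces the usual eight multiplications with seven; applied recursively to block matrices, this yields the asymptotic speedup. A minimal sketch of the base case (the function name is ours):

```python
def strassen_2x2(A, B):
    # Strassen's seven products for 2x2 matrices (lists of lists).
    a, b, c, d = A[0][0], A[0][1], A[1][0], A[1][1]
    e, f, g, h = B[0][0], B[0][1], B[1][0], B[1][1]
    p1 = a * (f - h)
    p2 = (a + b) * h
    p3 = (c + d) * e
    p4 = d * (g - e)
    p5 = (a + d) * (e + h)
    p6 = (b - d) * (g + h)
    p7 = (a - c) * (e + f)
    # Recombine the seven products into the four entries of A*B.
    return [[p5 + p4 - p2 + p6, p1 + p2],
            [p3 + p4, p1 + p5 - p3 - p7]]
```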
Cayley–Hamilton theorem
In linear algebra, the Cayley–Hamilton theorem (named after the mathematicians Arthur Cayley and William Rowan Hamilton) states that every square matrix over a commutative ring (such as the field of real or complex numbers) satisfies its own characteristic equation.
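For a 2 × 2 matrix the characteristic polynomial is λ² − tr(A)λ + det(A), so the theorem says A² − tr(A)·A + det(A)·I is the zero matrix. A quick numeric check in plain Python:

```python
A = [[1, 2],
     [3, 4]]
trace = A[0][0] + A[1][1]                    # tr(A) = 5
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # det(A) = -2

# A squared, computed explicitly.
A2 = [[sum(A[i][k] * A[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]

# Cayley-Hamilton: A^2 - tr(A)*A + det(A)*I should be the zero matrix.
I = [[1, 0], [0, 1]]
CH = [[A2[i][j] - trace * A[i][j] + det * I[i][j] for j in range(2)]
      for i in range(2)]
```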
Determinant
In linear algebra, the determinant is a scalar value that can be computed from the elements of a square matrix and encodes key properties of the matrix; in particular, the matrix is invertible precisely when its determinant is nonzero.
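One standard way to compute it is Laplace (cofactor) expansion along the first row. A recursive sketch for small matrices (the function name is ours; this costs O(n!) and LU decomposition is preferred in practice):

```python
def det(M):
    # Laplace expansion along the first row of a square matrix M.
    n = len(M)
    if n == 1:
        return M[0][0]
    # Sum of (-1)^j * M[0][j] * det(minor obtained by deleting row 0, column j).
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j+1:] for row in M[1:]])
               for j in range(n))
```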
Eigenvalues and eigenvectors
In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that is merely scaled when that linear transformation is applied to it; the scale factor is the corresponding eigenvalue.
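Concretely, A v = λ v. For the symmetric matrix [[2, 1], [1, 2]], the vector (1, 1) is an eigenvector with eigenvalue 3 and (1, −1) is one with eigenvalue 1:

```python
A = [[2, 1],
     [1, 2]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

v1 = [1, 1]          # eigenvector for eigenvalue 3
v2 = [1, -1]         # eigenvector for eigenvalue 1
Av1 = matvec(A, v1)  # equals 3 * v1
Av2 = matvec(A, v2)  # equals 1 * v2
```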
Spectral theorem
In mathematics, particularly linear algebra and functional analysis, the spectral theorem is any of a number of results about linear operators or matrices.
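In its simplest form, for real symmetric matrices, the theorem guarantees an orthogonal diagonalization A = QΛQᵀ with orthonormal eigenvectors in the columns of Q. A numeric check for [[2, 1], [1, 2]], whose eigenvalues are 3 and 1 (all names here are illustrative):

```python
import math

s = 1 / math.sqrt(2)
Q = [[s, s],       # columns are orthonormal eigenvectors
     [s, -s]]
L = [[3, 0],       # eigenvalues on the diagonal
     [0, 1]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Qt = [[Q[j][i] for j in range(2)] for i in range(2)]
A = matmul(matmul(Q, L), Qt)   # reconstructs [[2, 1], [1, 2]] up to rounding
```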
Frobenius covariant
In matrix theory, the Frobenius covariants of a square matrix A are special polynomials of it, namely projection matrices Ai associated with the eigenvalues and eigenvectors of A.
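When A is diagonalizable with distinct eigenvalues λᵢ, the covariants can be computed as Aᵢ = ∏_{j≠i} (A − λⱼI)/(λᵢ − λⱼ), and they recover A via the spectral decomposition A = Σ λᵢAᵢ. A 2 × 2 sketch (the helper names are ours):

```python
A = [[2, 1],
     [1, 2]]
l1, l2 = 3, 1   # the eigenvalues of A

I = [[1, 0], [0, 1]]

def scale(M, c):
    return [[c * M[i][j] for j in range(2)] for i in range(2)]

def sub(X, Y):
    return [[X[i][j] - Y[i][j] for j in range(2)] for i in range(2)]

def add(X, Y):
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

# Frobenius covariants: A_i = (A - l_j * I) / (l_i - l_j)
A1 = scale(sub(A, scale(I, l2)), 1 / (l1 - l2))  # projection for eigenvalue 3
A2 = scale(sub(A, scale(I, l1)), 1 / (l2 - l1))  # projection for eigenvalue 1

# Spectral decomposition: A = l1*A1 + l2*A2
recon = add(scale(A1, l1), scale(A2, l2))
```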
Matrix exponential
In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function.
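It is defined by the same power series as eᶻ, namely exp(A) = Σ Aᵏ/k!, which converges for every square matrix. A truncated-series sketch (the function name is ours; production code such as scipy.linalg.expm uses more robust methods):

```python
def mat_exp(A, terms=30):
    # exp(A) = sum_{k>=0} A^k / k!, truncated after `terms` terms.
    n = len(A)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # k = 0: I
    term = [row[:] for row in result]
    for k in range(1, terms):
        # term <- term @ A / k, i.e. A^k / k!
        term = [[sum(term[i][l] * A[l][j] for l in range(n)) / k
                 for j in range(n)] for i in range(n)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result
```

On a diagonal matrix the result is just the entrywise exponential of the diagonal, which gives an easy sanity check.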
Trigonometric functions of matrices
The trigonometric functions (especially sine and cosine) for real or complex square matrices occur in solutions of second-order systems of differential equations.
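They are defined by the usual power series, e.g. sin(A) = Σ (−1)ᵏ A^(2k+1)/(2k+1)!. A truncated-series sketch (the function name is ours), checked on a diagonal matrix where the result is just the entrywise sine of the diagonal:

```python
import math

def mat_sin(A, terms=20):
    # sin(A) = sum_{k>=0} (-1)^k A^(2k+1) / (2k+1)!
    n = len(A)
    def matmul(X, Y):
        return [[sum(X[i][l] * Y[l][j] for l in range(n)) for j in range(n)]
                for i in range(n)]
    A2 = matmul(A, A)
    power = [row[:] for row in A]       # holds A^(2k+1)
    result = [[0.0] * n for _ in range(n)]
    sign, fact, m = 1.0, 1.0, 1         # fact = (2k+1)!, m = 2k+1
    for _ in range(terms):
        result = [[result[i][j] + sign * power[i][j] / fact
                   for j in range(n)] for i in range(n)]
        power = matmul(power, A2)       # next odd power of A
        sign = -sign
        fact *= (m + 1) * (m + 2)       # (2k+1)! -> (2k+3)!
        m += 2
    return result

S = mat_sin([[0.0, 0.0], [0.0, math.pi / 2]])  # diagonal, so sine acts entrywise
```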
Moore–Penrose pseudoinverse
In mathematics, and in particular linear algebra, a pseudoinverse A+ of a matrix A is a generalization of the inverse matrix.
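When A has full column rank, the pseudoinverse has the closed form A⁺ = (AᵀA)⁻¹Aᵀ, and then A⁺A = I. A sketch for a 3 × 2 example (the helper names are ours; this formula is not numerically robust, and SVD-based routines are used in practice):

```python
A = [[1, 0],
     [0, 2],
     [0, 0]]

def transpose(M):
    return [[M[j][i] for j in range(len(M))] for i in range(len(M[0]))]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def inv2(M):
    # Inverse of a 2x2 matrix via the adjugate formula.
    a, b, c, d = M[0][0], M[0][1], M[1][0], M[1][1]
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

At = transpose(A)
A_plus = matmul(inv2(matmul(At, A)), At)  # (A^T A)^{-1} A^T
check = matmul(A_plus, A)                 # should be the 2x2 identity
```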
Change of basis
In linear algebra, a basis for a vector space of dimension n is a set of n vectors (α1, …, αn), called basis vectors, with the property that every vector in the space can be expressed as a unique linear combination of the basis vectors. A change of basis converts the coordinates of a vector relative to one basis into its coordinates relative to another.
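To change from the standard basis, put the new basis vectors as the columns of a matrix B; the coordinates of a vector v in the new basis are then B⁻¹v. A 2-D sketch (all names are illustrative):

```python
# New basis b1 = (1, 1), b2 = (1, -1), as the columns of B.
B = [[1, 1],
     [1, -1]]
v = [3, 1]   # coordinates in the standard basis

# Coordinates of v relative to (b1, b2): solve B c = v via the 2x2 inverse.
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
B_inv = [[B[1][1] / det, -B[0][1] / det],
         [-B[1][0] / det, B[0][0] / det]]
c = [sum(B_inv[i][j] * v[j] for j in range(2)) for i in range(2)]
# Check: c[0]*b1 + c[1]*b2 reproduces v, i.e. 2*(1,1) + 1*(1,-1) = (3,1).
```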
Adjugate matrix
In linear algebra, the adjugate, classical adjoint, or adjunct of a square matrix is the transpose of its cofactor matrix.
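The adjugate satisfies the identity A·adj(A) = det(A)·I, which underlies the cofactor formula for the inverse. A direct sketch (the function names are ours):

```python
def det(M):
    # Laplace expansion along the first row.
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j+1:] for r in M[1:]])
               for j in range(n))

def adjugate(M):
    n = len(M)
    # Cofactor C[i][j] = (-1)^(i+j) * det(minor(i, j)).
    cof = [[(-1) ** (i + j) *
            det([r[:j] + r[j+1:] for k, r in enumerate(M) if k != i])
            for j in range(n)] for i in range(n)]
    # The adjugate is the transpose of the cofactor matrix.
    return [[cof[j][i] for j in range(n)] for i in range(n)]
```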
Matrix polynomial
In mathematics, a matrix polynomial is a polynomial with square matrices as variables.
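Evaluating a polynomial at a square matrix replaces powers of the scalar variable with matrix powers and the constant term with a multiple of I; Horner's scheme works unchanged. A sketch (the function name is ours):

```python
def poly_eval(coeffs, A):
    # Evaluate p(A) with Horner's scheme; coeffs listed highest degree first.
    n = len(A)
    I = [[float(i == j) for j in range(n)] for i in range(n)]
    result = [[0.0] * n for _ in range(n)]
    for c in coeffs:
        # result <- result @ A + c * I
        result = [[sum(result[i][k] * A[k][j] for k in range(n)) + c * I[i][j]
                   for j in range(n)] for i in range(n)]
    return result
```

For example, p(x) = x² + 1 evaluated at [[1, 2], [3, 4]] gives A² + I.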
Generalized eigenvector
In linear algebra, a generalized eigenvector of an n × n matrix is a vector which satisfies certain criteria which are more relaxed than those for an (ordinary) eigenvector.
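For example, the Jordan block [[2, 1], [0, 2]] has eigenvalue 2 with only a one-dimensional eigenspace; the vector w = (0, 1) is a generalized eigenvector of rank 2 because (A − 2I)w ≠ 0 while (A − 2I)²w = 0:

```python
A = [[2, 1],
     [0, 2]]
lam = 2   # the only eigenvalue of this Jordan block

# N = A - lam * I
N = [[A[i][j] - (lam if i == j else 0) for j in range(2)] for i in range(2)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

w = [0, 1]           # candidate generalized eigenvector of rank 2
Nw = matvec(N, w)    # nonzero, so w is not an ordinary eigenvector
NNw = matvec(N, Nw)  # zero vector: (A - 2I)^2 w = 0
```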