Elementary matrix
In mathematics, an elementary matrix is a matrix that differs from the identity matrix by a single elementary row operation.
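A minimal sketch of this idea, assuming NumPy is available (the card itself gives no code): build an elementary matrix from the identity and check that left-multiplying by it performs the corresponding row operation.

```python
import numpy as np

# An elementary matrix is the identity with one row operation applied.
# Here: add 2 * (row 0) to row 2 of the 3x3 identity.
E = np.eye(3)
E[2, 0] = 2.0

A = np.arange(9.0).reshape(3, 3)

# Left-multiplying by E performs the same row operation on A.
row_op = A.copy()
row_op[2] += 2.0 * row_op[0]

assert np.allclose(E @ A, row_op)
```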
Moore–Penrose pseudoinverse
In mathematics, and in particular linear algebra, a pseudoinverse A+ of a matrix A is a generalization of the inverse matrix.
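As an illustration (a sketch assuming NumPy; the matrix and vector are made-up examples): for a tall matrix of full column rank, the pseudoinverse reduces to (AᵀA)⁻¹Aᵀ, and A⁺b is the least-squares solution of Ax = b.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 2.0])

A_pinv = np.linalg.pinv(A)

# Full column rank, so the pseudoinverse matches the normal-equations formula.
assert np.allclose(A_pinv, np.linalg.inv(A.T @ A) @ A.T)

# A+ b is the least-squares solution of A x = b.
x = A_pinv @ b
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```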
Matrix function
In mathematics, a matrix function is a function which maps a matrix to another matrix.
Eigenvalues and eigenvectors
In linear algebra, an eigenvector or characteristic vector of a linear transformation is a non-zero vector that does not change its direction when that linear transformation is applied to it.
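A quick numerical check of the definition (sketch assuming NumPy; the example matrix is arbitrary): each eigenvector v satisfies Av = λv, i.e. the transformation only scales it.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of V are eigenvectors; w holds the corresponding eigenvalues.
w, V = np.linalg.eig(A)

# A v = lambda v: the direction of each eigenvector is unchanged.
for lam, v in zip(w, V.T):
    assert np.allclose(A @ v, lam * v)
```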
Adjugate matrix
In linear algebra, the adjugate, classical adjoint, or adjunct of a square matrix is the transpose of its cofactor matrix.
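The definition can be computed directly from minors (a sketch assuming NumPy; `adjugate` is a hypothetical helper, not a library function), and verified against the key identity A · adj(A) = det(A) · I.

```python
import numpy as np

def adjugate(A):
    """Transpose of the cofactor matrix, computed entry by entry from minors."""
    n = A.shape[0]
    C = np.zeros_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T  # adjugate = transpose of the cofactor matrix

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Key identity: A @ adj(A) = det(A) * I.
assert np.allclose(A @ adjugate(A), np.linalg.det(A) * np.eye(2))
```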
Matrix exponential
In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function.
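The analogy with the scalar exponential is the power series exp(A) = Σ Aᵏ/k!. A truncated-series sketch (assuming NumPy; `expm_taylor` is a hypothetical helper, adequate only for small matrices, not a production algorithm):

```python
import numpy as np

def expm_taylor(A, terms=30):
    """exp(A) = sum_{k>=0} A^k / k!, truncated; fine for small ||A||."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

# For a diagonal matrix, the matrix exponential exponentiates each entry.
D = np.diag([1.0, 2.0])
assert np.allclose(expm_taylor(D), np.diag(np.exp([1.0, 2.0])))
```

For serious use one would reach for a library routine such as `scipy.linalg.expm` rather than a raw Taylor sum.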
Spectral theorem
In mathematics, particularly linear algebra and functional analysis, the spectral theorem is any of a number of results about linear operators or matrices.
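For real symmetric matrices, the finite-dimensional spectral theorem says A = QΛQᵀ with Q orthogonal. A numerical check (sketch assuming NumPy; the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is the routine for symmetric/Hermitian matrices:
# it returns real eigenvalues w and orthonormal eigenvectors Q.
w, Q = np.linalg.eigh(A)

assert np.allclose(Q @ np.diag(w) @ Q.T, A)   # A = Q diag(w) Q^T
assert np.allclose(Q.T @ Q, np.eye(2))        # columns are orthonormal
```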
Cayley–Hamilton theorem
In linear algebra, the Cayley–Hamilton theorem (named after the mathematicians Arthur Cayley and William Rowan Hamilton) states that every square matrix over a commutative ring (such as the real or complex field) satisfies its own characteristic equation.
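The theorem can be checked numerically by substituting A into its own characteristic polynomial (a sketch assuming NumPy; the example matrix is arbitrary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Characteristic polynomial coefficients, highest degree first.
# For this A: lambda^2 - 5*lambda - 2, so coeffs = [1, -5, -2].
coeffs = np.poly(A)

# Substitute A for lambda: p(A) should be the zero matrix.
n = A.shape[0]
p_of_A = sum(c * np.linalg.matrix_power(A, n - k)
             for k, c in enumerate(coeffs))
assert np.allclose(p_of_A, np.zeros((n, n)))
```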
Determinant
In linear algebra, the determinant is a scalar value computed from the elements of a square matrix; it encodes key properties of the associated linear map and is nonzero exactly when the matrix is invertible.
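In the 2 × 2 case the determinant is simply ad − bc; a sketch (assuming NumPy) comparing the hand formula with the library routine and illustrating the singularity test:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# 2x2 case by hand: ad - bc.
by_hand = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
assert np.isclose(np.linalg.det(A), by_hand)

# det = 0 exactly when the matrix is singular (rows linearly dependent).
singular = np.array([[1.0, 2.0],
                     [2.0, 4.0]])
assert np.isclose(np.linalg.det(singular), 0.0)
```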
Strassen algorithm
In linear algebra, the Strassen algorithm, named after Volker Strassen, is an algorithm for matrix multiplication that is asymptotically faster than the standard method: it uses seven rather than eight recursive block multiplications, giving a running time of roughly O(n^2.807) instead of O(n^3).
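One level of Strassen's scheme for a 2 × 2 product can be written out directly (a sketch assuming NumPy; the full algorithm applies these formulas recursively to matrix blocks):

```python
import numpy as np

def strassen_2x2(A, B):
    """One level of Strassen's scheme: 7 multiplications instead of 8."""
    a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]
    e, f, g, h = B[0, 0], B[0, 1], B[1, 0], B[1, 1]
    m1 = (a + d) * (e + h)
    m2 = (c + d) * e
    m3 = a * (f - h)
    m4 = d * (g - e)
    m5 = (a + b) * h
    m6 = (c - a) * (e + f)
    m7 = (b - d) * (g + h)
    return np.array([[m1 + m4 - m5 + m7, m3 + m5],
                     [m2 + m4, m1 - m2 + m3 + m6]])

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
assert np.allclose(strassen_2x2(A, B), A @ B)
```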
Frobenius covariant
In matrix theory, the Frobenius covariants of a square matrix A are special polynomials of it, namely projection matrices Ai associated with the eigenvalues and eigenvectors of A.
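When the eigenvalues are distinct, the covariants have the closed form Aᵢ = Π_{j≠i} (A − λⱼI)/(λᵢ − λⱼ). A sketch (assuming NumPy; `covariant` is a hypothetical helper) verifying that they are idempotent projections summing to the identity:

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [4.0, 2.0]])
eigvals = np.linalg.eigvals(A)   # distinct eigenvalues: 5 and -2

def covariant(i):
    """A_i = prod_{j != i} (A - l_j I) / (l_i - l_j)."""
    n = A.shape[0]
    P = np.eye(n)
    for j, lj in enumerate(eigvals):
        if j != i:
            P = P @ (A - lj * np.eye(n)) / (eigvals[i] - lj)
    return P

covs = [covariant(i) for i in range(len(eigvals))]

# The covariants are projections (P^2 = P) and sum to the identity.
assert np.allclose(sum(covs), np.eye(2))
for P in covs:
    assert np.allclose(P @ P, P)
```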
Trigonometric functions of matrices
The trigonometric functions (especially sine and cosine) for real or complex square matrices occur in solutions of second-order systems of differential equations.
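For a symmetric matrix, sin and cos can be applied through the eigendecomposition, and the scalar identity sin² + cos² = 1 carries over. A sketch (assuming NumPy; `sinm`/`cosm` are hypothetical helpers, though SciPy provides routines of the same names):

```python
import numpy as np

def cosm(A):
    """cos of a symmetric matrix via its eigendecomposition."""
    w, Q = np.linalg.eigh(A)
    return Q @ np.diag(np.cos(w)) @ Q.T

def sinm(A):
    """sin of a symmetric matrix via its eigendecomposition."""
    w, Q = np.linalg.eigh(A)
    return Q @ np.diag(np.sin(w)) @ Q.T

A = np.array([[0.5, 0.2],
              [0.2, 0.3]])

# The identity sin^2 + cos^2 = 1 holds for matrix arguments too.
assert np.allclose(sinm(A) @ sinm(A) + cosm(A) @ cosm(A), np.eye(2))
```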
Change of basis
In linear algebra, a basis for a vector space of dimension n is an ordered set of n vectors (α1, …, αn), called basis vectors, with the property that every vector in the space can be expressed as a unique linear combination of the basis vectors.
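Concretely, if the basis vectors form the columns of a matrix P, the coordinates c of a vector v in that basis are the unique solution of Pc = v (a sketch assuming NumPy; the basis and vector are made-up examples):

```python
import numpy as np

# Basis vectors (1,0) and (1,1) of R^2, as columns of P.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# Coordinates of v in this basis: solve P c = v.
c = np.linalg.solve(P, v)

assert np.allclose(P @ c, v)        # v = c1*b1 + c2*b2
assert np.allclose(c, [1.0, 2.0])   # (3,2) = 1*(1,0) + 2*(1,1)
```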
Matrix polynomial
In mathematics, a matrix polynomial is a polynomial with matrices as variables.
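Evaluating a polynomial at a matrix argument replaces the constant term with a multiple of the identity; a sketch using Horner's rule (assuming NumPy; `matrix_poly` is a hypothetical helper):

```python
import numpy as np

def matrix_poly(coeffs, A):
    """Evaluate p(A) by Horner's rule; coeffs listed highest degree first.
    The constant term contributes a multiple of the identity."""
    n = A.shape[0]
    result = np.zeros((n, n))
    for c in coeffs:
        result = result @ A + c * np.eye(n)
    return result

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# p(X) = X^2 - 2X + I, which factors as (X - I)^2.
P = matrix_poly([1.0, -2.0, 1.0], A)
assert np.allclose(P, (A - np.eye(2)) @ (A - np.eye(2)))
```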
Sylvester's formula
In matrix theory, Sylvester's formula or Sylvester's matrix theorem (named after J. J. Sylvester) or Lagrange−Sylvester interpolation expresses an analytic function f(A) of a matrix A as a polynomial in A, in terms of the eigenvalues and eigenvectors of A.
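For distinct eigenvalues the formula reads f(A) = Σᵢ f(λᵢ)Aᵢ, with Aᵢ the Frobenius covariants. A sketch (assuming NumPy; `covariant` is a hypothetical helper) applying it to f = exp and checking against a direct eigendecomposition:

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [4.0, 2.0]])
eigvals = np.linalg.eigvals(A)   # distinct eigenvalues: 5 and -2

def covariant(i):
    """Frobenius covariant A_i = prod_{j != i} (A - l_j I) / (l_i - l_j)."""
    n = A.shape[0]
    P = np.eye(n)
    for j, lj in enumerate(eigvals):
        if j != i:
            P = P @ (A - lj * np.eye(n)) / (eigvals[i] - lj)
    return P

# Sylvester's formula with f = exp.
expA = sum(np.exp(l) * covariant(i) for i, l in enumerate(eigvals))

# Check against exp computed directly from the eigendecomposition.
w, V = np.linalg.eig(A)
assert np.allclose(expA, V @ np.diag(np.exp(w)) @ np.linalg.inv(V))
```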
Generalized eigenvector
In linear algebra, a generalized eigenvector of an n × n matrix A is a nonzero vector v satisfying (A − λI)^k v = 0 for some eigenvalue λ and some positive integer k — a criterion more relaxed than the condition (A − λI)v = 0 required of an (ordinary) eigenvector.
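A defective matrix makes the distinction concrete (a sketch assuming NumPy): a Jordan block has only one ordinary eigenvector, but a second vector is annihilated by the square of (A − λI).

```python
import numpy as np

# A defective matrix: eigenvalue 2, but only one ordinary eigenvector.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
N = A - lam * np.eye(2)

v1 = np.array([1.0, 0.0])   # ordinary eigenvector: (A - 2I) v1 = 0
v2 = np.array([0.0, 1.0])   # generalized eigenvector: (A - 2I) v2 = v1

assert np.allclose(N @ v1, 0)
assert not np.allclose(N @ v2, 0)   # v2 is not an ordinary eigenvector,
assert np.allclose(N @ N @ v2, 0)   # but (A - 2I)^2 v2 = 0
```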