

MATH0047: ADVANCED LINEAR ALGEBRA
CHEATSHEET
CHAPTER 1: Matrices and Linear Equations
Size of matrix: m x n (m rows, n columns)
Matrix Multiplication: (AB)ij = Ai1*B1j + Ai2*B2j + ... + Ain*Bnj (defined when A is m x n and B is n x p)
Trace: tr(A) = sum of entries along the main diagonal
Transpose: A^T flips the matrix - rows become columns and columns become rows
Commutative Addition Property: A + B = B + A
Distributive Property: A(B + C) = AB + AC and (A + B)C = AC + BC
Associative Property: (AB)C = A(BC)
Multiplication is NOT Commutative: in general AB ≠ BA
Transpose of Transpose: (A^T)^T = A
Transpose of Sum: (A + B)^T = A^T + B^T
Transpose of Product: (AB)^T = B^T A^T
Trace of Product: tr(AB) = tr(BA)
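
A quick numerical check of the identities above, as a minimal NumPy sketch (the matrices A and B are arbitrary examples chosen here for illustration, not taken from the notes):

import numpy as np

A = np.array([[1., 2.], [3., 4.]])   # example matrices, chosen only for illustration
B = np.array([[0., 1.], [5., 2.]])

print(np.trace(A))                                    # trace: 1 + 4 = 5
print(np.allclose((A @ B).T, B.T @ A.T))              # (AB)^T = B^T A^T
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))   # tr(AB) = tr(BA)
print(np.allclose(A @ B, B @ A))                      # False here: multiplication is not commutative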
Inverse Matrices:
• A (n x n) has inverse B (n x n) iff AB = BA = I (the n x n identity)
• If A and B are invertible (n x n) matrices, AB will also be invertible, with inverse (AB)^-1 = B^-1 A^-1
Gauss-Jordan Elimination: reduction to reduced row echelon form
https://tinyurl.com/gaussjordan1
Conditions for Reduced row echelon form:
1) Each zero row appears below all non-zero rows
2) For each non-zero row, the first (leftmost) non-zero entry is a 1
3) The leading 1 of the lower row is to the right of the leading 1 of the row above
4) Each column that contains a leading 1 has zeros everywhere else
Without (4), the matrix is in row echelon form (NOT reduced)
Possible operations:
1) Multiply a row by a non-zero number
2) Add λ times a row to another row
3) Permute rows i and j
• A system of real linear equations can have 0, 1 or ∞ solutions
• Can generate ∞ solutions from 2 distinct solutions by taking linear combinations whose coefficients add up to 1 (see the sketch below)
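
A minimal SymPy sketch of Gauss-Jordan reduction of an augmented system (the coefficients are made-up illustrative values; Matrix.rref returns the reduced row echelon form and the pivot columns):

from sympy import Matrix

# augmented matrix [A | b] for an illustrative system of 3 equations in 3 unknowns
M = Matrix([[1, 2, -1, 3],
            [2, 4,  0, 8],
            [0, 0,  1, 1]])

R, pivots = M.rref()   # Gauss-Jordan: reduced row echelon form and pivot columns
print(R)               # leading 1s with zeros above and below each pivot
print(pivots)          # (0, 2): the second column has no pivot (free variable), so infinitely many solutions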
Homogeneous System:
• Ax = 0; can have 1 solution (only the zero solution) or ∞ solutions
Elementary Matrices
• Multiplying (on the left) by an elementary matrix corresponds to a row operation
• Conduct the row operation on the identity matrix to obtain the elementary matrix A; now AB is the same as doing the row operation on B
Forms of elementary matrices:
Same as the identity matrix I except that -
1) Diagonal matrix: one entry on the diagonal is a non-zero number other than 1
2) Elementary row operation: one non-diagonal entry is non-zero
3) Permute/Swap: rows i and j are interchanged
Finding the inverse of A
• Take the augmented matrix with A on the LHS and I on the RHS
• After row reduction, if the LHS is I, then the RHS is the inverse
• To get A^-1: flip the order of the operations and invert them
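
A minimal SymPy sketch of this augmented-matrix method (the 2 x 2 matrix A is an invertible example chosen for illustration):

from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])            # invertible example matrix

aug = A.row_join(eye(2))        # augmented matrix [A | I]
R, _ = aug.rref()               # Gauss-Jordan row reduction
A_inv = R[:, 2:]                # the left block is I, so the right block is A^-1
print(A_inv)
print(A * A_inv == eye(2))      # True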
Properties of invertible matrix A:
1. There is a unique solution to Ax = b for every b
2. The reduced row echelon form of A is In
3. A can be expressed as a product of elementary matrices
CHAPTER 2: Determinants
Det = 2: volume of the object has doubled
Det < 0: reflection/flip
Det = 0: object has shrunk completely (a dimension is lost)
Sign(σ) = (−1)^k where k is the number of inversions
• Inversion: a pair in the permutation where a larger number precedes a smaller one
• Ex: (3,2,1) has 3 inversions
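
A small Python sketch of counting inversions to get the sign of a permutation (the helper function sign is hypothetical, written here only to illustrate the definition):

from itertools import combinations

def sign(perm):
    # count inversions: positions i < j where a larger number precedes a smaller one
    k = sum(1 for i, j in combinations(range(len(perm)), 2) if perm[i] > perm[j])
    return (-1) ** k

print(sign((3, 2, 1)))   # 3 inversions, so sign = (-1)^3 = -1
print(sign((1, 2, 3)))   # 0 inversions, so sign = +1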
Elementary Product
• Exactly 1 entry from each row and each column
• Ex: for a 3 x 3 matrix, a11*a22*a33 and a13*a21*a32 are elementary products
Determinant of a Square Matrix
• Sum of all signed elementary products of A
If A is an n x n matrix:
Minor of A: Mij = the determinant of the matrix obtained by deleting row i and column j of A
Cofactor of an entry Aij: Cij = (−1)^(i+j) * Mij
Cofactor Expansion Method
Cofactor expansion along row 1: det(A) = a11*C11 + a12*C12 + ... + a1n*C1n
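
A short Python sketch of cofactor expansion along row 1 (plain nested lists, recursing on the minors; the example matrices are illustrative):

def det_cofactor(A):
    # determinant by cofactor expansion along row 1 (index 0)
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j+1:] for row in A[1:]]   # delete row 1 and column j+1
        total += A[0][j] * (-1) ** j * det_cofactor(minor)
    return total

print(det_cofactor([[1, 2], [3, 4]]))                     # 1*4 - 2*3 = -2
print(det_cofactor([[2, 0, 0], [1, 3, 0], [4, 5, 6]]))    # lower triangular: 2*3*6 = 36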
Determinant results:
1. If A has a zero row or column, detA=0
2.
3. For upper triangular, lower triangular and diagonal matrices, det(A) = product of entries along the diagonal
4.
5. det(A^-1) = 1/det(A)
But note: det(A+B) ≠ det(A) + det(B)
Operations and their effects on det(A):
1) Multiply a row by λ: det(A) is multiplied by λ
2) Add λ times one row to another row: det(A) is unchanged
3) Permute rows i and j: det(A) changes sign
A is invertible iff det(A)≠0
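
A minimal NumPy check of these determinant facts (A and B are arbitrary illustrative matrices):

import numpy as np

A = np.array([[2., 1.], [5., 3.]])
B = np.array([[1., 4.], [0., 2.]])

print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # det(AB) = det(A)det(B)
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))      # det(A^-1) = 1/det(A)
print(np.linalg.det(A + B), np.linalg.det(A) + np.linalg.det(B))              # not equal in general
print(np.isclose(np.linalg.det(A), 0))                                        # False, so A is invertible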
CHAPTER 3: Vector Spaces
• Multiplication of a vector by a constant λ can be interpreted as a 'stretch'
• A minus sign flips/reflects the vector
Real Vector Spaces
Dot/Inner Product: <u, v> = u1v1 + u2v2 + ... + unvn
Properties of Inner Product (for real vectors): <u, v> = <v, u>; linear in each argument; <u, u> ≥ 0, with equality iff u = 0
Length/Norm of vector: ||u|| = sqrt(<u, u>)
Orthogonal Vectors: <u, v> = 0
Angle b/w 2 vectors (dot product): cos(θ) = <u, v> / (||u|| ||v||)
Cauchy-Schwarz Inequality: |<u, v>| ≤ ||u|| ||v||
Triangle Inequality: ||u + v|| ≤ ||u|| + ||v||
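
A small NumPy sketch of these quantities (the vectors u and v are arbitrary examples):

import numpy as np

u = np.array([1., 2., 2.])
v = np.array([3., 0., 4.])

dot = np.dot(u, v)                                     # inner product
norm_u, norm_v = np.linalg.norm(u), np.linalg.norm(v)

print(np.degrees(np.arccos(dot / (norm_u * norm_v))))  # angle between u and v
print(abs(dot) <= norm_u * norm_v)                     # Cauchy-Schwarz
print(np.linalg.norm(u + v) <= norm_u + norm_v)        # triangle inequality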
Complex Vector Spaces
Length of Complex Number a: |a| = sqrt(a * conj(a)) = sqrt(Re(a)^2 + Im(a)^2)
Inner Product of complex vectors
Properties of Inner Product for Complex Vectors
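
A minimal NumPy sketch of a complex inner product and norm. The notes' exact convention is not shown here; this sketch conjugates the second vector, and u, v are arbitrary example vectors:

import numpy as np

u = np.array([1 + 1j, 2 - 1j])
v = np.array([3j, 1 + 2j])

inner = np.sum(u * np.conj(v))   # <u, v> with the second argument conjugated (conventions differ)
print(inner)

# <u, u> is real and non-negative, and ||u|| = sqrt(<u, u>)
print(np.sqrt(np.sum(u * np.conj(u)).real), np.linalg.norm(u))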
Polynomials (degree ≤ 2)
Inner Product for Polynomials
Properties
Norm of Polynomial: ||p|| = sqrt(<p, p>)
Subspace of Rn is a subset S of Rn satisfying:
i) The 0 vector belongs to S
ii) If u and v belong to S, then u + v belongs to S
iii) If u belongs to S and λ is a scalar, then λu belongs to S
Span:
• A set of vectors {v1, ..., vm} spans a subspace V if every vector in V can be written in the form c1v1 + c2v2 + ... + cmvm
• Check whether a vector lies in the span of 2 vectors using row reduction
Linear Independence
• If Ax = 0 ONLY has the zero solution, the vectors (the columns of A) are linearly independent
• Otherwise, they are linearly dependent
• Independent when A is an invertible matrix [i.e., det(A) ≠ 0]
• The reduced row echelon form should be the identity matrix for linearly independent vectors (see the sketch below)
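
A SymPy sketch of both checks: whether a vector lies in a span, and whether vectors are linearly independent (v1, v2 and w are made-up example vectors):

from sympy import Matrix

v1, v2 = Matrix([1, 2, 0]), Matrix([0, 1, 1])
w = Matrix([2, 5, 1])

# is w in span{v1, v2}?  Row-reduce the augmented matrix [v1 v2 | w]
aug = v1.row_join(v2).row_join(w)
R, pivots = aug.rref()
print(pivots)              # no pivot in the last column, so the system is consistent: w is in the span

# linear independence: Ax = 0 has only the zero solution iff the rank equals the number of columns
A = v1.row_join(v2)
print(A.rank() == A.cols)  # True: v1 and v2 are linearly independent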
BASIS:
• Can obtain any vector in Rn through a linear combination of the vectors in the set
• Set {v1,...,vm} is a basis for V (subspace of Rn) if
o Set {v1,...,vm} spans V
o v1,...,vm are linearly independent
Dimension: Dim(V)= no. of vectors in a basis for V
Orthonormal Vectors:
1. Each vector has norm 1
2. Any 2 vectors are orthogonal to each other
Gram-Schmidt (Orthonormalization) Process
For 2 vectors u1 and u2
1. Reference vector: w1 = u1
2. Find w2 = u2 − (<u2, w1> / <w1, w1>) w1 (remove from u2 its component along w1)
3. Make norms 1: e1 = w1/||w1||, e2 = w2/||w2||
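
A NumPy sketch of these three steps for two example vectors (u1 and u2 are arbitrary choices):

import numpy as np

u1 = np.array([3., 1.])
u2 = np.array([2., 2.])

w1 = u1                                                    # step 1: reference vector
w2 = u2 - (np.dot(u2, w1) / np.dot(w1, w1)) * w1           # step 2: remove the component along w1
e1, e2 = w1 / np.linalg.norm(w1), w2 / np.linalg.norm(w2)  # step 3: make norms 1

print(np.dot(e1, e2))                          # ~0: orthogonal
print(np.linalg.norm(e1), np.linalg.norm(e2))  # both 1: orthonormal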
Orthogonal Complement:
V⊥ = the set of all vectors in Rn that are orthogonal to every vector in V
If V is a subspace of Rn, then V⊥ will also be a subspace of Rn
Fourier Series: if {e1, ..., en} is an orthonormal basis of V, any v in V can be written v = <v, e1>e1 + ... + <v, en>en
CHAPTER 4: Linear Maps
Linear Map
• Linear Map T from Rn to Rm satisfying
i) T(0) = 0
ii) T(u + v) = T(u) + T(v)
iii) T(λu) = λT(u)
Linear Map → matrix: the columns of the matrix are T(e1), ..., T(en), the images of the standard basis vectors
Kernel: set of vectors that the map sends to 0
Image: set of vectors attained by the map
Nullity: dimension of the kernel
Rank: dimension of the image
Rank(T) + Null(T) = n
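
A SymPy sketch of kernel, image and the rank-nullity relation (the matrix A is an illustrative linear map from R3 to R2):

from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 2]])        # example map from R^3 to R^2

kernel = A.nullspace()          # basis vectors of the kernel
image = A.columnspace()         # basis vectors of the image

rank, nullity = len(image), len(kernel)
print(rank, nullity, rank + nullity == A.cols)   # Rank(T) + Null(T) = n = 3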
Eigenvectors:
• An eigenvector of A is a non-zero vector v in Rn such that Av = λv
• λ is the eigenvalue
• Find eigenvalues by solving the characteristic equation det(A − λI) = 0
• Find eigenvectors by substituting in the value of λ and solving (A − λI)x = 0
• An eigenvector v with eigenvalue λ for A is also an eigenvector for A^n, with eigenvalue λ^n
• Any non-zero vector in the kernel of a linear map is an eigenvector with eigenvalue 0
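
A NumPy sketch of these facts (A is a symmetric example matrix chosen for illustration):

import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

eigvals, eigvecs = np.linalg.eig(A)     # eigenvalues, and eigenvectors as the columns
lam, v = eigvals[0], eigvecs[:, 0]

print(np.allclose(A @ v, lam * v))                                # A v = λ v
print(np.allclose(np.linalg.matrix_power(A, 3) @ v, lam**3 * v))  # A^n v = λ^n v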
Algebraic Multiplicity:
• No. of times the eigenvalue λ appears as a root of the characteristic equation
Geometric Multiplicity:
• Largest no. of linearly independent eigenvectors corresponding to an eigenvalue λ
Diagonalisation:
• A is diagonalisable if there exists an invertible matrix P such that P^-1 A P is a diagonal matrix
• A matrix CANNOT be diagonalised if algebraic multiplicity ≠ geometric multiplicity for some eigenvalue
Using the diagonalised matrix of A to find a formula for A^n: if D = P^-1 A P, then A^n = P D^n P^-1
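
A NumPy sketch of this formula (reusing an illustrative 2 x 2 matrix; eig supplies P and the diagonal entries of D):

import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

eigvals, P = np.linalg.eig(A)                        # P has the eigenvectors as columns
n = 5
An = P @ np.diag(eigvals ** n) @ np.linalg.inv(P)    # A^n = P D^n P^-1

print(np.allclose(An, np.linalg.matrix_power(A, n)))  # True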
Hermitian/Self-Adjoint Matrix: A equals its conjugate transpose, A = A*
Symmetric Matrix: A = A^T
Properties of Hermitian Matrix A:
i) A is diagonalisable
ii) Each eigenvalue of A is a real number
iii) A has an orthonormal set of n eigenvectors
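
A NumPy sketch checking these properties on an example Hermitian matrix (the entries are arbitrary; numpy.linalg.eigh is the routine for Hermitian/symmetric matrices):

import numpy as np

A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])             # Hermitian: equal to its conjugate transpose

print(np.allclose(A, A.conj().T))

eigvals, eigvecs = np.linalg.eigh(A)    # eigenvalues come out real
print(eigvals)
print(np.allclose(eigvecs.conj().T @ eigvecs, np.eye(2)))   # columns form an orthonormal set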