
Eigenvalues and Eigenvectors in Matrix Algebra
Explore how to find eigenvalues and eigenvectors of a matrix, along with understanding characteristic equations, diagonal form reduction, and properties of eigenvalues. Learn the methods and significance of eigenvalues and eigenvectors in linear algebra.
CHAPTER 5: REDUCTION TO DIAGONAL FORM
EIGEN VALUES AND EIGEN VECTORS OF A MATRIX
An eigenvector of a matrix A is a nonzero vector X such that, when X is multiplied by A, the direction of the resulting vector remains the same as that of X. Mathematically, this is written as AX = λX, where A is a square matrix, the scalars λ are the eigenvalues, and X is the eigenvector corresponding to a given eigenvalue λ. Here AX is parallel to X, so X is an eigenvector.
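The defining relation AX = λX can be checked numerically. The matrix and vector below are a hypothetical 2×2 example (not taken from the slides), chosen so that X = (1, 1) is an eigenvector with eigenvalue 5:

```python
import numpy as np

# Hypothetical example matrix (not from the original slides).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# x = (1, 1) is an eigenvector of A with eigenvalue 5, since A x = 5 x.
x = np.array([1.0, 1.0])
Ax = A @ x

print(Ax)      # [5. 5.]
print(5 * x)   # [5. 5.]  -> A x is parallel to x, so x is an eigenvector
```

Because Ax is a scalar multiple of x, the direction is unchanged, which is exactly the eigenvector condition.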
Method to find the eigenvalues and eigenvectors of a square matrix A
We know that AX = λX, so
AX − λX = 0  ⇒  (A − λI)X = 0   …(1)
Equation (1) has a nonzero solution X only if (A − λI) is singular, that is,
|A − λI| = 0   …(2)
Equation (2) is known as the characteristic equation of the matrix. The roots of the characteristic equation are the eigenvalues of the matrix A.
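As a sketch of this method, NumPy can produce the characteristic-polynomial coefficients of a matrix (`np.poly`) and then find its roots, which are the eigenvalues. The matrix is the same hypothetical 2×2 example used above; its characteristic equation is λ² − 7λ + 10 = 0:

```python
import numpy as np

# Hypothetical example matrix (not from the original slides).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Coefficients of the characteristic polynomial det(A - λI):
# [1, -7, 10] corresponds to λ^2 - 7λ + 10.
coeffs = np.poly(A)

# The roots of the characteristic equation are the eigenvalues.
eigenvalues = np.sort(np.roots(coeffs))
print(eigenvalues)   # -> approximately [2. 5.]
```

This matches solving |A − λI| = 0 by hand: λ² − (trace)λ + det = λ² − 7λ + 10 = (λ − 5)(λ − 2).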
Now, to find the eigenvectors, we simply substitute each eigenvalue λ into (1) and solve by Gaussian elimination: convert the homogeneous system (A − λI)X = 0 to row echelon form and solve the resulting linear system.
Some important properties of eigenvalues
- Eigenvalues of real symmetric and Hermitian matrices are real.
- Eigenvalues of real skew-symmetric and skew-Hermitian matrices are either purely imaginary or zero.
- Eigenvalues of unitary and orthogonal matrices have unit modulus, |λ| = 1.
- If λ1, λ2, …, λn are the eigenvalues of A, then kλ1, kλ2, …, kλn are the eigenvalues of kA.
- If λ1, λ2, …, λn are the eigenvalues of A, then 1/λ1, 1/λ2, …, 1/λn are the eigenvalues of A⁻¹.
- If λ1, λ2, …, λn are the eigenvalues of A, then λ1^k, λ2^k, …, λn^k are the eigenvalues of A^k.
- Eigenvalues of A = eigenvalues of Aᵀ (transpose).
- Sum of eigenvalues = trace of A (sum of the diagonal elements of A).
- Product of eigenvalues = |A| (the determinant of A).
- Maximum number of distinct eigenvalues of A = size of A.
- If A and B are two matrices of the same order, then the eigenvalues of AB = the eigenvalues of BA.
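Several of the properties above can be verified numerically. The symmetric 3×3 matrix below is a hypothetical example (not from the slides); its eigenvalues are real, sum to the trace, and multiply to the determinant:

```python
import numpy as np

# Hypothetical real symmetric example matrix -> real eigenvalues.
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 3.0, 0.0],
              [1.0, 0.0, 2.0]])

lam = np.linalg.eigvals(A)

# Sum of eigenvalues = trace of A.
print(np.isclose(lam.sum(), np.trace(A)))              # True

# Product of eigenvalues = |A| (determinant).
print(np.isclose(lam.prod(), np.linalg.det(A)))        # True

# Eigenvalues of A = eigenvalues of A^T.
print(np.allclose(np.sort(np.linalg.eigvals(A.T)), np.sort(lam)))   # True

# Eigenvalues of kA = k times the eigenvalues of A (here k = 2).
print(np.allclose(np.sort(np.linalg.eigvals(2 * A)), np.sort(2 * lam)))  # True
```

Here the eigenvalues are {1, 3, 3}: their sum 7 equals the trace and their product 9 equals the determinant.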
CHARACTERISTIC EQUATION OF A MATRIX Since every linear operator is given by left multiplication by some square matrix, finding the eigenvalues and eigenvectors of a linear operator is equivalent to finding the eigenvalues and eigenvectors of the associated square matrix; this is the terminology that will be followed. Furthermore, since eigenvalues and eigenvectors make sense only for square matrices, throughout this section all matrices are assumed to be square. Given a square matrix A, the condition that characterizes an eigenvalue λ is the existence of a nonzero vector x such that Ax = λx; this equation can be rewritten as (A − λI)x = 0. This final form of the equation makes it clear that x is a nonzero solution of a square, homogeneous linear system, which exists exactly when det(A − λI) = 0.
REDUCTION TO DIAGONAL FORM Diagonalization of a matrix is defined as the process of reducing a matrix A to its diagonal form D. By the similarity transformation, if the matrix A is similar to D, then
D = P⁻¹AP
and the matrix A is reduced to the diagonal matrix D through another matrix P, where P is the modal matrix (the matrix whose columns are the eigenvectors of A).
In simpler words, it is the process of taking a square matrix and converting it into a special type of matrix called a diagonal matrix.
Steps involved:
Step 1: Initialize the diagonal matrix D as D = diag(λ1, λ2, λ3), where λ1, λ2, λ3 are the eigenvalues.
Step 2: Find the eigenvalues from the characteristic equation |A − λI| = 0, where A is the given 3×3 square matrix, I is the 3×3 identity matrix, and λ is an eigenvalue.
Step 3: Compute the corresponding eigenvectors from (A − λiI)Xi = 0, where λi is an eigenvalue and Xi is the corresponding eigenvector.
Step 4: Create the modal matrix P by filling the eigenvectors X1 up to Xi column-wise into P.
Step 5: Find P⁻¹ and then compute the diagonal matrix as D = P⁻¹AP.
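The steps above can be sketched with NumPy, which returns the eigenvalues and the modal matrix P (eigenvectors as columns) in one call. The matrix is again the hypothetical 2×2 example used earlier, not one from the slides:

```python
import numpy as np

# Hypothetical example matrix (not from the original slides).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Steps 2-3: eigenvalues and eigenvectors; columns of P are the eigenvectors,
# so P is the modal matrix.
lam, P = np.linalg.eig(A)

# Step 5: D = P^{-1} A P should be diagonal, with the eigenvalues
# on the diagonal (in the same order as in lam).
D = np.linalg.inv(P) @ A @ P

print(np.round(D, 10))
```

Note that `np.linalg.eig` handles Steps 2 and 3 together; in hand computation one would instead solve |A − λI| = 0 first and then each system (A − λiI)Xi = 0.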
CONGRUENCE OF MATRICES In mathematics, two square matrices A and B over a field are called congruent if there exists an invertible matrix P over the same field such that PᵀAP = B, where "T" denotes the matrix transpose. Matrix congruence is an equivalence relation. Matrix congruence arises when considering the effect of change of basis on the Gram matrix attached to a bilinear form or quadratic form on a finite-dimensional vector space: two matrices are congruent if and only if they represent the same bilinear form with respect to different bases. Note that Halmos defines congruence in terms of the conjugate transpose (with respect to a complex inner product space) rather than the transpose,[1] but this definition has not been adopted by most other authors.
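A minimal sketch of the change-of-basis interpretation, using hypothetical matrices A (the Gram matrix of a bilinear form) and P (an invertible change-of-basis matrix), neither taken from the slides:

```python
import numpy as np

# Hypothetical Gram matrix of a symmetric bilinear form in one basis.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Hypothetical invertible change-of-basis matrix P.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# B = P^T A P is congruent to A: it represents the same bilinear
# form with respect to the new basis.
B = P.T @ A @ P

# Check: evaluating the form on coordinate vectors u, v in the new basis
# agrees with evaluating A on the transformed vectors P u, P v.
u = np.array([1.0, 2.0])
v = np.array([3.0, 1.0])
print(np.isclose(u @ B @ v, (P @ u) @ A @ (P @ v)))   # True
```

Note that congruence (PᵀAP) is generally different from similarity (P⁻¹AP); they coincide only when P is orthogonal.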
REFLEXIVITY OF A RELATION MATRIX A relation R is reflexive if all the diagonal elements of its matrix are 1. A relation R is irreflexive if all the diagonal elements of its matrix are 0. A relation R is symmetric if the transpose of the relation matrix is equal to the original relation matrix, i.e. M_R = (M_R)ᵀ.
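These checks are direct to code. The 0/1 relation matrix below is a hypothetical example for a relation on a 3-element set:

```python
import numpy as np

# Hypothetical relation matrix M_R for a relation on a 3-element set.
M = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])

reflexive   = bool(np.all(np.diag(M) == 1))   # every diagonal entry is 1
irreflexive = bool(np.all(np.diag(M) == 0))   # every diagonal entry is 0
symmetric   = bool(np.array_equal(M, M.T))    # M_R equals its transpose

print(reflexive, irreflexive, symmetric)      # True False True
```

This particular M is reflexive and symmetric but, as with any reflexive relation on a nonempty set, not irreflexive.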