1) Why are the eigenvectors corresponding to different eigenvalues orthogonal to each other? 2) How does Cramer's rule come about? 3) What are the quadratic forms that arise from expressing x^T A x, where x = c1 v1 + c2 v2 and v1, v2 are eigenvectors? (I am looking for the meaning of these forms: what do they reveal about what is going on here?)
1) What I have so far: to determine an eigenvalue for Ax = λx, rewrite it as (A − λI)x = 0. Since x ≠ 0 for a non-trivial solution, some combination of the columns of (A − λI) must equal 0. For example, if x = (x1, x2), where x1 and x2 range over all real numbers, and A − λI has columns V1 and V2 (in R^2 in this case), then (A − λI)x = 0 becomes x1 V1 + x2 V2 = 0. This means V1 and V2 are linearly dependent (in the 2×2 case, multiples of each other), which in turn means det(A − λI) = 0. The values of λ that satisfy this equation are the eigenvalues λ1, λ2, ..., λn. Substituting a λ back into the matrix decreases its rank, so one or more of the rows and columns become dependent on the others, and we can extract an "eigenvector" that arises from that eigenvalue. These eigenvectors that we extract, corresponding to different eigenvalues, are orthogonal, and can therefore be used as a basis to rewrite the vector x as a linear combination of the eigenvectors.
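The two facts claimed above, that det(A − λI) = 0 at each eigenvalue and that substituting λ back in drops the rank of the matrix, can be checked numerically. A minimal numpy sketch, using a 2×2 matrix of my own choosing (not from the thread):

```python
import numpy as np

# Hypothetical 2x2 symmetric example (my own choice of numbers)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)

for lam in eigvals:
    # det(A - lambda*I) vanishes at each eigenvalue...
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-8
    # ...and A - lambda*I loses rank there, so a nonzero null vector
    # (the eigenvector) exists
    assert np.linalg.matrix_rank(A - lam * np.eye(2)) < 2
```

The rank drop is exactly what lets you "extract" the eigenvector: the null space of A − λI is nonzero precisely when λ is an eigenvalue.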
I should probably go through the process of getting this eigenvector after subbing in an eigenvalue, and show why the next eigenvalue must always produce a vector orthogonal to this one.
@Kainui @ikram002p @ganeshie8
Don't mind what I wrote... I'm just writing down everything I know to see if something tells me they must be orthogonal.
holy shiet
Hmm, wait... well, I can take the dot product and show that, for some arbitrary numbers and an arbitrary vector size, the dot product of the two vectors is in fact 0, but that isn't really convincing to me. Why did this happen?!
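The "why" here is symmetry: if A is symmetric and Av1 = λ1 v1, Av2 = λ2 v2, then λ1(v1·v2) = (Av1)·v2 = v1·(Av2) = λ2(v1·v2), so (λ1 − λ2)(v1·v2) = 0, and distinct eigenvalues force v1·v2 = 0. A quick numeric check of this, using a hypothetical symmetric matrix of my own:

```python
import numpy as np

# Hypothetical symmetric matrix (my own example, not from the thread);
# symmetry is the key assumption that forces orthogonality
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

eigvals, V = np.linalg.eig(A)
v1, v2 = V[:, 0], V[:, 1]

# lam1 (v1.v2) = (A v1).v2 = v1.(A v2) = lam2 (v1.v2),
# so distinct eigenvalues imply v1.v2 = 0 (up to floating point)
print(abs(np.dot(v1, v2)))
```

For a non-symmetric A the middle step (Av1)·v2 = v1·(Av2) fails, which is why the orthogonality is not a general fact about eigenvectors.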
@hartnn
\[\large Ax = \lambda x\] the multiplication of the vector x by the matrix A gives you back a vector in the same direction, scaled by λ
Maybe analyzing the situation in 2D gives some insight into what happens in general. Consider the equation below: \[\large \pmatrix{a_1&a_2\\b_1&b_2}\pmatrix{\cos \theta \\\sin\theta} = \lambda\pmatrix{\cos \theta\\\sin \theta}\]
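Following this 2D picture, you can scan unit vectors (cos θ, sin θ) and keep the angles where Ax stays parallel to x; those are exactly the eigendirections, and the length of Ax there is the eigenvalue. A sketch with a hypothetical symmetric matrix of my own:

```python
import numpy as np

# Hypothetical 2x2 symmetric matrix to scan (my own example)
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# x = (cos t, sin t) is an eigendirection exactly when Ax stays parallel
# to x, i.e. the 2D cross product x0*y1 - x1*y0 vanishes.
for deg in range(180):
    t = np.radians(deg)
    x = np.array([np.cos(t), np.sin(t)])
    y = A @ x
    if abs(x[0] * y[1] - x[1] * y[0]) < 1e-9:
        # since |x| = 1, y.x recovers the eigenvalue for this direction
        print(deg, round(y @ x, 6))
```

For this matrix the scan picks out 45° and 135°, which are 90° apart: the 2D way of seeing that the two eigendirections of a symmetric matrix are orthogonal.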
1) Not all eigenvectors are orthogonal. 2) Write out Cramer's rule as linear equations instead of matrices; it might make more sense. 3) Remember that when you build the matrix of a quadratic form, the matrix you create is symmetric. This means that when you diagonalize it, the result must also be symmetric. So clearly there is a relationship between being symmetric and the eigenvectors. Good luck ;)
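On point 3, this answers the original question about quadratic forms: with a symmetric A and orthonormal eigenvectors v1, v2, substituting x = c1 v1 + c2 v2 collapses x^T A x into λ1 c1² + λ2 c2², a pure sum of squares with no cross terms. A small numpy check, where the matrix and coefficients are my own hypothetical choices:

```python
import numpy as np

# Hypothetical quadratic form q(x) = 2*x1^2 + 2*x1*x2 + 2*x2^2,
# whose (symmetric) matrix is:
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is the symmetric-matrix solver: real eigenvalues, orthonormal V
eigvals, V = np.linalg.eigh(A)

c = np.array([0.7, -1.3])   # arbitrary coefficients c1, c2
x = V @ c                   # x = c1 v1 + c2 v2

q_direct = x @ A @ x        # evaluate x^T A x directly
q_diag = eigvals @ c**2     # lam1*c1^2 + lam2*c2^2, no cross terms
assert np.isclose(q_direct, q_diag)
```

Geometrically, the eigenvector basis rotates the coordinate axes onto the principal axes of the form, and the eigenvalues tell you how the form stretches along each axis; that is what these diagonalized forms "reveal."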