MIT 18.06 Linear Algebra, Spring 2010
OpenStudy (anonymous):

Why do only non-zero eigenvalues have eigenvectors in the image?

OpenStudy (datanewb):

I'm not sure what your question means. Which image? But to get the ball rolling, I will point out that a zero eigenvalue occurs only if the matrix is singular. In other words, when the eigenvalue lambda = 0, \[\left( A - \lambda I \right)x = Ax = 0\] has nonzero solutions. Conversely, if a matrix has ONLY non-zero eigenvalues, then that matrix is invertible.
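The link between a zero eigenvalue and singularity can be checked numerically. A short NumPy sketch (the specific 2x2 matrix is an illustrative choice):

```python
import numpy as np

# An illustrative singular 2x2 matrix (its two columns are identical).
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

det_A = np.linalg.det(A)
eigenvalues = np.linalg.eigvals(A)

# A singular matrix has determinant 0 and at least one zero eigenvalue.
print("det(A) =", det_A)
print("eigenvalues =", eigenvalues)
```

For this matrix the determinant is 0 and the eigenvalues are 0 and 2, matching the claim that singular matrices are exactly the ones with a zero eigenvalue.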

OpenStudy (fwizbang):

The image of the matrix is another term for the column space. Eigenvectors with zero eigenvalues lie in the null space of the matrix.

OpenStudy (datanewb):

Okay, thanks to fwizbang for defining the image of a matrix as its column space. I am now in a position to demonstrate thoroughly (I hope) why only non-zero eigenvalues have eigenvectors in the image (column space) of A. An eigenvector is a vector whose direction does not change when it is multiplied by the matrix A. That is, for a matrix A with eigenvalue = lambda and eigenvector = x: \[Ax = \lambda x\] then \[Ax - \lambda x = 0\] and \[(A - \lambda I) x = 0\] So, if the eigenvalue = 0, then the eigenvector, x, is in the nullspace. That is, \[Ax = 0\] On the other hand, if \[\lambda \neq 0\] then the eigenvector, x, is in the column space: dividing both sides of \[Ax = \lambda x\] by lambda gives \[x = A\left(\frac{x}{\lambda}\right)\] so x is A times some vector. And a vector b is of the form Ay for some y exactly when b is in the image (column space) of A.
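The key step above, that an eigenvector with a non-zero eigenvalue satisfies x = A(x/lambda) and hence lies in the column space, can be sketched in NumPy (the matrix is the same illustrative example used elsewhere in the thread):

```python
import numpy as np

# Illustrative matrix with eigenvalues 0 and 2.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])

vals, vecs = np.linalg.eig(A)

# Pick the eigenpair with the non-zero eigenvalue (lambda = 2 here).
i = int(np.argmax(np.abs(vals)))
lam, x = vals[i], vecs[:, i]

# Dividing Ax = lambda * x by lambda gives x = A @ (x / lambda),
# so x is A applied to some vector, i.e. x is in the column space of A.
in_image = np.allclose(A @ (x / lam), x)
print("x is in the column space of A:", in_image)
```

The same check fails for the eigenvector with eigenvalue 0, since there is no lambda to divide by: that eigenvector sits in the nullspace instead.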

OpenStudy (anonymous):

@datanewb what does eigenvector mean? Parallel or perpendicular vectors?

OpenStudy (datanewb):

An eigenvector, x, of a square matrix, A, is any vector that comes out parallel to itself when multiplied by the matrix A. In equation form, that looks like: \[Ax = \lambda x\] Take the very simple square matrix: \[ \left[\begin{matrix}1&1\\1&1\end{matrix}\right] \] It has two independent eigenvectors. One of them is in the column space (it comes out parallel to itself): \[x_{1} = \left[\begin{matrix}1\\1\end{matrix}\right] \\ \left[\begin{matrix}1&1\\1&1\end{matrix}\right] \left[\begin{matrix}1\\1\end{matrix}\right] = \left[\begin{matrix}2\\2\end{matrix}\right] \\ \left[\begin{matrix}2\\2\end{matrix}\right] \parallel \left[\begin{matrix}1\\1\end{matrix}\right] \] The other eigenvector is in the nullspace (for this specific example). Because this A is symmetric, its nullspace is perpendicular to its column space (in general the nullspace is perpendicular to the row space): \[ x_{2} = \left[\begin{matrix}1\\-1\end{matrix}\right] \\ \left[\begin{matrix}1&1\\1&1\end{matrix}\right] \left[\begin{matrix}1\\-1\end{matrix}\right] = \left[\begin{matrix}0\\0\end{matrix}\right] \]
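The worked 2x2 example above can be reproduced in a few lines of NumPy:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
x1 = np.array([1.0, 1.0])    # eigenvector with eigenvalue 2 (in the column space)
x2 = np.array([1.0, -1.0])   # eigenvector with eigenvalue 0 (in the nullspace)

print(A @ x1)   # A x1 = 2 * x1, parallel to x1
print(A @ x2)   # A x2 = 0, so x2 is in the nullspace
print(x1 @ x2)  # dot product 0: since A is symmetric, the two eigenvectors are perpendicular
```

Note that the perpendicularity of x1 and x2 here comes from A being symmetric; for a non-symmetric matrix, eigenvectors for different eigenvalues are independent but need not be orthogonal.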
