When you are solving a system of first-order equations represented by a 2x2 matrix A that has one eigenvalue of multiplicity 2, how do you write the general solution? Do you need a generalized eigenvector or not?
We may or may not need generalized eigenvectors. The algebraic multiplicity of the eigenvalue is 2, but we don't yet know the geometric multiplicity (the dimension of the eigenspace, i.e. \(\ker(A-\lambda I)\), the null space of \(A-\lambda I\), whatever you want to call it lol).
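As a quick sanity check (a sketch assuming NumPy; the matrix here is just an illustrative defective example, not the one from the question), the geometric multiplicity is the nullity of \(A-\lambda I\):

```python
import numpy as np

# Illustrative defective matrix: eigenvalue 2 with algebraic multiplicity 2
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0

# Geometric multiplicity = dim ker(A - lam*I) = n - rank(A - lam*I)
n = A.shape[0]
geom_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(geom_mult)  # 1 -> only one independent eigenvector, so A is defective
```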
For a 2x2 matrix it is easy to tell the geometric multiplicity, though. If the geometric multiplicity is 2, the matrix A has to be a multiple of the identity matrix. Example: \[\left[\begin{matrix}2 & 0 \\ 0 & 2\end{matrix}\right]\] has an eigenvalue of algebraic multiplicity 2, and \(A-2I\) is \[\left[\begin{matrix}0 & 0 \\ 0 & 0\end{matrix}\right]\] which sends every vector in R^2 to the zero vector, so the dimension of the eigenspace is 2.
Ok, so let me pose a question. The IVP we need to solve is \[x' = \left[\begin{matrix}-2 & 1 \\ -1 & -4\end{matrix}\right]x, \qquad x(0)=\left(\begin{matrix}1 \\ 1\end{matrix}\right).\]
For this one you will need a generalized eigenvector. The eigenvalue is -3 with algebraic multiplicity 2, but you can only find one eigenvector by looking at \(A+3I\).
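A quick numerical check of both claims (a sketch assuming NumPy; with a defective eigenvalue, `eigvals` will only return values *near* -3, so a tolerance is used):

```python
import numpy as np

A = np.array([[-2.0, 1.0],
              [-1.0, -4.0]])

# Characteristic polynomial: lam^2 + 6*lam + 9 = (lam + 3)^2
eigvals = np.linalg.eigvals(A)
print(eigvals)  # both approximately -3

# Only one independent eigenvector: rank(A + 3I) = 1, so the nullity is 1
rank = np.linalg.matrix_rank(A + 3 * np.eye(2))
print(2 - rank)  # 1
```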
Yep, I got \(\lambda=-3\) and the corresponding eigenvector (1,-1). Is the generalized eigenvector what you get when you solve \((A-\lambda I)x=v\) instead of \((A-\lambda I)x=0\)?
I'm getting \[\left(\begin{matrix}-1 \\ 1\end{matrix}\right)\] as the eigenvector. So to find the generalized eigenvector, you solve the equation \[(A+3I)x=\left(\begin{matrix}-1 \\ 1\end{matrix}\right).\]
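That singular-but-consistent system can be solved numerically (a sketch assuming NumPy; `lstsq` returns the minimum-norm solution, and any other generalized eigenvector differs from it by a multiple of \(v\)):

```python
import numpy as np

A = np.array([[-2.0, 1.0],
              [-1.0, -4.0]])
v = np.array([-1.0, 1.0])     # eigenvector for lam = -3
B = A + 3 * np.eye(2)         # (A + 3I), a singular matrix

# (A + 3I) w = v has infinitely many solutions; lstsq picks the
# minimum-norm one, which is a perfectly good generalized eigenvector.
w, *_ = np.linalg.lstsq(B, v, rcond=None)
print(np.allclose(B @ w, v))  # True -> w really solves (A + 3I) w = v
```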
yes you are correct :)
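To spell out the general solution (a sketch; here I take the eigenvector \(v=\left(\begin{matrix}-1 \\ 1\end{matrix}\right)\) and one particular generalized eigenvector \(w=\left(\begin{matrix}0 \\ -1\end{matrix}\right)\), since solutions of \((A+3I)w=v\) are only determined up to adding multiples of \(v\)):

\[x(t) = c_1 e^{-3t}v + c_2 e^{-3t}\left(t\,v + w\right)\]

Imposing \(x(0)=\left(\begin{matrix}1 \\ 1\end{matrix}\right)\) gives \(c_1 v + c_2 w = \left(\begin{matrix}1 \\ 1\end{matrix}\right)\), so \(c_1=-1\) and \(c_2=-2\), and the IVP solution is

\[x(t) = e^{-3t}\left(\begin{matrix}1+2t \\ 1-2t\end{matrix}\right).\]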
Medal for you! Our eigenvectors are the same, by the way. When you draw the phase portrait of this system, will all trajectories be attracted toward the eigenvector? And what does the generalized eigenvector actually represent?
Let me see if I can get a copy of what's in my book on here. Since the eigenvalue is negative and repeated with only one eigenvector, every trajectory heads into the origin, approaching it tangent to the eigenvector's line rather than along straight lines. One sec.
Oh, yes. It's an asymptotically stable improper (degenerate) node.
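One way to see that behaviour numerically (a sketch assuming SciPy is available; the closed-form solution used for comparison, \(x(t)=e^{-3t}\left(\begin{matrix}1+2t \\ 1-2t\end{matrix}\right)\), comes from the eigenvector/generalized-eigenvector construction with \(c_1=-1,\ c_2=-2\)):

```python
import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-2.0, 1.0],
              [-1.0, -4.0]])

# Integrate x' = A x from x(0) = (1, 1)
sol = solve_ivp(lambda t, x: A @ x, (0.0, 5.0), [1.0, 1.0],
                rtol=1e-10, atol=1e-12, dense_output=True)

# Compare against the closed-form solution x(t) = exp(-3t) * (1+2t, 1-2t)
t = 1.0
exact = np.exp(-3 * t) * np.array([1 + 2 * t, 1 - 2 * t])
print(np.allclose(sol.sol(t), exact, atol=1e-6))  # True

# Asymptotic stability: the trajectory has decayed to (nearly) the origin
print(np.allclose(sol.y[:, -1], 0.0, atol=1e-5))  # True
```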
In Jordan form, our case corresponds to the matrix \[\left[\begin{matrix}\lambda & 1 \\ 0 & \lambda \end{matrix}\right].\]
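That Jordan block can be confirmed symbolically (a sketch assuming SymPy; the columns of P are an eigenvector and a generalized eigenvector, up to scaling):

```python
import sympy as sp

A = sp.Matrix([[-2, 1],
               [-1, -4]])

# jordan_form returns P and J with A = P * J * P**-1
P, J = A.jordan_form()
print(J)  # a single 2x2 Jordan block for lambda = -3
print(sp.simplify(P * J * P.inv() - A))  # zero matrix
```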