Linear Algebra
OpenStudy (anonymous):

Suppose A is any square matrix such that A^2 + 2A + 3I = 0. Show that A is invertible.

OpenStudy (anonymous):

What do you know about invertible matrices? What condition has to be satisfied?

OpenStudy (anonymous):

Like, let A^-1 be the inverse matrix and I the identity matrix; then [A][A^-1] = I.

OpenStudy (anonymous):

And you row-reduce [A|I] to find I. I know how to invert a matrix, but I don't know how to make it satisfy the equation.

OpenStudy (anonymous):

**correction** row-reduce [A|I] to find A^-1 (above)

OpenStudy (anonymous):

Exactly. Now all we need to do is rearrange your equation and find the inverse (note that the constant term 2A is 2 times A, so when we factor out A we're left with 2I, not the scalar 2): \[A^2+2A+3I=0\]\[A^2+2A=-3I\]\[A(A+2I)=-3I\]\[A\left(-\frac{ 1 }{ 3 } (A+2I)\right) = I\] So we know our inverse is: \[A^{-1} = -\frac{ 1 }{ 3 } (A+2I)\] And since A commutes with A+2I, the same matrix works as a left inverse too, so A is invertible.
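(Editor's note, not from the thread: a quick numerical sanity check of the derivation above. The example matrix is our own choice: the companion matrix of p(x) = x^2 + 2x + 3, which satisfies its own characteristic polynomial by Cayley-Hamilton.)

```python
import numpy as np

# Companion matrix of p(x) = x^2 + 2x + 3; by Cayley-Hamilton it satisfies
# A^2 + 2A + 3I = 0. (Example matrix chosen for illustration, not from the thread.)
A = np.array([[0.0, -3.0],
              [1.0, -2.0]])
I = np.eye(2)

# Verify that A really satisfies the given polynomial identity.
assert np.allclose(A @ A + 2 * A + 3 * I, np.zeros((2, 2)))

# The inverse derived above: A^{-1} = -(1/3)(A + 2I).
A_inv = -(A + 2 * I) / 3

# Check it is a two-sided inverse.
assert np.allclose(A @ A_inv, I)
assert np.allclose(A_inv @ A, I)
```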

OpenStudy (anonymous):

Oh okay, thanks! For some reason I thought I had to use A = [ a b ; c d ] and then somehow solve that in terms of the given equation...

OpenStudy (anonymous):

To be honest, that's exactly how I started doing it, but I found something online suggesting an approach like this and it seemed a bit easier! You couldn't just prove this with an abcd matrix anyway, as it has to be true for all square matrices, not just 2x2 ones; if you were going down the abcd route, you would have to prove it for 2x2 and then somehow extend it to all nxn. You're most welcome though :)

OpenStudy (anonymous):

Oh yeah, that's true, didn't think about that! Thanks again!

OpenStudy (anonymous):

Any time buddy!

Can't find your answer? Make a FREE account and ask your own questions, OR help others and earn volunteer hours!

Join our real-time social learning platform and learn together with your friends!