How can I show this: \[(\textbf{AB})^{-1}=\textbf{B}^{-1} \textbf{A}^{-1}\]
I guess you can do this by showing that \[\mathbf{B}^{-1}\mathbf{A}^{-1}\] is a solution to \[(\mathbf{AB})\mathbf{X}=\mathbf{I}\] where \(\mathbf{I}\) is the identity matrix.
Is there another way, something more like this?
Sorry, man. We haven't taken up linear algebra yet.
\[ (B^{-1} A^{-1}) (A B)=B^{-1}( A^{-1} A) B=B^{-1}(I) B=B^{-1} B=I \]
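If it helps to see the identity hold on concrete numbers, here is a quick sanity check with NumPy. The matrix names `A` and `B` and the use of random Gaussian matrices are my own choices for illustration (a random Gaussian matrix is invertible with probability 1, but you could substitute any invertible matrices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Compare (AB)^{-1} with B^{-1} A^{-1} entrywise
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))  # True
```

This is only a numerical check, not a proof, but it can make the algebra above feel less mysterious.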
how come parentheses can be rearranged?
Associativity of matrix multiplication: \[(AB)C = A(BC)\] From this you can deduce that you can place the parentheses wherever you want in a product.
Hmmm, yeah, I guess. I am trying to show \[{(\textbf{ST})}^{-1}= \dots\]\[= \dots\]\[=(\textbf{T}^{-1}\textbf{S}^{-1})\] Can I do this without deducing from a form of the identity?
The inverse of a matrix \(A\) is a matrix \(B\) such that \[ A B = B A = I \]
I'm still not convinced.
What eliassaab said is correct. How comfortable are you with abstract algebra? If you think of the set of all invertible matrices under matrix multiplication as a group, you come to the same result.
Why can't I show that \[{(\textbf{ST})}^{-1}= \dots\] \[\dots=(\textbf{T}^{-1}\textbf{S}^{-1})\]
I think you want a proof that works with the objects inside the matrix, but the beauty of linear algebra is that at some point you stop worrying about what's inside and care instead about what properties the linear transformation has. A proof of the kind you are looking for probably exists, but why work that hard when the result is basically handed to you on a silver platter by group theory?
Ah, OK, so the above solutions work because there is only one possible inverse.
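Right, and for completeness, the uniqueness claim has a one-line proof using only associativity and the definition of an inverse: if \(B\) and \(C\) are both inverses of \(A\), then

\[ B = BI = B(AC) = (BA)C = IC = C \]

So any matrix satisfying the defining equation \(AB = BA = I\) must be the same matrix, which is why exhibiting \(B^{-1}A^{-1}\) as an inverse of \(AB\) is enough.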
(I have not studied group theory yet.)
Start from the definition: \[ (\textbf{ST}) {(\textbf{ST})}^{-1} = \textbf {I}\] Multiply both sides on the left by \(\textbf{S}^{-1}\): \[\textbf{T}{(\textbf{ST})}^{-1} = \textbf {S}^{-1}\] Multiply both sides on the left by \(\textbf{T}^{-1}\): \[{(\textbf{ST})}^{-1} = \textbf{T}^{-1}\textbf {S}^{-1}\]