Can someone prove that "if there exists C such that CA = I, then AC=I"?
That would imply that C is invertible.
AC = A*I*C = A*(A^(-1)*C^(-1))*C = (A*A^(-1))*(C^(-1)*C) = I*I = I
xyy: why does I = A^(-1)*C^(-1)? You cannot assume the existence of the matrix inverses in the first place.
And to the original question: I checked it in my linear algebra book, and it's too long for me to write here, but the proof exists, and it is not just simple arithmetic.
Tumas, you are right. If A is not a square matrix, I do not think the conclusion holds. For a simple counterexample, just try A = Q, where Q is a non-square matrix with orthonormal columns, and take C = Q transpose.
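Here is a quick numerical sketch of that counterexample, assuming NumPy (the specific 3x2 Q is just a made-up illustration):

    import numpy as np

    # Q is 3x2 with orthonormal columns, so it is not square
    Q = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
    C = Q.T

    print(C @ Q)  # 2x2 identity: C is a left inverse of Q
    print(Q @ C)  # 3x3 matrix with a zero row: not the identity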
xyy: Yeah, it's applicable to square matrices of the same size. (AB = I => BA = I)
Suppose CA = I ------ (1)
Suppose Ax = b ------ (2)
From (2):
Ax = b
=> (AC)Ax = (AC)b
=> A(CA)x = (AC)b
=> Ax = (AC)b ------ (3)
From (2) and (3): Ax = b = (AC)b.
Thus, I = AC.
Got the idea from Zareon at http://www.scienceforums.net/topic/21570-inverse-of-a-matrix/
Not sure if it's correct though.
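For what it's worth, a quick NumPy sketch of why the last step is the delicate one: with a non-square A the argument only yields (AC)b = b for b in the column space of A, not AC = I (the 3x2 matrix is an arbitrary full-column-rank example, and pinv is used just to exhibit a left inverse):

    import numpy as np

    # A tall 3x2 matrix with full column rank (an arbitrary example)
    A = np.array([[1.0, 2.0],
                  [0.0, 1.0],
                  [1.0, 0.0]])
    C = np.linalg.pinv(A)  # for full column rank, pinv is a left inverse

    x = np.array([3.0, -1.0])
    b = A @ x  # b lies in the column space of A

    print(np.allclose(C @ A, np.eye(2)))  # True: CA = I
    print(np.allclose(A @ C @ b, b))      # True: (AC)b = b for b = Ax
    print(np.allclose(A @ C, np.eye(3)))  # False: AC is not I here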
philip, I think the proof is flawed. The step b = (AC)b => I = AC: for one, if b = 0 the implication fails. The proof introduced the variable b in the Ax = b statement; I don't think one can infer anything about b here (one can't be sure it is not the zero vector).
Since CA = I, we know that A is an invertible matrix. Since A is invertible, Ax = b is solvable for every b. So we can just assume that b is non-zero.
philip, the point of the original statement is this (we are assuming a square matrix): if we know a left inverse exists, can we say that the left inverse is actually also a right inverse? "A is an invertible matrix" means "there is a left inverse and a right inverse and they are the same." "CA = I" doesn't say there exists a right inverse, let alone that the two are equal. Anyway, thanks for the reply. Good to see someone taking the same course at roughly the same time :p
Actually, the equivalence part comes automatically. If AC = I and BA = I, then
B(AC) = B*I = B and (BA)C = I*C = C.
By associativity, B(AC) = (BA)C = BAC, therefore B = C.
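If it helps, a quick NumPy sanity check of that identity, with the left and right inverses computed independently (the 2x2 matrix is an arbitrary invertible example):

    import numpy as np

    # An arbitrary invertible 2x2 example
    A = np.array([[2.0, 1.0],
                  [1.0, 1.0]])

    B = np.linalg.solve(A.T, np.eye(2)).T  # solves B @ A = I (a left inverse)
    C = np.linalg.solve(A, np.eye(2))      # solves A @ C = I (a right inverse)

    print(np.allclose(B, C))  # True: the left and right inverses coincide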
Did you just prove it? Looks correct. Anyway, let me try again.
If CA = I, then every row of I is a combination of the rows of A, so the rows of A span the whole vector space. So rref(A) = I
=> the columns of A are independent (still haven't really wrapped my head around how row independence implies column independence, neat fact)
=> the columns of A span the whole vector space
=> Ax = b is solvable for any b (since the columns of A span the whole space).
So let b be a non-zero vector and proceed with the proof I showed previously. Does this work?
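Here is a small NumPy illustration of that chain on an arbitrary invertible 3x3 example (matrix_rank standing in for the rref step, and a made-up b):

    import numpy as np

    # An arbitrary invertible 3x3 example
    A = np.array([[1.0, 2.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0]])

    print(np.linalg.matrix_rank(A))  # 3: rows span R^3, so columns do too

    b = np.array([1.0, -2.0, 0.5])
    x = np.linalg.solve(A, b)        # Ax = b is solvable for this (any) b
    print(np.allclose(A @ x, b))     # True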
"If CA = I, then the combination of the rows of A spans the whole vector space. So, rref(A) = I."
Why is that? I guess you are assuming C is a product of elementary row operations. Can an arbitrary matrix be considered a product of elementary row operations? What if a diagonal entry is zero?
To pcompassion: If CA = I, then each row of I is a combination of the rows of A, so the row space of I is a subspace of the row space of A. Since the row space of I is the whole vector space, the row space of A must be the whole vector space too.
Suppose C is m by n, A is n by m, and I is m by m. Then the row space of A is all of R^m, so A's rank is m and it has m independent columns and rows. If n = m, then A is square and rref(A) is I (m by m). If m < n, rref(A) is I (m by m) with zero rows below. m cannot be larger than n.
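A quick check of the m < n case with SymPy's rref (the 3x2 matrix is an arbitrary full-column-rank example):

    import sympy as sp

    # n = 3, m = 2: A is n by m with rank m (an arbitrary example)
    A = sp.Matrix([[1, 2],
                   [0, 1],
                   [1, 0]])

    R, pivots = A.rref()
    print(R)       # I (2x2) on top, a zero row below
    print(pivots)  # (0, 1): every column is a pivot column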
Thanks yadi. Since we are assuming a square matrix, combining your answer with philip's completes the proof.
Look at this: if AC = I, obviously A = C^(-1), so AC = CA = I. ELL (easy like life)
Closed. Look: C = A^(-1) and A = C^(-1), then AC = CA.
philip, where in the book or lecture does it say that linear independence of the rows implies independence of the columns? I saw it too, but can't find it.
Augusto.ACM, look at the previous discussion: the question asks, if there is a left inverse, would it also be a right inverse? You can't assume the existence of A^(-1).