OpenStudy (rsadhvika):

Prove If A has independent columns, then Ax=0 has only a trivial solution x=0

OpenStudy (zzr0ck3r):

It depends on the space; I will assume you mean \(n\) independent columns in some \(n\)-dimensional space. Suppose \(Ax=0\). Since \(A\) has full rank, it must have an inverse. Multiply both sides (on the left) by \(A^{-1}\).

OpenStudy (rsadhvika):

A could be rectangular. "Independent columns" only implies full column rank, right?

OpenStudy (zzr0ck3r):

full rank \(\iff\) invertible

OpenStudy (rsadhvika):

Full column rank need not imply full rank?

OpenStudy (zzr0ck3r):

what?

OpenStudy (zzr0ck3r):

again, the question, as it is stated, cannot be answered. So I made some assumptions.

OpenStudy (rsadhvika):

We can assume a general matrix: \(A\) is an \(m\times n\) matrix with \(n\) independent column vectors.

OpenStudy (zzr0ck3r):

if the only thing in the kernel is the 0 vector then the matrix has full column rank and thus full row rank

OpenStudy (zzr0ck3r):

check out the 'invertible matrix theorem'

OpenStudy (rsadhvika):

That is true only if A is a square matrix

OpenStudy (zzr0ck3r):

oh hahahah you are the person that posted the question...

OpenStudy (rsadhvika):

We cannot define a regular inverse for rectangular matrices. Since the question doesn't specifically state the type of matrix, I think it is reasonable to assume that the matrix could be rectangular.

OpenStudy (zzr0ck3r):

ok, sorry, forget what I said. I was making assumptions because it was not clear from your question.

OpenStudy (zzr0ck3r):

OK, suppose you have \(n\) linearly independent columns \(v_1,\dots,v_n\) and \(Ax=0\); then \(x_1v_1+x_2v_2+\cdots+x_nv_n=0\), correct?

OpenStudy (rsadhvika):

Example matrix equation \[\begin{bmatrix}1&2\\2&3\\1&1\end{bmatrix}\begin{pmatrix}x_1\\x_2\end{pmatrix} = \begin{pmatrix}0\\0\\0\end{pmatrix} \]
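(Not part of the proof, but a quick numeric sanity check.) A minimal NumPy sketch confirming that this example matrix has full column rank and a trivial null space:

```python
import numpy as np

# The 3x2 example matrix from the discussion (rectangular, independent columns).
A = np.array([[1.0, 2.0],
              [2.0, 3.0],
              [1.0, 1.0]])

# Full column rank: the rank equals the number of columns.
rank = np.linalg.matrix_rank(A)
print(rank)  # 2

# Nullity via the SVD: singular values that are (numerically) zero
# correspond to null-space directions; here there are none.
_, s, _ = np.linalg.svd(A)
nullity = A.shape[1] - int(np.sum(s > 1e-10))
print(nullity)  # 0, so Ax = 0 has only the trivial solution x = 0
```

Since the nullity is 0, the only vector \(x\) with \(Ax=0\) is \(x=0\), which is exactly the claim being proved.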

OpenStudy (zzr0ck3r):

yes yes, I am with you now. I didn't know you were the person who asked the question.

OpenStudy (zzr0ck3r):

are you with me above?

OpenStudy (rsadhvika):

Yeah... I think we need to show \(Ax = x_1v_1+x_2v_2+...+x_nv_n=0 \implies x_1=x_2=\cdots = x_n=0 \)

OpenStudy (zzr0ck3r):

How could you add things up to get 0 unless, at some point, you are subtracting some quantity from itself?

OpenStudy (zzr0ck3r):

hint, A+B+C+D=0 implies A=-B-C-D

OpenStudy (zzr0ck3r):

Since it is not the case that all of the \(v_i\) are 0....

OpenStudy (rsadhvika):

\(Ax = x_1v_1+x_2v_2+\cdots+x_nv_n=0\), so \(x_1v_1=-x_2v_2-\cdots-x_nv_n\)

OpenStudy (zzr0ck3r):

and for sure both sides are not 0 right?

OpenStudy (zzr0ck3r):

this shows they are not lin ind

OpenStudy (zzr0ck3r):

contradiction...

OpenStudy (rsadhvika):

None of the \(v_i\)'s are 0, but how do we know that the right-hand side is not 0?

OpenStudy (rsadhvika):

How do we know that some combination of linearly independent vectors doesn't evaluate to 0 ?

OpenStudy (zzr0ck3r):

well, we know the \(v_i\) are nonzero and there is some \(x_i\) that is not zero. So \(x_iv_i=-x_1v_1-x_2v_2-\cdots-x_{i-1}v_{i-1}-x_{i+1}v_{i+1}-\cdots-x_nv_n\)

OpenStudy (zzr0ck3r):

now, since the left is not zero, the right is not zero

OpenStudy (zzr0ck3r):

The \(v_i\) are not zero because they are lin ind, there is some \(x_i\) that is not zero because \(x\ne 0\).

OpenStudy (rsadhvika):

Oh oh I think I see what you're saying... one moment

OpenStudy (rsadhvika):

If the right-hand side evaluates to the 0 vector, then the coefficient \(x_i\) on the left-hand side must be the number 0. Beautiful! Thank you xD

OpenStudy (rsadhvika):

Something looks wrong; the right-hand side can evaluate to some other nonzero vector, right?

OpenStudy (rsadhvika):

\(A + B + C = 0 \implies A = -B - C\), e.g. \(A = 2\), \(B = -1\), \(C = -1\)

OpenStudy (zzr0ck3r):

the left side is nonzero, so the right side is not zero, so we have a nonzero multiple of \(v_i\) equal to a linear combination of the other vectors

OpenStudy (zzr0ck3r):

so the v_i are not lin ind as assumed.

OpenStudy (zzr0ck3r):

as 1-dimensional vectors, \(2, -1, -1\) are not lin ind

OpenStudy (zzr0ck3r):

what is the problem

OpenStudy (zzr0ck3r):

?

OpenStudy (rsadhvika):

Ok, that proves the contrapositive: if a nontrivial solution exists for \(Ax=0\), then the columns of \(A\) are not independent. Nice nice :)
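To illustrate that contrapositive numerically (a hypothetical example, with a third column deliberately chosen as the sum of the first two):

```python
import numpy as np

# Dependent columns: v3 = v1 + v2, so the columns are NOT independent.
B = np.array([[1.0, 2.0, 3.0],
              [2.0, 3.0, 5.0],
              [1.0, 1.0, 2.0]])

# That dependence gives a nontrivial solution of Bx = 0: v1 + v2 - v3 = 0.
x = np.array([1.0, 1.0, -1.0])
print(np.allclose(B @ x, 0))     # True: a nonzero x in the null space
print(np.linalg.matrix_rank(B))  # 2 < 3: full column rank fails
```

Exactly as the contrapositive states: a nontrivial null vector exists precisely because the columns are dependent.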

OpenStudy (zzr0ck3r):

correct

OpenStudy (holsteremission):

It seems to me that more work is being done here than necessary (though admittedly, I've only invested a little over two minutes looking at the comments above). To be brief: if the columns of \(\mathbf{A}_{m\times n}\) are all linearly independent, then the columns span an \(n\)-dimensional subspace of \(\mathbb{R}^m\) and \(\mathrm{rank}(\mathbf{A})=n\) (i.e. full column rank if \(m\neq n\), or simply full rank if \(m=n\)). The rank-nullity theorem (number of columns = rank + nullity) then tells you the dimension of the nullspace of \(\mathbf{A}\) must be \(0\), so the nullspace contains only the zero vector.

OpenStudy (holsteremission):

Alternatively, if you're not familiar with the result of that theorem, you can think of it this way: given \(\mathbf{A}_{m\times n}\) with \(n\) independent columns, you know that \(\mathbf{A}\) can be row reduced with exactly \(n\) pivots, i.e. \[\mathrm{rref}(\mathbf{A})=\begin{bmatrix}p_1&\cdots&\cdots&\cdots\\ 0&p_2&\cdots&\cdots\\ \vdots&\vdots&\ddots&\vdots\\ 0&0&\cdots&p_n\\ 0&0&\cdots&0\end{bmatrix}\]With \(n\) pivot variables, there is no room for any free variables, which means there are no "special" vectors in the nullspace.
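That pivot-counting argument can be checked mechanically. Below is a minimal hand-rolled Gauss-Jordan sketch (a helper written for illustration, not a library routine) applied to the rectangular example from earlier: a pivot appears in every column, so there are no free variables.

```python
import numpy as np

def rref(M, tol=1e-10):
    """Minimal reduced row echelon form via Gauss-Jordan elimination."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    pivot_cols = []
    r = 0
    for c in range(cols):
        if r >= rows:
            break
        # Pick the largest-magnitude pivot candidate at or below row r.
        p = r + int(np.argmax(np.abs(M[r:, c])))
        if abs(M[p, c]) < tol:
            continue                     # no pivot in this column
        M[[r, p]] = M[[p, r]]            # swap the pivot row into place
        M[r] = M[r] / M[r, c]            # normalize the pivot to 1
        for i in range(rows):
            if i != r:
                M[i] = M[i] - M[i, c] * M[r]  # eliminate column c elsewhere
        pivot_cols.append(c)
        r += 1
    return M, pivot_cols

A = np.array([[1, 2], [2, 3], [1, 1]])
R, pivots = rref(A)
print(pivots)  # [0, 1]: a pivot in every column, so no free variables
```

With a pivot in each of the \(n\) columns, no variable is free, so the nullspace has no "special" vectors and \(x=0\) is the only solution.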

OpenStudy (zzr0ck3r):

Sure, that works, but you use way more "stuff" than we did here. The above got a little wacky, but the following is the proof using definitions only. Suppose \(A\) is made of \(n\) independent columns \(v_1,\dots,v_n\) and suppose b.w.o.c. that \(x\ne 0\) and \(Ax=0\). Then \(\exists\ x_i\) s.t. \(x_i\ne 0\), and \(x_1v_1+x_2v_2+\cdots+x_nv_n=0\), so \(x_iv_i=-x_1v_1-x_2v_2-\cdots-x_{i-1}v_{i-1}-x_{i+1}v_{i+1}-\cdots-x_nv_n\). Since \(x_i\ne 0\) and \(v_i\ne 0\ \forall i\in[n]\), the left side is nonzero, and thus the right side is also nonzero. So \(v_i\) is a linear combination of the other vectors, a contradiction. This is basic work a 6th grader can understand, which I think is cleaner and easier than the proposed proof.

OpenStudy (holsteremission):

Fair enough, but the OP's use of "rank" suggests some familiarity with the rank-nullity theorem. Ultimately it's up to him/her to decide which approach is easier.

OpenStudy (zzr0ck3r):

obviously
