OpenStudy (anonymous):

Linear Algebra: Consider orthogonal unit vectors \(v_1, \dots, v_m\) in \(\mathbb{R}^n\). Show that they are necessarily linearly independent. The hint says to use the dot product in the proof, but I'm not really sure of an attack strategy here.

OpenStudy (anonymous):

The basic way to show vectors are linearly independent is to set a linear combination of them equal to 0 and show that the only coefficients that work are all 0. So \[c_1v_1 + c_2v_2 + \dots + c_mv_m = 0 \] and we aim to prove all the \(c_i\) are 0.

OpenStudy (anonymous):

From the above, we can take any one of \(\{v_1, v_2, \dots, v_m\}\), say \(v_i\), and dot it with both sides: \[v_i \cdot (c_1v_1 + c_2v_2 + \dots + c_mv_m) = v_i \cdot 0\]
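As a quick numerical sanity check of this step (not from the thread; the names `Q`, `c`, and `b` are just illustrative), we can build an orthonormal set with a QR factorization, form a linear combination, and verify that dotting with \(v_i\) picks out exactly \(c_i\):

```python
import numpy as np

# Build an orthonormal set v_1..v_m in R^n: the columns of Q from a
# QR factorization of a random matrix are orthonormal.
rng = np.random.default_rng(0)
n, m = 5, 3
Q, _ = np.linalg.qr(rng.standard_normal((n, m)))
v = [Q[:, i] for i in range(m)]

c = np.array([2.0, -1.0, 0.5])            # some coefficients
b = sum(ci * vi for ci, vi in zip(c, v))  # the linear combination

# Dotting v_i with the combination recovers c_i, since all cross
# terms v_i . v_j (i != j) vanish and v_i . v_i = 1.
recovered = np.array([vi @ b for vi in v])
print(np.allclose(recovered, c))  # True
```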

OpenStudy (anonymous):

The dot product is distributive, so we can dot \(v_i\) with each term separately, and on the right side we still have 0. Once we distribute \(v_i\) throughout, however, every term becomes 0 except the one containing the vector we dotted with: \[v_i \cdot (c_1v_1 + c_2v_2 + \dots + c_mv_m) = c_i(v_i \cdot v_i) = 0\]

OpenStudy (anonymous):

This happens because the vectors are pairwise orthogonal, and the dot product of two orthogonal vectors is 0, so every cross term \(c_j(v_i \cdot v_j)\) with \(j \neq i\) drops out. Since \(v_i\) is a unit vector, \(v_i \cdot v_i = 1 \neq 0\), so \(c_i\) must be 0. This can be repeated with each of the vectors, yielding that all the \(c_i\) are 0, and that's the proof.
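The whole argument can also be checked numerically (a sketch, not part of the thread): for orthonormal columns the Gram matrix \(Q^\top Q\) is the identity, so \(Qc = 0\) forces \(c = Q^\top Q c = Q^\top 0 = 0\), which is exactly linear independence.

```python
import numpy as np

# Orthonormal columns from a QR factorization of a random matrix.
rng = np.random.default_rng(1)
n, m = 6, 4
Q, _ = np.linalg.qr(rng.standard_normal((n, m)))

# Gram matrix: entry (i, j) is v_i . v_j, so it should be the identity.
gram = Q.T @ Q
print(np.allclose(gram, np.eye(m)))  # True: the set is orthonormal

# Full column rank means Q c = 0 has only the solution c = 0,
# i.e. the columns are linearly independent.
print(np.linalg.matrix_rank(Q) == m)  # True
```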
