OpenStudy (anonymous):

Linear Algebra - picture attached

OpenStudy (anonymous):

OpenStudy (anonymous):

what i know is that since it's invertible there are m independent rows as well as n independent columns, so each column must contain m components of a vector

OpenStudy (rational):

invertibility is a term valid only for square matrices so consider m=n

OpenStudy (anonymous):

correct

OpenStudy (rational):

to make it less confusing, lets rephrase the problem : If \(A\) is an \(m\times m\) matrix, show that \(A^{-1} \text{ exists} \implies Col(A) = \mathbb{R}^m\)

OpenStudy (anonymous):

okay. that looks good

OpenStudy (rational):

So basically you want to prove that if the inverse exists, the dimension of column space is \(m\)

OpenStudy (anonymous):

yep! ive interpreted it like that but i am not sure how to start it

OpenStudy (anonymous):

you're back!

OpenStudy (rational):

By definition, `dimension` is the number of vectors in a `basis`, and a `basis` is a full set of linearly independent vectors that spans the vector space. So we need to show that a `basis` of the column space contains \(m\) linearly independent vectors. Since \(A\) is invertible : \[Ax = 0 \iff x = A^{-1}0 = 0\] that precisely means all \(m\) column vectors are linearly independent. QED
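
The step above can be checked numerically. This is just an illustrative sketch (the matrix below is an arbitrary invertible example, not from the problem): for an invertible \(A\), the only solution of \(Ax = 0\) is \(x = 0\), so the \(m\) columns are linearly independent and the rank is \(m\).

```python
import numpy as np

# an arbitrary invertible 2x2 example (det = 5, nonzero)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

A_inv = np.linalg.inv(A)

# x = A^{-1} 0 = 0 is the unique solution of Ax = 0
x = A_inv @ np.zeros(2)
print(x)                          # the zero vector

# rank m  <=>  the m columns are independent and span R^m
rank = np.linalg.matrix_rank(A)
print(rank)                       # 2
```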

OpenStudy (anonymous):

so can we use that to answer b then?

OpenStudy (anonymous):

so for b that means Col(A) cannot equal R^m without A being invertible because that means it won't be a square matrix (or detA=0)

OpenStudy (rational):

\[Ax = 0 \iff x = A^{-1}0 = 0\] yes, we can use this to justify both parts

OpenStudy (rational):

\(Ax = 0 \iff x = 0\) is the definition of linearly independent column vectors

OpenStudy (anonymous):

and if we let A not be invertible then we cannot end up with that previous statement and so Col(A) cannot equal R^m

OpenStudy (rational):

one sec, im eating... typing with one hand

OpenStudy (anonymous):

haha no worries

OpenStudy (rational):

For part \(b\) : \(\large Col(A) = \mathbb{R}^m \implies \) the column vectors of \(A\) span the \(m\)-dimensional space and form a basis, and thus are `linearly independent`. So there will be \(m\) pivots when you row reduce, and consequently the inverse exists.
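
A quick numerical sketch of part (b), again with an arbitrary illustrative matrix: full rank (i.e. \(m\) pivots after row reduction) is exactly the condition under which the inverse exists.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])        # illustrative; det = -2, nonzero

# rank m  <=>  m pivots in the row-reduced form
rank = np.linalg.matrix_rank(A)
print(rank)                       # 2

# the inverse exists precisely because rank == m
A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))   # True
```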

OpenStudy (anonymous):

OpenStudy (rational):

go thru these related proofs when free : http://personalpages.manchester.ac.uk/student/joshua.dawes/notes/invertible-matrix.pdf

OpenStudy (anonymous):

ah nice!! thanks for that. could you help with this proof? this involves orthonormal bases

OpenStudy (anonymous):

i should be fine with that proof we did now

OpenStudy (rational):

need to review my notes, will reply in 30 minutes

OpenStudy (anonymous):

thanks man! youre such a great help!

OpenStudy (anonymous):

@rational

OpenStudy (rational):

\[AB = I \]

OpenStudy (rational):

that means \(B\) is the inverse of \(A\), right ?

OpenStudy (rational):

So we need to show : \[\large AA^{t} = I\]

OpenStudy (anonymous):

YEP!

OpenStudy (anonymous):

oh isn't there a proof in the link you gave me? Except it doesn't involve orthonormal basis

OpenStudy (anonymous):

nup, i was mistaken

OpenStudy (rational):

Say \(\large x_1, x_2, \cdots x_n\) are the orthonormal row vectors of \(A\). Since the vectors are perpendicular, \(\large x_i \cdot x_j = 0 \) if \(\large i \ne j\). Since the vectors are unit vectors, \(\large x_i \cdot x_j = 1 \) if \(\large i = j\).
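
The two dot-product identities can be checked with any concrete orthonormal pair; the vectors below are just an example in \(\mathbb{R}^2\).

```python
import numpy as np

# an orthonormal pair in R^2 (perpendicular, each of length 1)
x1 = np.array([1.0,  1.0]) / np.sqrt(2)
x2 = np.array([1.0, -1.0]) / np.sqrt(2)

print(np.dot(x1, x2))   # 0  (i != j: perpendicular)
print(np.dot(x1, x1))   # 1  (i == j: unit vector)
```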

OpenStudy (rational):

fine ?

OpenStudy (anonymous):

yep! i got that far but thats it

OpenStudy (rational):

next consider \(\large A A^{t}\)

OpenStudy (rational):

\(\large (AA^{t})_{ij} = x_i \cdot x_j \)

OpenStudy (rational):

when \(i=j\), the element equals 1; when \(i \ne j\), the element equals 0, giving you the identity matrix

OpenStudy (anonymous):

where i=j is along the diagonals?

OpenStudy (rational):

yes, \(\large AA^{t} = [x_i \cdot x_j] = I \)

OpenStudy (rational):

\[\large AA^{t} = I \] \[\large \implies A^{t} = A^{-1}\]
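
To see the conclusion concretely: a rotation matrix is a standard example with orthonormal rows, so \(AA^t = I\) and \(A^t\) is the inverse. (The angle below is arbitrary.)

```python
import numpy as np

t = 0.7                           # arbitrary angle
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])   # rows are orthonormal

print(np.allclose(A @ A.T, np.eye(2)))        # True: A A^t = I
print(np.allclose(A.T, np.linalg.inv(A)))     # True: A^t = A^{-1}
```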

OpenStudy (anonymous):

hmm. that seems quite short as a proof. i'm not sure if i'm making it more complex than it looks, but is what you've written sufficient as a proof?

OpenStudy (rational):

yes, you just need to state the definitions of orthogonal and normalized : 1) orthogonal \(\implies \) dot product of two distinct vectors = 0 \(\implies\) all elements other than the diagonal are 0 2) normalized \(\implies\) magnitude of each vector = 1 \(\implies\) dot product of a vector with itself = 1 \(\implies \) all diagonal elements = 1

OpenStudy (rational):

the arguments have to follow above logical order ^^

OpenStudy (rational):

as you can see the proof is just about stating the definition and showing that \(AA^t = I\)

OpenStudy (anonymous):

ah. i will keep reviewing what you have written again and again since its in my head! thanks so much for your help

OpenStudy (rational):

np :) remember that the dot product of the \(i\)th row of \(A\) and the \(j\)th column of \(B\) gives you the \(ij\)th element of the matrix \(AB\) : \[\large (AB)_{ij} = \sum_k A_{ik} B_{kj}\]

OpenStudy (anonymous):

and the ijth element is the diagonals of the matrix?

OpenStudy (rational):

i=j is the diagonal element

OpenStudy (anonymous):

oh yes. my bad!

OpenStudy (anonymous):

oh you're saying that when you dot A.B you just get all the elements in AB, ah

OpenStudy (rational):

exactly! to get the "row 1, column 2" element in AB : take the 1st row of A, take the 2nd column of B, and do the dot product

OpenStudy (rational):

and this dot product equals 0 when \(i \ne j\) because the vectors are orthogonal
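
The entry rule above can be sketched directly (illustrative matrices; note NumPy indices are 0-based, so "row 1, column 2" is index `[0, 1]`):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

# "row 1, column 2" of AB = (row 1 of A) . (column 2 of B)
entry = np.dot(A[0, :], B[:, 1])
print(entry)              # 22.0
print((A @ B)[0, 1])      # 22.0, the same element
```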

OpenStudy (anonymous):

AHH! that explained it alot better! i have got it now! yay
