Linear Algebra - picture attached
what i know is that since it's invertible there are m independent rows as well as n independent columns, so a column must contain m components of a vector
invertibility is a term valid only for square matrices so consider m=n
correct
to make it less confusing, lets rephrase the problem : If A is an mxm matrix, show that \(A^{-1} \text{ exists} \implies Col(A) = \mathbb{R}^m\)
okay. that looks good
So basically you want to prove that if the inverse exists, the dimension of column space is \(m\)
yep! ive interpreted it like that but i am not sure how to start it
you're back!
By definition, `dimension` is the number of vectors in a `basis`; a `basis` contains a full set of linearly independent vectors of a vector space. So we need to show that the `basis` of the column space contains \(m\) linearly independent vectors. Since \(A\) is invertible : \[Ax = 0 \iff x = A^{-1}0 = 0\] that precisely means all the \(m\) column vectors are linearly independent. QED
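A quick numerical sanity check of that step (a sketch using numpy; the matrix here is just an arbitrary invertible example, not from the problem):

```python
import numpy as np

# An arbitrary invertible 3x3 matrix (example choice)
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [1.0, 0.0, 1.0]])

# A is invertible: det(A) != 0
assert abs(np.linalg.det(A)) > 1e-12

# Ax = 0 forces x = 0, so the columns are linearly independent
# and Col(A) has full dimension m = 3
assert np.linalg.matrix_rank(A) == 3
```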
so can we use that to answer b then?
so for b that means Col(A) cannot equal R^m without A being invertible because that means it won't be a square matrix (or detA=0)
\[Ax = 0 \iff x = A^{-1}0 = 0\] yes we can use this to justify both parts
\(Ax = 0 \iff x = 0\) is the definition of linearly independent column vectors
and if we let A not be invertible then we cannot end up with that previous statement and so Col(A) cannot equal R^m
one sec, im eating... typing with one hand
haha no worries
For part \(b\) : \(\large Col(A) = R^m \implies \) the column vectors of \(A\) span \(m\) dimensional space and form a basis and thus are `linearly independent`. So there will be \(m\) pivots when you row reduce and consequently the inverse exists.
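To see the contrapositive of part b numerically, here is a sketch with a deliberately singular matrix (example choice: the third column is the sum of the first two):

```python
import numpy as np

# A singular 3x3 matrix: column 3 = column 1 + column 2
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

assert abs(np.linalg.det(A)) < 1e-12    # not invertible
assert np.linalg.matrix_rank(A) == 2    # Col(A) is only 2-dimensional, not R^3
```

With fewer than 3 pivots, the columns cannot span \(\mathbb{R}^3\), so \(Col(A) \ne \mathbb{R}^3\).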
go thru these related proofs when free : http://personalpages.manchester.ac.uk/student/joshua.dawes/notes/invertible-matrix.pdf
ah nice!! thanks for that. could you help with this proof? this involves orthonormal bases
i should be fine with that proof we did now
need to review my notes, will reply in 30 minutes
thanks man! youre such a great help!
@rational
\[AB = I \]
that means \(B\) is the inverse of \(A\), right ?
So we need to show : \[\large AA^{t} = I\]
YEP!
oh isn't there a proof in the link you gave me? Except it doesn't involve orthonormal bases
nope, i was mistaken
Say \(\large x_1, x_2, \cdots x_n\) are the orthonormal row vectors of \(A\). Since the vectors are perpendicular, \(\large x_i . x_j = 0 \) if \(\large i \ne j\). Since the vectors are unit vectors, \(\large x_i . x_j = 1 \) if \(\large i = j\).
fine ?
yep! i got that far but thats it
next consider \(\large A A^{t}\)
\(\large [ AA^{t}]_{ij} = x_i . x_j \)
when \(i=j\), the element equals 1; when \(i \ne j\), the element equals 0, giving you the identity matrix
where i=j is along the diagonals?
yes, \(\large AA^{t} = [x_i . x_j] = I \)
\[\large AA^{t} = I \] \[\large \implies A^{t} = A^{-1}\]
hmm. that seems quite short as a proof. i'm not sure if i'm making it more complex than it looks, but is what you've written sufficient as a proof?
yes, you just need to state the definitions of orthogonal and normal :
1) orthogonal \(\implies \) dot product of two distinct vectors = 0 \(\implies\) all elements other than the diagonal are 0
2) normalized \(\implies\) magnitude of each vector = 1 \(\implies\) dot product of a vector with itself = 1 \(\implies \) all diagonal elements = 1
the arguments have to follow above logical order ^^
as you can see the proof is just about stating the definition and showing that \(AA^t = I\)
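The whole argument can be checked numerically in a few lines (a sketch; the rotation matrix is just one convenient example of a matrix with orthonormal rows):

```python
import numpy as np

# A 2x2 rotation matrix: its rows (and columns) are orthonormal
theta = 0.3  # example angle; any value works
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# rows are orthonormal: row_i . row_j = 1 if i == j else 0, so A A^t = I
assert np.allclose(A @ A.T, np.eye(2))
# hence A^t is the inverse of A
assert np.allclose(A.T @ A, np.eye(2))
```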
ah. i will keep reviewing what you have written again and again until it's in my head! thanks so much for your help
np :) remember that the dot product of the \(i\)th row of \(A\) and the \(j\)th column of \(B\) gives you the \(ij\)th element of the matrix \(AB\) : \[\large (AB)_{ij} = \sum_k A_{ik} B_{kj}\]
and the ijth element is the diagonals of the matrix?
i=j is the diagonal element
oh yes. my bad!
oh you're saying that when you dot A.B you just get all the elements in AB, ah
exactly ! to get the "row 1, column 2" element in AB : take "1st" row of A take "2nd" column of B do dot product
and this dot product equals 0 when \(i \ne j\) because the vectors are orthogonal
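That recipe can be sketched in numpy (the two matrices here are arbitrary examples, just to show the indexing):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# "row 1, column 2" entry of AB (0-indexed: row 0, column 1)
# = dot product of the 1st row of A with the 2nd column of B
entry = np.dot(A[0, :], B[:, 1])   # 1*6 + 2*8 = 22
AB = A @ B
assert entry == AB[0, 1]
```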
AHH! that explained it alot better! i have got it now! yay