Linear algebra: Orthogonal transformation
My textbook has: Y = AX is said to be orthogonal if it transforms \(y_1^2 + y_2^2 + \cdots + y_n^2\) into \(x_1^2 + x_2^2 + \cdots + x_n^2\).
It proves the sufficient condition as below: given \(Y = AX\), then \(X'X = Y'Y = (AX)'(AX) = X'A'AX\), and concludes it is orthogonal only if \(A'A = I\). I am not able to get why they're equating \(X'X = Y'Y\). Appreciate any help...
Say X is a column vector; then \(X'X = x_1^2 + x_2^2 + \cdots + x_n^2\). Similarly, if Y is a column vector, then \(Y'Y = y_1^2 + y_2^2 + \cdots + y_n^2\). This follows from the dot product. (The same works if X and Y have multiple columns, in which case you have a matrix.)
So stating \(X'X = Y'Y\) at the start assumes that the sums of squares of the two sets (the x's and the y's) are equal, and it follows that for this to be true, A'A must be the identity matrix.
Note that \(X'X = Y'Y\) means the transformation preserves energy when going from one coordinate system (the x's) to the other (the y's). It's like rotating a single point about the origin: the rotation does not change the point's distance from the origin. That is what this transformation is accomplishing; A serves simply to rotate a system of x's, preserving their distances to the origin.
This is what sine and cosine do to a point on the unit circle: if \(A = \begin{pmatrix}\cos t & -\sin t\\ \sin t & \cos t\end{pmatrix}\), then AX is a rotation of X = (x, y) by t radians. Note that \(\cos^2 t + \sin^2 t = 1\), and each row is orthogonal to the other (take the dot product), as is each column.
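A quick numerical check of the rotation example above (a minimal sketch in numpy; the angle t = 0.7 and the vector X are arbitrary choices, not from the discussion):

```python
import numpy as np

t = 0.7  # arbitrary rotation angle in radians
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

# A'A = I: the rows and columns of A are orthonormal
print(np.allclose(A.T @ A, np.eye(2)))   # True

# Rotating X does not change its sum of squares (its squared length)
X = np.array([3.0, 4.0])
Y = A @ X
print(X @ X, Y @ Y)                      # both 25.0, up to rounding
```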
X is the column vector of unknowns, \(X = (x_1, x_2, x_3, \ldots, x_n)'\). Y is the column vector of function values, \(Y = (y_1, y_2, y_3, \ldots, y_n)'\). Y = AX, where A is the transformation matrix.
I'm still not getting why they're saying X'X = Y'Y. It's the same as saying \(y_1^2 + y_2^2 + \cdots = x_1^2 + x_2^2 + \cdots\), right?
An orthogonal transformation MAPS the sum of the \(y^2\) terms into the sum of the \(x^2\) terms; it doesn't mean each \(y^2\) equals the corresponding \(x^2\), right? Sorry, this is just a grey area for me.
Yes, it is. For example, if X = [1, 2, 3] (column vector) with X' = [1 2 3] (row vector), then \(X'X = 1^2 + 2^2 + 3^2\) (the dot product); similarly for Y. And you are right, it doesn't mean each \(y^2 = x^2\), just that the sums of squares are equal.
Starting with X'X = Y'Y is the same as saying \(y_1^2 + y_2^2 + \cdots = x_1^2 + x_2^2 + \cdots\), like you said. It states the assumption, and then you are led to the identity matrix as the condition for this to be true.
Oh I see, but is X'X = Y'Y the definition of an orthogonal transformation?
I kind of get everything after X'X = Y'Y, and how they got A'A = I; I'm just not getting why they equate X'X = Y'Y.
How is that related to the definition of an orthogonal transformation?
X'X = Y'Y is the assumption. That is, it's the given.
One sec please, I'll take a pic of my textbook and post it.
Any orthogonal transformation preserves energy (i.e. the sums of squares are equal).
Oh, so X'X = Y'Y is the starting point, is it?
yes, then you see what it takes to make this true
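Spelling that step out (a sketch of the standard argument; the textbook may word it differently):

```latex
\[
X'X = Y'Y = (AX)'(AX) = X'A'AX
\quad\Longrightarrow\quad
X'(A'A - I)X = 0 \ \text{ for every } X.
\]
% Since A'A - I is symmetric, taking X = e_i shows each diagonal entry
% is zero, and taking X = e_i + e_j then shows each off-diagonal entry
% is zero, so A'A = I.
```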
Nice, thank you, it makes sense now xD
new things are hard sometimes
Yeah, I've just started linear algebra a few weeks back... today I started with linear transformations, and I've been stuck on this for hours.
How do I get the concept of energy? Sums of squares being equal...
that looks like a cool thing to ponder over
Look at some popular transformations, like the discrete Fourier transform or wavelet transforms.
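For instance, here is a minimal numpy sketch of the energy-preservation idea for the discrete Fourier transform (norm="ortho" makes the transform exactly unitary; the random signal is just for illustration):

```python
import numpy as np

x = np.random.randn(64)            # an arbitrary real signal
X = np.fft.fft(x, norm="ortho")    # orthonormal (unitary) DFT

# Parseval: the energy (sum of squares) is the same in both domains
print(np.sum(x**2), np.sum(np.abs(X)**2))   # equal up to rounding
```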
Wooh! Not yet; I guess I need to wait till I get to those topics to understand this energy thing perfectly.
When transforms occur, for example on images (e.g. JPEG, GIF), there is often a loss of energy. Most of an image's energy is concentrated in the low-frequency content, so some of the high-frequency content (the fine details of the image) can be discarded; energy does not need to be preserved. Such transformations are called "lossy" in the image-processing world. This is what makes compression possible, so compression is a loss of energy: the sums of squares are not equal to those of the original image.
think of each pixel as a coordinate (x,y)
Oh, so we lose some pixels when we compress? Then it's not like the xy-plane transformation \(x' = x\cos\theta + y\sin\theta,\; y' = -x\sin\theta + y\cos\theta\)?
Think of energy as information. Time signals can be represented perfectly by transforming them into their frequency content, preserving all information.
Then we can't call it a linear transformation?
Lossy image compression is not a linear transformation.
The matrix operations are linear, but we might decide to throw out some rows or columns in the resulting matrix, to "compress" after the orthogonal operation.
I see... that makes me scared... so much to learn, it never gets completed :s
If you can throw out some rows, then you only save whatever is left over, saving space but keeping most of the "information".
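A toy numpy sketch of that throw-some-out idea (the DFT is used here just as a convenient orthogonal transform, and the keep-25% cutoff is an arbitrary choice):

```python
import numpy as np

x = np.random.randn(256)                 # an arbitrary signal
X = np.fft.fft(x, norm="ortho")          # orthonormal DFT

# "Compress": zero out the 75% smallest-magnitude coefficients
keep = np.abs(X) >= np.quantile(np.abs(X), 0.75)
x_lossy = np.fft.ifft(np.where(keep, X, 0), norm="ortho").real

# Energy is no longer preserved, but most of it survives
print(np.sum(x**2), np.sum(x_lossy**2))
```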
Do they teach these in engineering? Because in my math book they seem to stop after eigenvectors.
Yes, in some applications classes. But the linear algebra basics you're learning are all you need to understand these applications; it's not hard, just keep looking for applications. Linear algebra is not that broad a subject.
Note that eigenvectors (for distinct eigenvalues) are linearly independent, but not necessarily orthogonal.
That's comforting, thank you :) Ok... that's a different topic, it seems.
Gotta run, appreciate your help very much... may I tag you next time I get stuck?
It's maybe covered in graphics development, but I picked this up on my own after learning linear algebra. I have an MSEE, so I've picked up quite a lot by now.
ok, good luck
yes, you can tag
wow - my next topic is linear algebra - looks tricky to me!!
One more thing @rsadhvika: orthogonality is typically defined as \(\int y(x)\,z(x)\,dx = 0\). The energy terminology I was trying to recall was Parseval's identity - http://en.wikipedia.org/wiki/Parseval%27s_identity
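For reference, one common form of Parseval's identity (for Fourier series, as on the linked page; conventions vary):

```latex
% If c_n = \frac{1}{2\pi} \int_{-\pi}^{\pi} f(x) e^{-inx}\,dx are the
% Fourier coefficients of f, then
\[
\sum_{n=-\infty}^{\infty} |c_n|^2
  = \frac{1}{2\pi} \int_{-\pi}^{\pi} |f(x)|^2 \, dx .
\]
```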