Find a nonzero square matrix such that the row space is perpendicular to the column space.
Head... hurts... I haven't done this in years. Did you mean "orthogonal" when you said "perpendicular"? (I thought perpendicular was only for geometry.)
In other words, every row vector must be perpendicular to every column vector.
Yeah, my textbook uses the words "orthogonal" and "perpendicular" interchangeably.
Can you show how to solve this?
Yeah, I'm still trying. It shouldn't be hard, but I don't seem to be getting it yet...
If a row and a column are orthogonal, what is their dot product? Zero, right?
Right, looks like I need to solve a system of four unpleasant equations.
So \[A A^{T} = 0\]
Ahh, that looks nice. Shouldn't it be \(AA = 0\), though?
Yeah, maybe so. Like I said, it has been years.
Unless I'm missing something obvious, you can come up with a pretty trivial example, say \(\begin{bmatrix}1&0\\0&0\end{bmatrix}\).
The first row is not orthogonal to the first column in your matrix.
Right, but when is any nonzero vector orthogonal to itself?
Never
Right, so in \[\left[\begin{matrix}a & b \\ c & d\end{matrix}\right]\] that just means the first row \((a,b)\) can't be the same nonzero vector as the first column \((a,c)\), i.e. we need \(b \neq c\).
Nice! We can make \(b=1\) and the rest 0.
That works! Thank you both :)
We could also make \(c=1\) and everything else 0.
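A quick sanity check in plain Python (no libraries; the little `matmul` helper is just for this example) that the \(b=1\) matrix really squares to zero:

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# b = 1, everything else 0.
A = [[0, 1],
     [0, 0]]

# Every row of A is orthogonal to every column of A, so A*A = 0.
print(matmul(A, A))  # [[0, 0], [0, 0]]
```

The \(c=1\) version works the same way, since it's just the transpose of this one.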
Fun thing about matrix multiplication of two matrices: \[XY=Z\] is the same as taking the row vectors of X and dotting them with the column vectors of Y. :) Since you want all the row vectors of A to be orthogonal to all the column vectors of A, this is the same thing as: \[AA=I\] So \(A\) is necessarily its own inverse and a square root of the identity matrix! :D
So other than \(I\) and \(-I\), you could also have \(A\) be a permutation matrix that's its own inverse (one built from disjoint swaps). There are probably other possibilities too.
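On the \(AA = I\) reading, here's a quick plain-Python check that the 2x2 swap matrix is its own inverse (the `matmul` helper is just for the example):

```python
def matmul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# P swaps the two coordinates; swapping twice is the identity, so P*P = I.
P = [[0, 1],
     [1, 0]]

print(matmul(P, P))  # [[1, 0], [0, 1]]
```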
row · col = 0 should give the equation \(AA = 0\), right?
Oh, OK, yeah, good point. I was thinking the diagonal would end up with vectors dotted with themselves, and then yeah, it would have to equal 0.
In my mind I was thinking of \(A^\top A = I\), which is the more familiar thing for me, lol.
It does give an interesting equation, though: \(A^2 = 0\) shows how the square of some nonzero thing can be 0. Matrices are weird.
Scrolling up, it looks like someone already said this and I missed it completely, lol.
If you think of matrices as functions, it's not so crazy. For instance, consider this situation: I have a machine that takes two kinds of fruit, apples and bananas. In one crank of the lever, it destroys apples and turns bananas into apples. So in two cranks of the lever, all the fruit must be gone.
If I hadn't seen matrices like those, I would have concluded that \(A^2 = 0\) implies \(A=0\).
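The fruit machine can be sketched in a few lines of Python (an illustrative toy; the names are mine, not from the thread):

```python
# State: (apples, bananas). One crank destroys apples and turns
# bananas into apples, i.e. new_apples = old_bananas, new_bananas = 0.
# This is exactly the nilpotent matrix [[0, 1], [0, 0]] acting on the state.
def crank(apples, bananas):
    return (bananas, 0)

state = (3, 5)
state = crank(*state)   # (5, 0): bananas became apples, apples destroyed
state = crank(*state)   # (0, 0): after two cranks all the fruit is gone
print(state)
```

The machine itself is clearly not the "do nothing" map, yet applying it twice annihilates everything: a nonzero \(A\) with \(A^2 = 0\).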
By functions, you mean transformations?
Yeah, linear transformations are mappings: just functions that take you from a domain to some range. That's probably one of the most important perspectives on matrices, imo. And matrix multiplication is function composition; I think that's literally true.
Yeah, many proofs, like the associativity law, become trivial when you use matrices to represent linear transformations.
Actually, if you've seen ordinary differential equations before, you know how sometimes you end up with repeated roots, so you have to multiply one of the solutions by \(x\)? \[y''+2y'+y=0\]\[D^2y+2Dy+Iy=0\]\[(D+I)^2 y = 0\] Now you have something of the form \(A^2=0\).
So I guess I should finish by saying how it ends up. You can look at solutions to \[(D+I)y=0\] but we also have the possibility that this operation on \(y\) doesn't give 0, so we get some other function \(f\): \[(D+I)y=f\] We do still require \[(D+I)^2y = (D+I)f=0\] So this is where that extra "x" multiplying \(y\) comes in, because \(\frac{d}{dx} (ax) = a\) and \(\frac{d^2}{dx^2} (ax) = 0\)... It's really not so bad; I left a lot of gaps here just so it doesn't get bogged down, but it's the same thing going on.
Nice, so \(y'' = 0\) need not imply \(y' = 0\).
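A numeric sanity check of the repeated-root story (plain Python, just the `math` module): the "extra x" solution \(y = x e^{-x}\) really does satisfy \(y'' + 2y' + y = 0\). The closed-form derivatives in the comments are worked out by hand:

```python
import math

# For y = x*e^{-x}, differentiating by hand gives:
#   y'  = (1 - x) * e^{-x}
#   y'' = (x - 2) * e^{-x}
def residual(x):
    """Evaluate y'' + 2y' + y at x for y = x*exp(-x)."""
    e = math.exp(-x)
    y = x * e
    y1 = (1 - x) * e
    y2 = (x - 2) * e
    return y2 + 2 * y1 + y

# The residual vanishes (up to floating-point noise) at every test point.
print(max(abs(residual(x)) for x in [0.0, 0.5, 1.0, 2.0, 5.0]))
```

Algebraically the coefficients cancel exactly: \((x-2) + 2(1-x) + x = 0\), so only rounding noise survives.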