Prove H is a Hessenberg matrix.
Let A be an n×n matrix and x a vector. Consider the Krylov matrix \[K=\left[\begin{matrix}| & | & | & & | \\ x & Ax & A^2x & \cdots & A^{n-1}x\\ | & | & | & & | \end{matrix}\right]\] and assume that its columns are linearly independent. Let K = QR be the QR factorization of K. Prove that \[H = Q^TAQ\] is a Hessenberg matrix.
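Before attempting a proof, a quick numerical sanity check of the claim might help (a numpy sketch; the random A and x and the size n = 5 are illustrative assumptions, not part of the problem):

```python
import numpy as np

# Illustrative data: n = 5, random A and x (assumed, not from the problem)
rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

# Krylov matrix K = [x, Ax, A^2 x, ..., A^{n-1} x], built column by column
K = np.column_stack([np.linalg.matrix_power(A, j) @ x for j in range(n)])

Q, R = np.linalg.qr(K)  # K = QR, Q orthogonal, R upper triangular
H = Q.T @ A @ Q

# Upper Hessenberg means everything below the first subdiagonal is zero;
# numerically, those entries should sit at roundoff level
print(np.max(np.abs(np.tril(H, k=-2))))
```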
This is an interesting question but I don't know the answer. I suspect it has something to do with R being triangular. I was playing around with K, and it's interesting that multiplying K by A shifts the columns: \(AK = [Ax \;\; A^2x \;\; \cdots \;\; A^nx]\), so the first \(n-1\) columns of \(AK\) are exactly the last \(n-1\) columns of \(K\). Maybe in computing the QR factorization something useful happens between the powers of A and the successive Gram-Schmidt orthogonalization steps?
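That column-shift observation is easy to check (numpy sketch; the random A and x and n = 5 are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)

# K = [x, Ax, ..., A^{n-1} x]
K = np.column_stack([np.linalg.matrix_power(A, j) @ x for j in range(n)])

# AK = [Ax, A^2 x, ..., A^n x]: its first n-1 columns are K's last n-1 columns
AK = A @ K
print(np.allclose(AK[:, :n-1], K[:, 1:]))  # True: columns shifted left by one
```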
We have to show that the entries of H satisfy the Hessenberg pattern column by column: first that the first column has the Hessenberg layout, then the second, and so on. Can't figure it out though.
I have no idea either, but: \[R^{-1}HR=R^{-1}Q^TAQR=K^{-1}AK\] \[HR=RK^{-1}AK\] A Hessenberg matrix multiplied by an upper triangular matrix (on either side) is again a Hessenberg matrix, so it would be enough to show that \(K^{-1}AK\) is Hessenberg. And it is: the first \(n-1\) columns of \(AK\) are the last \(n-1\) columns of \(K\), so \(K^{-1}A^jx = e_{j+1}\) for \(j < n\) and \[K^{-1}AK = \left[\begin{matrix} e_2 & e_3 & \cdots & e_n & K^{-1}A^nx \end{matrix}\right],\] a companion matrix, which is upper Hessenberg. So this isn't a dead end after all :P
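The identity \(R^{-1}HR = K^{-1}AK\) can also be checked numerically (numpy sketch; random A and x and n = 5 are illustrative assumptions): \(K^{-1}AK\) should come out upper Hessenberg, and \(R^{-1}HR\) should match it up to roundoff.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
K = np.column_stack([np.linalg.matrix_power(A, j) @ x for j in range(n)])

Q, R = np.linalg.qr(K)
H = Q.T @ A @ Q

# C = K^{-1} A K, computed by solving the linear system K C = A K
C = np.linalg.solve(K, A @ K)

# C should be upper Hessenberg: entries below the first subdiagonal ~ 0
print(np.max(np.abs(np.tril(C, k=-2))))

# ...and R^{-1} H R (solve R X = H R) should equal C up to roundoff
print(np.max(np.abs(np.linalg.solve(R, H @ R) - C)))
```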