Give the projection of a vector onto a subspace. Question: https://imgur.com/Tj3dUiC. I'm totally confused about where to start with this; is anyone able to help me understand how to answer this question?
This link seems to have a few examples: http://www.cliffsnotes.com/math/algebra/linear-algebra/real-euclidean-vector-spaces/projection-onto-a-subspace
So to start: since X_1, X_2, and X_3 are pairwise orthogonal (check: X_1·X_2 = X_1·X_3 = X_2·X_3 = 0), you can project onto each basis vector separately and add the results: \[\begin{align*}\text{proj}_UX&=\sum_{i=1}^3\text{proj}_{X_i}X\\\\ &=\text{proj}_{X_1}X+\text{proj}_{X_2}X+\text{proj}_{X_3}X\\\\ &=\frac{X\cdot X_1}{X_1\cdot X_1}X_1+\frac{X\cdot X_2}{X_2\cdot X_2}X_2+\frac{X\cdot X_3}{X_3\cdot X_3}X_3 \end{align*}\] The dot products you need are \[X\cdot X_1=(3,1,0,42)\cdot(1,0,-1,-1)=-39,\qquad X_1\cdot X_1=3\\ X\cdot X_2=(3,1,0,42)\cdot(2,1,1,1)=49,\qquad X_2\cdot X_2=7\\ X\cdot X_3=(3,1,0,42)\cdot(-1,3,-1,0)=0\] So, \[\text{proj}_UX=-\frac{39}{3}X_1+\frac{49}{7}X_2=-13X_1+7X_2=\begin{bmatrix}1\\7\\20\\20\end{bmatrix}\]
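If it helps, here's a minimal NumPy sketch to sanity-check that arithmetic (the vectors just mirror the problem; using NumPy here is my own choice, not part of the question):

```python
import numpy as np

# Vectors from the problem: X is the vector being projected; X1, X2, X3
# span U and are pairwise orthogonal, which is what justifies projecting
# term by term.
X  = np.array([3, 1, 0, 42])
X1 = np.array([1, 0, -1, -1])
X2 = np.array([2, 1, 1, 1])
X3 = np.array([-1, 3, -1, 0])

# proj_U X = sum over i of ((X . Xi) / (Xi . Xi)) * Xi
proj = sum((X @ Xi) / (Xi @ Xi) * Xi for Xi in (X1, X2, X3))
print(proj)  # [ 1.  7. 20. 20.]
```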
You may also want the component of X orthogonal to U, which is given by \[X_{\perp U}=X-\text{proj}_UX\] and that's \[X_{\perp U}=\begin{bmatrix}3\\1\\0\\42\end{bmatrix}-\begin{bmatrix}1\\7\\20\\20\end{bmatrix}=\begin{bmatrix}2\\-6\\-20\\22\end{bmatrix}\] As a check, this vector is orthogonal to each of X_1, X_2, and X_3.
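Continuing the sketch above, the orthogonal component and that check look like this:

```python
# Component of X orthogonal to U.
X_perp = X - proj
print(X_perp)  # [  2.  -6. -20.  22.]

# Sanity check: X_perp must be orthogonal to every basis vector of U.
for Xi in (X1, X2, X3):
    assert abs(X_perp @ Xi) < 1e-9
```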
So, based on the best approximation theorem, where \[\text{proj}_UX = \frac{X\cdot X_1}{X_1\cdot X_1}X_1 + \frac{X\cdot X_2}{X_2\cdot X_2}X_2 + \frac{X\cdot X_3}{X_3\cdot X_3}X_3\] I got \[\text{proj}_UX = \left(\begin{matrix}1 \\ 7\\ 20\\ 20\end{matrix}\right)\] Does this seem correct?
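For what it's worth, here's an independent check in the same NumPy setup that doesn't rely on the basis being orthogonal, using least squares:

```python
# Stack the basis vectors as columns of A and solve A c ≈ X in the
# least-squares sense; A @ c is then the closest point to X in U.
A = np.column_stack([X1, X2, X3])
c, *_ = np.linalg.lstsq(A, X, rcond=None)
print(A @ c)  # [ 1.  7. 20. 20.]
```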
Here is the theorem
Also, I computed the error as \[\|X - \text{proj}_UX\| = \sqrt{924} = 2\sqrt{231} \approx 30.4\]
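That error is quick to confirm numerically too (again assuming the NumPy vectors from the sketch above):

```python
# The "error" in the best approximation theorem is the distance from X to U.
err = np.linalg.norm(X - proj)
print(err**2)  # 924.0, i.e. the error is sqrt(924) = 2*sqrt(231) ≈ 30.4
```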
Hmm, I'm afraid I couldn't say one way or the other. My memory of linear algebra is fading a lot faster than I'd anticipated. In any case, I don't think I've ever seen that theorem before. Sorry I can't help :/
It's alright, I looked it up and found a textbook that gave an example. Thanks anyway :)