determine whether the following are linearly independent or not: alpha = (1,-1,0); beta = (1,3,-1); gamma = (5,3,-1)
the first one is alpha =(1,-1,0)
Are these vectors or points?
alpha, beta, gamma usually denote angles.
vectors
Two vectors are linearly dependent iff one can be expressed as a scalar multiple of the other, for instance \[ \Large \vec{a}=c\vec{b}, \text{where } c \text{ is a scalar} \] gives two linearly dependent vectors.
You may also hear or read about redundant vectors, which means that one vector just doesn't carry any additional information.
do you know how to do elimination?
yes
If you get a row with all zeros, then the 3 vectors are dependent. If you get 3 pivots, they are independent.
but the vectors are linearly independent if all the scalars are forced to equal 0, and I don't get that.
i.e. let x, y, z be scalars; then by using methods (e.g. Cramer's rule, etc.) to solve, I must get everything as zero.
I believe you could set up a \( 3 \times 3 \) matrix and then check it with the determinant method; if you get determinant = 0, then they are linearly dependent.
but if I am not completely misled by my intuition, then these vectors are linearly independent anyway.
think of it as solving \[\left[\begin{matrix}1 & -1 & 0\\ 1 & 3 & -1 \\ 5 &3 & -1\end{matrix}\right]\left[\begin{matrix}a \\ b\\c\end{matrix}\right]=\left[\begin{matrix}0 \\ 0\\0\end{matrix}\right]\]
@Spacelimbus if my det = 0 my scalars will be undefined, because anything divided by zero is undefined
elimination is the easiest way to solve, and if you get 3 pivots, then c must be 0, and this will force b to be 0, and then a to be 0 (when you backsolve). so 3 pivots means 3 independent vectors.
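If it helps to check by machine, here is a small sketch of that elimination-and-count-pivots idea in plain Python (the function name `count_pivots` and the use of exact fractions are just my choices for the illustration):

```python
from fractions import Fraction

def count_pivots(rows):
    """Gaussian elimination with exact fractions; returns the number of pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = 0
    col = 0
    for r in range(len(m)):
        # find a row at or below r with a nonzero entry in the current column
        while col < len(m[0]):
            pivot_row = next((i for i in range(r, len(m)) if m[i][col] != 0), None)
            if pivot_row is not None:
                break
            col += 1
        else:
            break
        m[r], m[pivot_row] = m[pivot_row], m[r]
        # eliminate entries below the pivot
        for i in range(r + 1, len(m)):
            factor = m[i][col] / m[r][col]
            m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        pivots += 1
        col += 1
    return pivots

# vectors exactly as given in the question: 3 pivots -> independent
print(count_pivots([[1, -1, 0], [1, 3, -1], [5, 3, -1]]))   # 3

# with the corrected gamma = (5,3,-2): only 2 pivots -> dependent
print(count_pivots([[1, -1, 0], [1, 3, -1], [5, 3, -2]]))   # 2
```

So the same routine distinguishes the question as typed from the textbook's intended version.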
I maybe shouldn't have said determinant method, the determinant isn't what you divide by. \[\left(\begin{matrix}2 \\ 3\end{matrix}\right) \text{and} \left(\begin{matrix}4 \\ 6\end{matrix}\right)\] are linearly dependent because their determinant is 0.
\[\det \left[\begin{matrix}2 & 4 \\ 3 & 6\end{matrix}\right]=0\]
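The determinant test can also be checked with a couple of throwaway functions (a sketch only; `det2`/`det3` are hypothetical names, with `det3` doing cofactor expansion along the first row):

```python
def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def det3(m):
    """Determinant of a 3x3 matrix via cofactor expansion along the first row."""
    return (m[0][0] * det2([[m[1][1], m[1][2]], [m[2][1], m[2][2]]])
          - m[0][1] * det2([[m[1][0], m[1][2]], [m[2][0], m[2][2]]])
          + m[0][2] * det2([[m[1][0], m[1][1]], [m[2][0], m[2][1]]]))

print(det2([[2, 4], [3, 6]]))                       # 0  -> dependent
print(det3([[1, -1, 0], [1, 3, -1], [5, 3, -1]]))   # 4  -> nonzero, independent
print(det3([[1, -1, 0], [1, 3, -1], [5, 3, -2]]))   # 0  -> dependent
```

Determinant zero means dependent; nonzero means independent, matching the 2x2 example above.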
yes, in the textbook they are saying it is linearly dependent, since we can find infinitely many solutions. I don't know how they got this answer.
@Spacelimbus are you using my points or yours?
they would be dependent if gamma were (5,3,-2)
linearly dependent? \[1\left[\begin{matrix}3 & -1 \\ 3 & -1\end{matrix}\right]+1\left[\begin{matrix}1 & -1 \\ 5 & -1\end{matrix}\right]+0\left[\begin{matrix}1 & 3 \\ 5 & 3\end{matrix}\right]\]
not sure yet if I am missing something.
they are saying one such solution is 3(alpha)+2(beta)-gamma=(0,0,0)
that means gamma must be (5,3,-2), so maybe a typo in the question, as you start with gamma = (5,3,-1)
yes @phi, then my method would work too. \[1\left[\begin{matrix}3 & -1 \\ 3 & -1\end{matrix}\right]+1\left[\begin{matrix}1 & -1 \\ 5 & -2\end{matrix}\right]+0\left[\begin{matrix}1 & 3 \\ 5 & 3\end{matrix}\right]\]
determinant is zero, therefore linearly dependent.
\[ 1\det\left[\begin{matrix}3 & -1 \\ 3 & -2\end{matrix}\right]+1\det\left[\begin{matrix}1 & -1 \\ 5 & -2\end{matrix}\right]+0\det\left[\begin{matrix}1 & 3 \\ 5 & 3\end{matrix}\right]\] Correction*
@Spacelimbus what if my det is not zero but my scalars are all zero, what will happen?
they are saying one such solution is 3(alpha)+2(beta)-gamma=(0,0,0) if we arrange the vectors as columns in a matrix, \[\left[\begin{matrix}1 & 1 & 5 & | & 0\\ -1 &3 & 3&|&0\\0& -1&-2&|&0\end{matrix}\right]\] now do elimination to reduced row echelon form
a vector times a scalar, when the scalar is zero, will also equal zero. But I don't see why I would want to include scalar identities for that; if I read that correctly, all we want to check is whether or not the given vectors are linearly dependent, for the method via a matrix.
I have to disappear for a moment, but @REMAINDER, did you check already if the given points above in your question are correct?
@Spacelimbus ok
we would get \[\left[\begin{matrix}1 & 0 & 3 & | & 0\\ 0 &1 & 2&|&0\\0& 0&0&|&0\end{matrix}\right]\] first we see there are 2 pivots (not 3) so linearly dependent second the 3rd column tells us that 3 * alpha (the 1st vector) + 2* beta will equal the 3rd vector gamma
yes, I also got that. what do you mean 3*alpha + 2*beta will equal the 3rd vector gamma?
the third column of the rref matrix (3,2,0) tells us that 3*alpha + 2*beta= gamma
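That relation is easy to verify directly; here is a one-liner sketch (using the corrected gamma = (5,3,-2) from the textbook answer):

```python
alpha = (1, -1, 0)
beta = (1, 3, -1)
gamma = (5, 3, -2)   # the corrected gamma implied by the textbook's answer

# the rref column (3, 2, 0) says 3*alpha + 2*beta should reproduce gamma
combo = tuple(3 * a + 2 * b for a, b in zip(alpha, beta))
print(combo)             # (5, 3, -2)
print(combo == gamma)    # True
```

Equivalently, 3*alpha + 2*beta - gamma = (0,0,0), the non-trivial relation the textbook states.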
@beketso check this
@Spacelimbus what about this one \[\left\{ -1+x-x ^{2};2+x+x^{2};1-x+x^{2};1+x\right\}\]
Interesting notation, are these supposed to be functions in four dimensions? first one would be the \(i\) component, second one the \(j\) component, third one the \(k\) and last one the \(m\) component?
yes
I would say linearly dependent, because the \(i\) component added to the \(k\) component gives you \[ 0+ 0 + 0 \]
It looks like they are asking if these polynomials are independent, and as Space points out, the 3rd polynomial is -1 * the first polynomial, so this set is dependent. we can still use elimination to show the other 3 are independent.
otherwise, I would maybe consider using the Wronskian determinant.
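For the polynomial set, one way to check this by hand or by machine is to write each polynomial as a coefficient vector in the basis {1, x, x^2} (my encoding; the names p1..p4 are just labels). Four vectors in a 3-dimensional space are always dependent, and p1 + p3 = 0 exhibits the relation; a nonzero 3x3 determinant then shows the other three are independent:

```python
# coefficient rows in the basis {1, x, x^2}
p1 = (-1, 1, -1)   # -1 + x - x^2
p2 = (2, 1, 1)     #  2 + x + x^2
p3 = (1, -1, 1)    #  1 - x + x^2
p4 = (1, 1, 0)     #  1 + x

# p1 + p3 is the zero polynomial: a non-trivial relation, so the set is dependent
print(tuple(a + b for a, b in zip(p1, p3)))   # (0, 0, 0)

def det3(m):
    """3x3 determinant by cofactor expansion along the first row."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# dropping p3, the remaining three have a nonzero determinant -> independent
print(det3([p1, p2, p4]))   # 1
```

This matches the discussion above: the 3rd polynomial is -1 times the first, and the remaining three are independent.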
oh by the way @phi, in the first question, what if the last row had a pivot of 1? what am I going to say in my answer?
If you get 3 pivots (using the original gamma = (5,3,-1), that would happen), then the 3 vectors are independent. If I were answering this, I would answer it both ways (with the correct gamma), and ask for extra credit!
@experimentX check this
huh ... where are we?
try to determine the rank of that matrix, or try to reduce it to reduced row echelon form ... try plugging it into wolf
let me try another thing
wolf gives me the inverse
of what? these vectors are linearly independent
of the original matrix
yep ...
the answer from the textbook is that it is linearly dependent
well ... there must be some typo ... check this ... did I make any error in the input? http://www.wolframalpha.com/input/?i=Determinant%5B%7B%7B1%2C-1%2C0%7D%2C%7B1%2C3%2C-1%7D%2C%7B5%2C3%2C-1%7D%7D%5D
@beketso check this
@REMAINDER it is clear, based on the textbook answer, that the question as posed has a typo. They give gamma as (5,3,-1) and they meant (5,3,-2) (math books often have typos, so this is not surprising)
@phi I agree with you, it's just that wolf confused me
@REMAINDER, remember, vectors are linearly dependent if they have a non-trivial linear relation. based on that, I say your given vectors are linearly dependent.
Here is wolf's answer using the correct gamma http://www.wolframalpha.com/input/?i=rref%5Btranspose%7B%7B1%2C-1%2C0%7D%2C%7B1%2C3%2C-1%7D%2C%7B5%2C3%2C-2%7D%7D%5D
and its determinant is 0 (indicating the matrix is non-invertible, and so has linearly dependent columns (or rows))
so @phi, frankly, that means if the rows/columns are linearly dependent then the vectors are also linearly dependent?