Consider a row matrix [1 4 5]. Multiply the row by 2 to form a second row: [1 4 5] [2 8 10] The rows are multiples of each other. Now look at the columns: they are also multiples of each other. What is an intuitive/physical way to digest this?
row rank is column rank :)
Exactly! These are rank-1 matrices. So here is the fact: If the rows of a matrix are multiples of each other, then the columns will also be multiples of each other. I'm looking for examples from the real world, if possible, to better understand this result...
I've played with it some: [a b c] [ka kb kc] The second column can be obtained by multiplying the first column by b/a (assuming a is nonzero). Similarly, the third column can be obtained by multiplying the first column by c/a. But that is just algebra. I think I need a physical example to digest this fully...
In your last matrix @ganeshie8, you can divide the second column by the first column: row 1: b/a row 2: kb/ka = b/a where k is nonzero, so the two ratios are equal
The same applies to the first and third columns: row 1: c/a row 2: kc/ka = c/a
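A quick numerical check of these ratios (the values a, b, c, k below are arbitrary samples, not from any particular problem):

```python
import numpy as np

# A rank-1 matrix built by scaling the first row: a=1, b=4, c=5, k=2
# are just illustrative values.
a, b, c, k = 1.0, 4.0, 5.0, 2.0
A = np.array([[a,   b,   c],
              [k*a, k*b, k*c]])

# Ratio of column 2 to column 1 is the same in every row: b/a.
print(A[:, 1] / A[:, 0])   # [4. 4.]
# Likewise column 3 to column 1: c/a.
print(A[:, 2] / A[:, 0])   # [5. 5.]
```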
Yes, really nice... I'm trying to cook up an example to present this fact that looks less abstract and more intuitive...
That shows column vectors can be obtained by scaling. All the column vectors point in the same direction
But why is that related to the row vectors ? I'm really looking for a real world example to show this relationship between row vectors and column vectors
Also, I found this last week when I was wondering the same thing. No great answers as far as what I was looking for, but maybe for you... http://math.stackexchange.com/questions/332908/looking-for-an-intuitive-explanation-why-the-row-rank-is-equal-to-the-column-ran
I have a firm grip on what column vectors do, and their space, and rank and .... but not at all the same for rows in a matrix.
Very interesting replies in that thread, thanks @zzr0ck3r. I think incidence matrices are the best way to kill both row vectors and column vectors in one problem. Rows represent edges and columns represent nodes. In an electrical circuit graph, Ax can represent the potential differences and A^T y can represent the currents entering a node. I think you know more about graphs than me...
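Here's a small sketch of that incidence-matrix idea (the 3-node circuit, potentials, and currents below are all made-up sample values, just to show the two roles of A):

```python
import numpy as np

# Hypothetical 3-node, 3-edge circuit: edges 1->2, 2->3, 1->3.
# Rows are edges, columns are nodes; -1 at the tail, +1 at the head.
A = np.array([[-1,  1,  0],   # edge 1: node1 -> node2
              [ 0, -1,  1],   # edge 2: node2 -> node3
              [-1,  0,  1]])  # edge 3: node1 -> node3

x = np.array([5.0, 3.0, 1.0])   # node potentials (sample values)
print(A @ x)                    # potential difference across each edge

y = np.array([2.0, 2.0, 1.0])   # edge currents (sample values)
print(A.T @ y)                  # net current into each node
```

Note that the entries of A^T y sum to zero: every current leaving one node enters another, which is Kirchhoff's current law showing up in the column picture.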
This isn't hard to understand; you do this all the time when you solve simultaneous equations. In real life, you can think about the total cost of items, say apples, oranges and bananas. If k1, k2 and k3 represent the cost of each item respectively, the total cost is just C = k1A + k2O + k3B, where A, O and B are the numbers of apples, oranges and bananas. Now let's say we pay n times as much as we did. The new cost is nC = n(k1A + k2O + k3B) = k1nA + k2nO + k3nB. This is the same as saying we are buying n times more of each individual item to make n times the cost.
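That identity is easy to check with numbers (the prices and quantities below are arbitrary samples):

```python
# Cost of a basket: C = k1*A + k2*O + k3*B (prices times quantities).
k1, k2, k3 = 0.5, 0.75, 0.25        # sample prices per apple/orange/banana
A_qty, O_qty, B_qty = 4, 2, 6       # sample quantities
n = 3                               # scale factor

C = k1*A_qty + k2*O_qty + k3*B_qty
# Paying n times as much equals buying n times as many of each item:
scaled = k1*(n*A_qty) + k2*(n*O_qty) + k3*(n*B_qty)
print(C, n*C, scaled)
```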
Brilliant! But my problem is slightly different. In your example: Cost per item [k1 k2 k3] Number of items purchased [A] [O] [B] Multiplying them together gives the overall cost. This is a dot product. Scaling either the cost vector or the quantity vector scales the resulting dot product accordingly. This doesn't explain how the dependence of row vectors implies the dependence of column vectors.
My specific question is this: [k1 k2 k3] Add another row by multiplying the first row by n: [k1 k2 k3] [nk1 nk2 nk3] Here, by construction, the rows are multiples of each other. Looking at the columns, we see that the columns are also multiples of each other! Why? This is my specific question. I've figured out the answer after doing a little algebra. But I'm still looking for a real example, like yours, that highlights this particular fact. Your example is about dot products. I couldn't relate it directly to my problem...
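For what it's worth, here is that exact construction checked numerically (the row [k1 k2 k3] and the factor n are arbitrary samples): the second row is built as a multiple of the first, and the column ratios and the rank both come out as the fact predicts.

```python
import numpy as np

k = np.array([2.0, 6.0, 10.0])     # first row [k1 k2 k3], sample values
n = 3.0                            # row scale factor
M = np.vstack([k, n*k])            # second row is n times the first

# Rows are multiples by construction; the columns are too:
for j in (1, 2):
    print(M[:, j] / M[:, 0])       # both entries equal k[j]/k[0]

print(np.linalg.matrix_rank(M))    # 1: one independent row AND one independent column
```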
Perhaps thinking about transposes may help somehow
Yeah, often you use graph theory to solve linear algebra problems and linear algebra to solve graph theory problems.
I think of matrices as functions; they're linear maps. If you have a 2x3 matrix mapping 3D space onto a 1D subspace of 2D space, then at best you can invert the map on its image, getting a 3x2 matrix which maps that 1D subspace of 2D back to a 1D subspace of 3D space.
This is why I understand the rank of the column and row space as being the same, intuitively speaking. :P
Do you mean [k1 k2 k3] [nk1 nk2 nk3] can be factored as the column [1] [n] times the row [k1 k2 k3]? I feel this is still very abstract, hmm
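That factorization is exactly an outer product, and it makes both dependences visible at once. A minimal sketch (k and n are arbitrary sample values):

```python
import numpy as np

k = np.array([2.0, 6.0, 10.0])     # the row [k1 k2 k3], sample values
n = 4.0

# The two-row matrix factors as an outer product: column [1, n] times row k.
M = np.outer(np.array([1.0, n]), k)
print(M)
# Every row of M is a multiple of k (row dependence), and every column
# is a multiple of [1, n] (column dependence) -- from the same factorization.
```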