Is there an easy proof for \[\large \sum\limits_{x=0}^n\sum\limits_{y=0}^m f(x)g(y) = \left(\sum\limits_{x=0}^n f(x)\right) \left(\sum\limits_{y=0}^m g(y)\right) \]
I found this breakup very useful in solving a couple of problems previously... just wondering if it can be proven easily.
for example \[\large \begin{align}\sum\limits_{x=0}^2\sum\limits_{y=0}^3xy &= \left(\sum\limits_{x=0}^2 x\right) \left(\sum\limits_{y=0}^3 y\right)\\~\\& = (0+1+2)(0+1+2+3) \\~\\&= (3)(6)\\~\\&=18\end{align}\]
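The concrete example above is easy to check numerically; here is a quick Python sketch (my own, not from the thread) that computes both sides:

```python
# Left side: the double sum of x*y for x = 0..2, y = 0..3
double_sum = sum(x * y for x in range(3) for y in range(4))

# Right side: the product of the two single sums
product = sum(range(3)) * sum(range(4))

print(double_sum, product)  # both equal 18
```

Both expressions come out to 18, matching the hand computation.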
And yes this looks analogous to iterated integrals \[\large \int_0^n\int_0^m f(x)g(y)dydx = \left(\int_0^n f(x) dx\right)\left(\int_0^m g(y) dy\right)\]
we can write the double sum with either sum on the outside, so the order of the sums doesn't matter
Nice, I was trying to expand it like that... I think that shows \[\large \sum\limits_{x=0}^n\sum\limits_{y=0}^m f(x)g(y) =\sum\limits_{y=0}^m\sum\limits_{x=0}^n f(x)g(y) \]
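The order-swap claim is also easy to sanity-check numerically. In this sketch the particular `f`, `g`, `n`, and `m` are arbitrary choices of mine, not from the thread:

```python
def f(x):
    return x * x + 1   # arbitrary example function

def g(y):
    return 2 * y + 3   # arbitrary example function

n, m = 4, 5

# Sum with x on the outside, then with y on the outside
xy_order = sum(f(x) * g(y) for x in range(n + 1) for y in range(m + 1))
yx_order = sum(f(x) * g(y) for y in range(m + 1) for x in range(n + 1))

print(xy_order == yx_order)  # True: both orders give the same total
```

Since addition is commutative and associative, any finite rearrangement of the same (n+1)(m+1) terms gives the same total.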
yes! I think so!
the variables \(x,y\) are independent
I don't see why a proof is necessary since multiplication distributes over addition.
I'm not so sure it's obvious to everyone, because I had to struggle a bit initially while using this, hmm
Michele used the distributive property: \[\begin{align}\sum\limits_{x=0}^n\sum\limits_{y=0}^m f(x)g(y) &= f(0)g(0) + f(0)g(1)+\cdots + f(1)g(0) + f(1)g(1)+\cdots \\~\\ &= \left(f(0)+f(1)+\cdots\right)g(0) + \left(f(0)+f(1)+\cdots\right)g(1) + \cdots\\~\\ &= \left(\sum\limits_{x=0}^n f(x)\right)(g(0)+g(1)+\cdots + g(m))\\~\\ &= \left(\sum\limits_{x=0}^n f(x)\right) \left(\sum\limits_{y=0}^m g(y)\right)\\~\\ \end{align}\] Simply writing the step below is also fine by me, since I already have some familiarity with double sums: \[\begin{align}\sum\limits_{x=0}^n\sum\limits_{y=0}^m f(x)g(y) &= \sum\limits_{x=0}^nf(x) \sum\limits_{y=0}^m g(y)\\~\\ &= \left(\sum\limits_{x=0}^n f(x)\right) \left(\sum\limits_{y=0}^m g(y)\right)\\~\\ \end{align}\]
Sigma notation is just addition. Without sigma notation we can add any values we want and they don't have to follow any pattern, but we're unable to change the total number of values we're adding. Sigma notation lets us make a trade-off: we can now have a variable number of values that we're adding up, but we have to be stricter about which values we add; they have to follow a pattern and be added consecutively. Linear algebra relates to this in that the values we're adding up don't have to follow a pattern, but they are ordered in a reference table somewhere.
I guess that's not very helpful... If we look at a concrete example it will probably make more sense: \[\Large \sum_{x=0}^n \sum_{y=0}^m x^3ye^{x+y^3}\] If we have this ridiculous thing, just separate it: \[\Large \sum_{x=0}^n \sum_{y=0}^m( x^3e^{x})(ye^{y^3})\] When you add up values of y, this function of x just multiplies every term of your summation. So you factor it out, since the inner sum is nothing more than a summation: \[\Large \sum_{x=0}^n( x^3e^{x}) \sum_{y=0}^m(ye^{y^3})\] The problem, it seems to me, is that you're just uncomfortable with the notation because it's so abstract... which is understandable, it's ridiculously abstract haha. Now you may have some problems with this; it seems incomplete in some way. It is asymmetrical to pull f(x) out of a sum of g(y) terms. But if we go back to the general formula with f(x) and g(y), you could completely exchange f and g, x and y, and n and m, and no one would be able to tell the difference. This is symmetry (as opposed to skew symmetry), and it allows us to say that if we can factor f(x) out of the sum of g(y), then we can necessarily factor g(y) out of a sum of f(x). They have exactly the same form, and this is what lets you safely lock parentheses around both summations separately from each other.
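Even the "ridiculous" example factors cleanly in practice. Here's a numeric sketch (the limits n = 2, m = 3 are my own choice for illustration, not from the thread); floating-point rounding means the two sides should be compared with a tolerance rather than exact equality:

```python
import math

n, m = 2, 3  # small illustrative limits (my choice)

# Double sum of the original expression x^3 * y * e^(x + y^3)
double_sum = sum(x**3 * y * math.exp(x + y**3)
                 for x in range(n + 1) for y in range(m + 1))

# Factored form: (sum of x^3 * e^x) * (sum of y * e^(y^3))
factored = (sum(x**3 * math.exp(x) for x in range(n + 1))
            * sum(y * math.exp(y**3) for y in range(m + 1)))

print(math.isclose(double_sum, factored))  # True
```

The separation works because \(e^{x+y^3} = e^x e^{y^3}\), turning the summand into a pure product of an x-part and a y-part.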