Derivatives with respect to two variables at the same time.
I was looking over complex differentiation, and realized that differentiation with respect to a complex variable is essentially differentiation with respect to two variables simultaneously, which is not something we ever see in vector calculus.
With complex differentiation, we say: \[ f'(z) = \lim_{\Delta z\to 0}\frac{f(z+\Delta z)-f(z)}{\Delta z} \]
It looks exactly like real differentiation, but what makes it special is the fact that \(\Delta z\) is a vector, in a way.
We can think of complex functions as \(f:\mathbb R^2\to \mathbb R^2\)
We can have such a function in vector calculus; in fact, a change of variables works similarly.
But even when a vector function maps from \(\mathbb R^2\) to \(\mathbb R^2\), we typically only differentiate with one variable at a time.
The only thing that complex arithmetic has that vector arithmetic seems to lack is a multiplication between two vectors in \(\mathbb R^n\) that results in a vector in \(\mathbb R^n\), and thus a division operation.
So let me give an example... \[ f(x,y) = \langle x,y\rangle \]
We can think of this as a complex function: \[ f(z) = z \] where \(z=\langle x,y\rangle\)
And clearly we have \(f'(z)=1\)... or you could say \(f'(x,y) = \langle 1,0\rangle \)
To be clear, I'm saying \(z=x+iy = \langle x,y\rangle\)
I'm not so sure I completely agree with this result, something seems fishy here lol
Which result?
For complex numbers, the multiplication operation we are using is: \[ \langle a,b \rangle \circ \langle c,d\rangle = \langle ac-bd,ad+bc \rangle \]
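As a quick sanity check, that componentwise rule is exactly what Python's built-in complex multiplication does (a minimal sketch; the sample values are arbitrary):

```python
# Complex multiplication written as an operation on pairs (a, b) in R^2,
# checked against Python's built-in complex type.
def cmul(p, q):
    a, b = p
    c, d = q
    # <a,b> o <c,d> = <ac - bd, ad + bc>
    return (a * c - b * d, a * d + b * c)

p, q = (1.0, 2.0), (3.0, -1.0)
builtin = complex(*p) * complex(*q)
assert cmul(p, q) == (builtin.real, builtin.imag)
print(cmul(p, q))  # (5.0, 5.0)
```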
Just the derivative, I guess I don't really understand what it means to take the derivative with respect to a complex variable, it's weird.
\(z/w = z w^*/|w|^2\) is equivalent to what you're saying, right?
It's weird because we never do this in vector calc. We don't have any sense of dividing a vector by another.
Yeah, I think so.
\[\Large \frac{z}{w}=\frac{z w^*}{ww^*} = \frac{zw^*}{|w|^2}\] Which is essentially how division is done in geometric algebra, so I suppose it is pretty much the same interpretation whatever that is, except this is commutative. So I suppose that division is really dividing one vector by the other one's length and "unwinding" by rotating backwards by that amount at the same time.
It's more obvious from this perspective:\[\Large z= r e^{i \theta}\] We are just dividing the length as normal, but we are also rotating backwards by the angle \(\theta\) the vector makes with the positive real axis. So we still have the original idea of division built in, it seems.\[\Large \frac{1}{z} = \frac{1}{re^{i \theta}}=\frac{1}{r}e^{-i \theta}\]
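To make the two pictures concrete, here's a quick Python sketch checking both views of division against the built-in operator (the particular values of \(z\) and \(w\) are just illustrative):

```python
import cmath

z, w = 3 + 4j, 1 + 2j

# View 1: multiply by the conjugate, divide by the squared length
via_conj = z * w.conjugate() / abs(w) ** 2

# View 2: polar form -- divide the lengths, subtract the angles
r, theta = cmath.polar(w)
via_polar = cmath.rect(abs(z) / r, cmath.phase(z) - theta)

assert cmath.isclose(via_conj, z / w)
assert cmath.isclose(via_polar, z / w)
```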
We still end up with \(zw^*\) being a vector though. So we need multiplication somehow. Though I think division by a scalar is much more trivial than division by a vector.
Since in 3 dimensions you are still just dealing with 2 vectors in space, they lie within their own plane, so I think we can probably take this idea with us.
I suppose I am not against a multiplication that results in a multivector in the future, it's just for now I want to consider the \(\mathbb R^n\to \mathbb R^n\) case.
Oh, I'm not really even saying anything about multivectors. Complex division is the same as dividing one value by the length of the other and then rotating backwards by the angle that vector is rotated away from the real axis.
Yeah, I'm not completely sure how complex differentiation results in anything interesting
But they're teaching people about it for some reason or another
Actually, now that I think about it... I'm thinking about the \(\mathbb R^k\to \mathbb R^n\) case where \(k \leq n\).
We know how to do this when \(k=1\), and complex differentiation is doing \(k=2\)
This is interesting, apparently the Cauchy-Riemann equations relate the partial derivatives of the real and imaginary parts of \[\Large f(x + iy) = u(x, y) + iv(x, y)\] \[\Large \frac{ \partial u }{ \partial x} = \frac{\partial v}{\partial y} \\ \Large \frac{\partial u}{ \partial y} = - \frac{ \partial v}{\partial x}\]
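A rough numerical sketch of those equations for \(f(z)=z^2\), where \(u(x,y)=x^2-y^2\) and \(v(x,y)=2xy\); the step size `h` and the sample point are arbitrary choices:

```python
# Check the Cauchy-Riemann equations for f(z) = z**2 with
# central finite differences.
u = lambda x, y: x * x - y * y   # real part of z**2
v = lambda x, y: 2 * x * y       # imaginary part of z**2

def partial(g, x, y, wrt, h=1e-6):
    """Central-difference approximation of dg/dx or dg/dy."""
    if wrt == "x":
        return (g(x + h, y) - g(x - h, y)) / (2 * h)
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

x, y = 1.3, -0.7
assert abs(partial(u, x, y, "x") - partial(v, x, y, "y")) < 1e-6
assert abs(partial(u, x, y, "y") + partial(v, x, y, "x")) < 1e-6
```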
What do those equations mean?
Are they just always true?
\[ f(z) = z^n \implies f'(z) = nz^{n-1} \]
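A little numerical sketch of why that formula is believable: for \(f(z)=z^3\) the difference quotient tends to \(3z^2\) no matter which direction \(\Delta z \to 0\) comes from (the point \(z_0\) and the directions are arbitrary):

```python
import cmath

# The difference quotient for f(z) = z**3 approaches 3*z**2 along
# every approach direction -- the hallmark of the complex derivative.
f = lambda z: z ** 3
z0 = 1 + 1j
expected = 3 * z0 ** 2   # 6j

for angle in (0.0, cmath.pi / 4, cmath.pi / 2, 2.0):
    dz = 1e-6 * cmath.exp(1j * angle)   # small step along this direction
    quotient = (f(z0 + dz) - f(z0)) / dz
    assert abs(quotient - expected) < 1e-4
```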
One thing about complex multiplication that is special is that it has the properties necessary to make differentiation rules work
They apparently hold whenever the complex derivative exists, but digging deeper there are some more interesting things.
Another thing about complex derivatives is that you can think about them as being like a single-variable derivative so long as the entire function can be parametrized in terms of \(z\).
For example, \(f(x,y) = \langle x,\sin(y)\rangle \implies f(z)= \Re (z)+i\sin(\Im (z))\) . We don't really have a clean function of \(z=\langle x,y\rangle\), and we sort of tear out each component.
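A sketch of what "tearing out each component" costs: for this \(f\), the difference quotient depends on the direction \(\Delta z\) approaches 0, so the complex derivative doesn't exist (except where \(\cos y = 1\)). The sample point is just illustrative:

```python
import math

# f(z) = Re(z) + i*sin(Im(z)) as a function on complex numbers.
f = lambda z: complex(z.real, math.sin(z.imag))

z0 = 0.5 + 1.0j
h = 1e-7
along_x = (f(z0 + h) - f(z0)) / h                # dz along the real axis
along_y = (f(z0 + 1j * h) - f(z0)) / (1j * h)    # dz along the imaginary axis

# along_x ~ 1, but along_y ~ cos(1) ~ 0.54: the limits disagree,
# so f has no complex derivative at z0.
assert abs(along_x - 1) < 1e-4
assert abs(along_y - math.cos(1)) < 1e-4
assert abs(along_x - along_y) > 0.3
```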