OpenStudy (thomas5267):

How to calculate the gradient, divergence, curl and Laplacian under arbitrary coordinate systems?

OpenStudy (thomas5267):

@ganeshie8 @Kainui

OpenStudy (bobo-i-bo):

Given a scalar function F(x,y,z), if you want to change it to a coordinate system (a,b,c), where a(x,y,z), b(x,y,z), c(x,y,z), then: \[\nabla F= \frac{ \partial F}{ \partial x}\hat{x}+\frac{ \partial F}{ \partial y}\hat{y}+\frac{ \partial F}{ \partial z}\hat{z}\] Then use the multivariable chain rule, for example: \[\frac{\partial F}{\partial x} = \frac{\partial F}{\partial a}\frac{\partial a}{\partial x} +\frac{\partial F}{\partial b}\frac{\partial b}{\partial x} + \frac{\partial F}{\partial c}\frac{\partial c}{\partial x}\] Hopefully that gives you a good enough idea of what to do? I would be more explicit, but it's too much writing. If you don't get it, I'll write a fuller explanation. :P
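A minimal SymPy sketch of this chain-rule conversion (the field and the choice of polar coordinates standing in for (a,b,c) are illustrative, not from the thread):

```python
# Check dF/dx = dF/dr * dr/dx + dF/dtheta * dtheta/dx for F = x*y,
# with polar coordinates (r, theta) standing in for (a, b, c).
import sympy as sp

x, y = sp.symbols('x y', positive=True)
r_xy = sp.sqrt(x**2 + y**2)     # r(x, y)
th_xy = sp.atan2(y, x)          # theta(x, y)

r, th = sp.symbols('r theta', positive=True)
F_polar = r**2 * sp.sin(th) * sp.cos(th)   # F = x*y written in polar form

# dF/dr and dF/dtheta, with r(x,y) and theta(x,y) substituted back in:
dF_dr = sp.diff(F_polar, r).subs({r: r_xy, th: th_xy})
dF_dth = sp.diff(F_polar, th).subs({r: r_xy, th: th_xy})

chain = dF_dr * sp.diff(r_xy, x) + dF_dth * sp.diff(th_xy, x)
assert sp.simplify(chain - y) == 0   # direct derivative of x*y w.r.t. x is y
```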

OpenStudy (thomas5267):

Yes, I do understand what you are saying. Do you do the same with curl, divergence, and the Laplacian? That's a lot of work...

OpenStudy (kainui):

Have you learned any tensor notation officially yet? In tensor notation you can write these things MUCH more easily:

gradient components: \(\nabla_i f\)
divergence: \(\nabla_i v^i\)
curl components: \(\varepsilon^{ijk}\nabla_j v_k\)
Laplacian: \(\nabla_i\nabla^i f\)

Each of these is sort of just arbitrarily written; you can juggle indices with the metric tensor, and these covariant derivative operators \(\nabla_i\) are derived by taking the derivative and pulling out the stuff that depends on the basis vectors... Haha. This is probably a bit intense to just throw at you, so I'll show how it's derived real fast.

First let's start with a vector \(\vec V\) in an arbitrary basis, so let's just say its basis is \(\vec Z_i\) and its components are \(V^i\), completely general, so \(\vec V = V^i \vec Z_i\) using Einstein summation notation. Now let's take the derivative with respect to the coordinates \(Z^j\), which the components AND the basis vectors depend on, so the product rule has to be applied to both of 'em: \[\frac{\partial}{\partial Z^j} \vec V = \frac{\partial V^i}{\partial Z^j} \vec Z_i + V^i \frac{\partial \vec Z_i}{\partial Z^j} \]

The \(\frac{\partial \vec Z_i}{\partial Z^j} \) are the derivatives of the basis vectors with respect to the coordinates, and since these are themselves vectors, we can represent them as a linear combination of the basis vectors, with components called "Christoffel symbols": \[\frac{\partial \vec Z_i}{\partial Z^j} = \Gamma_{ij}^k \vec Z_k\]

So plug this back in: \[\frac{\partial}{\partial Z^j} \vec V = \frac{\partial V^i}{\partial Z^j} \vec Z_i + V^i \Gamma_{ij}^k \vec Z_k \]

Because these are just dummy indices of summation, we can exchange the pair of k's with the pair of i's in the term we just fixed, so that we can factor out a \(\vec Z_i\) like I mentioned at the very start of all this: \[\frac{\partial}{\partial Z^j} \vec V = \frac{\partial V^i}{\partial Z^j} \vec Z_i + V^k \Gamma_{kj}^i \vec Z_i = \left( \frac{\partial V^i}{\partial Z^j} + V^k \Gamma_{kj}^i \right)\vec Z_i\]

It's specifically this chunk that we use to define that guy: \[\frac{\partial}{\partial Z^j} \vec V = \left( \frac{\partial V^i}{\partial Z^j} + V^k \Gamma_{kj}^i \right)\vec Z_i = (\nabla_j V^i ) \vec Z_i\]

Why did I walk through all of this, you might ask? Well, to be honest, I couldn't remember if it was a + or a - sign (it's - if the components are covariant, actually, lol), so I thought I'd just do it here and maybe you'd get something out of it too... BUT this is definitely pretty scary, so hopefully it's not TOO bad. And look, if you restrict yourself to coordinate systems where the basis vectors don't change as you move, then \(\Gamma_{ij}^k=0\), which simplifies things a lot. There's a nicer formula for the Laplacian too if you take the determinant of the metric tensor; I forget the exact form for it.
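Since everything below hinges on the Christoffel symbols, here is a small SymPy sketch (illustrative, not from the thread) computing them for polar coordinates from the standard formula \(\Gamma^k_{ij} = \frac{1}{2} Z^{kl}\left(\partial_i Z_{jl} + \partial_j Z_{il} - \partial_l Z_{ij}\right)\):

```python
# Christoffel symbols Gamma^k_{ij} for polar coordinates, derived from the
# metric Z_ij = diag(1, r**2) via the standard half-metric-derivative formula.
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
coords = [r, th]
g = sp.Matrix([[1, 0], [0, r**2]])   # covariant metric Z_ij
g_inv = g.inv()                      # contravariant metric Z^ij
n = 2

Gamma = [[[sp.simplify(
    sp.Rational(1, 2) * sum(
        g_inv[k, l] * (sp.diff(g[j, l], coords[i])
                       + sp.diff(g[i, l], coords[j])
                       - sp.diff(g[i, j], coords[l]))
        for l in range(n)))
    for j in range(n)] for i in range(n)] for k in range(n)]

print(Gamma[0][1][1])   # Gamma^r_{theta theta} = -r
print(Gamma[1][0][1])   # Gamma^theta_{r theta} = 1/r
```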

OpenStudy (thomas5267):

Tensor notation will not be officially taught until next year at the very least. That said, it was surprisingly easy to understand. Am I just mad, or does the notation of the Laplacian indicate that it is invariant under coordinate transformations?

OpenStudy (kainui):

Ah, so I derived the result where there was an index on it; for when there's no index on it, \[\nabla_i f = \frac{\partial f}{\partial Z^i}\] which gives us the components of the gradient. However, the divergence of the gradient is the Laplacian; in other words, we are now looking at replacing \(V_i = \frac{\partial f}{\partial Z^i}\) in the equation I derived above. So I'll now throw the contravariant metric tensor (literally the inverse of the covariant metric tensor, which is just the dot products of all the basis vectors, \(\vec Z_i \cdot \vec Z_j\)) on the Laplacian (the definition I gave in the very first post), since \(Z^{ik}\nabla_k = \nabla^i\): \[Z^{ik} \nabla_k \nabla_i f = Z^{ik} \nabla_k \left( \frac{\partial f}{\partial Z^i}\right)\] So, plugging into the equation we derived last post (and asserting without proof that the covariant version makes the Christoffel term negative; I believe I also left out a detail about the placement of indices, but I don't want to derive every last bit, though I can if you're interested, it's not much more), while trying my very best not to screw up the indices lol: \[Z^{ik} \left (\frac{\partial^2 f}{\partial Z^i \partial Z^k} - \Gamma_{ik}^l \frac{\partial f}{\partial Z^l} \right)\] So indeed, the \(\Gamma\) showing up means we've got some coordinate dependence in the individual terms. As long as we stick to affine coordinate systems, though, this term is 0. But if you're confined to the surface of a sphere -- well, no such luck.
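The "nicer formula" with the determinant of the metric mentioned earlier is presumably the standard one, \(\nabla^2 f = \frac{1}{\sqrt{g}}\,\partial_i\!\left(\sqrt{g}\,Z^{ij}\,\partial_j f\right)\) where \(g = \det(Z_{ij})\). A quick SymPy sketch (illustrative) checking that it reproduces the familiar polar Laplacian:

```python
# Laplacian via the determinant formula,
# lap f = (1/sqrt(g)) d_i( sqrt(g) g^{ij} d_j f ),
# evaluated for the polar metric diag(1, r**2).
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
coords = [r, th]
g = sp.Matrix([[1, 0], [0, r**2]])
g_inv, sqrt_g = g.inv(), sp.sqrt(g.det())   # sqrt(g) = r

f = sp.Function('f')(r, th)
lap = sum(sp.diff(sqrt_g * g_inv[i, j] * sp.diff(f, coords[j]), coords[i])
          for i in range(2) for j in range(2)) / sqrt_g

# Prints the familiar polar Laplacian: f_rr + f_r/r + f_thth/r**2
print(sp.simplify(lap))
```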

OpenStudy (kainui):

I just now realized I probably should have put the metric tensor on the first thing we differentiated. It turns out the metric tensor is in a sense like a constant, since \(\nabla_i Z_{jk} = 0\) (you can derive this from the definition \(Z_{ij} = \vec Z_i \cdot \vec Z_j\)), so \[\nabla^i \nabla_i =\nabla^i(Z_{ij}\nabla^j) = Z_{ij} \nabla^i \nabla^j =(Z_{ij} \nabla^i )\nabla^j = \nabla_j \nabla^j = \nabla_i\nabla^i\] Also, despite having that gamma stuff in there, it actually obeys the product rule, lol, so here's some motivation. xD \(\nabla_i(ab) = (\nabla_i a) b+a (\nabla_i b)\)
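A quick symbolic check (illustrative; the polar Christoffel symbols are hard-coded from the earlier sketch) that \(\nabla_k Z_{ij} = \partial_k Z_{ij} - \Gamma^l_{ki} Z_{lj} - \Gamma^l_{kj} Z_{il}\) really vanishes:

```python
# Verify that the polar metric is covariantly constant: nabla_k Z_ij = 0.
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
coords = [r, th]
g = sp.Matrix([[1, 0], [0, r**2]])

# Nonzero Christoffel symbols for polar coordinates, keyed as (k, i, j).
Gamma = {(0, 1, 1): -r, (1, 0, 1): 1/r, (1, 1, 0): 1/r}
G = lambda k, i, j: Gamma.get((k, i, j), 0)

for k in range(2):
    for i in range(2):
        for j in range(2):
            cov = (sp.diff(g[i, j], coords[k])
                   - sum(G(l, k, i) * g[l, j] + G(l, k, j) * g[i, l]
                         for l in range(2)))
            assert sp.simplify(cov) == 0
```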

OpenStudy (kainui):

You can probably go through all of this stuff in the same way without the suppressed summation notation and get whatever results you want, without using all this super condensed notation, since you seem to be following the reasoning more or less. Hahaha, anyways, I'll stop puking words at you.

OpenStudy (thomas5267):

How am I supposed to interpret \(\nabla_i\nabla^i f\)? If I am not mistaken, the i should be summed over, and hence the Laplacian is invariant under coordinate changes? That seems wrong, although the Laplacian is a scalar...

OpenStudy (thomas5267):

What does it mean to have covariant components? Does that mean the basis vectors corresponding to the covariant components are contravariant?

OpenStudy (kainui):

Ahhh, I see what you're saying. Yes, the Laplacian is invariant under coordinate change because it's a scalar. All tensors are really invariant under coordinate change, but how you get there might end up looking different, since we can look at some other coordinate system: \[\nabla^i \nabla_i f = \nabla^{i'} \nabla_{i'} f\] At every point the Laplacian should be the same regardless of what coordinate system you use. Similarly, what if we just look at the gradient? \[\nabla^i f\] Those are just the components, but there is an invariant vector field there if you contract them with the basis vectors: \[(\nabla^i f )\vec Z_i\] This is also an invariant; it shouldn't matter how you look at the flow of wind/water/heat/etc., and in tensor notation it surely doesn't, thankfully.
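Here's a concrete SymPy check of that invariance (the field and the choice of coordinates are illustrative): the Laplacian of the same scalar field computed in Cartesian and in polar coordinates agrees point by point:

```python
# The Laplacian is the same scalar whether computed in Cartesian or polar form.
import sympy as sp

x, y = sp.symbols('x y')
r, th = sp.symbols('r theta', positive=True)

f_cart = x**2 * y
lap_cart = sp.diff(f_cart, x, 2) + sp.diff(f_cart, y, 2)        # = 2*y

f_polar = f_cart.subs({x: r*sp.cos(th), y: r*sp.sin(th)})       # same field in (r, theta)
lap_polar = (sp.diff(r * sp.diff(f_polar, r), r) / r
             + sp.diff(f_polar, th, 2) / r**2)

# Compare both answers in the same (polar) coordinates:
lap_cart_polar = lap_cart.subs({x: r*sp.cos(th), y: r*sp.sin(th)})
assert sp.simplify(lap_polar - lap_cart_polar) == 0
```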

OpenStudy (kainui):

Ahh, I see I overlooked your other question, which is pretty important: "What does it mean to have covariant components? Does that mean the basis vectors corresponding to the covariant components are contravariant?"

Covariant and contravariant refer to where the index is, up or down. So \(V^i\) is contravariant and \(V_i\) is covariant. The relationship between these objects is the metric tensor: \[Z_{ij} V^i = V_j\] Really, you can think of one as shorthand for the other; it doesn't give us anything new except an alternative way to calculate stuff, essentially some wiggle room. Covariant things contracted with contravariant tensors ALWAYS give us invariants. This is one of the most important aspects of tensors.

In order to be called a tensor, you must obey this transformation property: \[T^i \frac{\partial Z^{i'}}{\partial Z^i} = T^{i'}\] This is really a bunch of equations stuck together, probably best written as a matrix. However, if we look at just one of these, \[u^1 \frac{\partial x^{1'}}{\partial x^1} + u^2 \frac{\partial x^{1'}}{\partial x^2} = u^{1'}\] we can drop the numbers and call \(x^1 =x\), \(x^2 = y\), \(x^{1'} = r\) and \(x^{2'}=\theta\); then, using subscripts to denote which component instead of superscripts: \[u_x \frac{\partial r}{\partial x} + u_y \frac{\partial r}{\partial y} = u_r\] Similarly, covariant tensors change in the opposite way to this: \[T_i \frac{\partial Z^{i}}{\partial Z^{i'}} = T_{i'}\]

However, I have kinda just puked a lot of stuff at you, so it's no doubt entirely indigestible at this point. Oh well. Yeah, if you want the holes filled in I could tell you, but I feel like it'd probably be much more cohesive to just take the course and learn it in some more logical order. I've sort of forgotten the order in which things are motivated, since I've already proven to myself that things are consistent in one way or another (a lot of proofs in tensor calculus are surprisingly simple), so you can repeat most of them in a pinch. The notation is really great; the two main things I enjoy are how it basically makes matrix multiplication commutative and how it takes different vector calculus identities and condenses them down into one formula that usually comes about more naturally. Anywho.
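A small SymPy sketch of that transformation law in action, for the Cartesian-to-polar example above (names are illustrative):

```python
# Contravariant transformation T^{i'} = T^i dZ^{i'}/dZ^i for Cartesian -> polar.
import sympy as sp

x, y = sp.symbols('x y')
r_xy = sp.sqrt(x**2 + y**2)
th_xy = sp.atan2(y, x)

u_x, u_y = sp.symbols('u_x u_y')   # Cartesian components of some vector

u_r = u_x * sp.diff(r_xy, x) + u_y * sp.diff(r_xy, y)
u_th = u_x * sp.diff(th_xy, x) + u_y * sp.diff(th_xy, y)

print(sp.simplify(u_r))    # (u_x*x + u_y*y)/sqrt(x**2 + y**2)
print(sp.simplify(u_th))   # (-u_x*y + u_y*x)/(x**2 + y**2)
```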

OpenStudy (thomas5267):

Can a vector have covariant "coordinates" and a contravariant "basis", as in \(V^i\vec{Z}_i + V_k\vec{Z^k}\)? Are there any examples of that? The only possible candidates I could think of are special relativity four-vectors. Furthermore, what is a vector in tensor terminology? Does a vector have to be invariant and hence have no indices? However, the term "basis vector" is frequently used as well, and basis vectors are certainly covectors.

OpenStudy (thomas5267):

Everything gets conflated and convoluted in an elementary linear algebra course. The vectors in those courses are nx1 or 1xn arrays of coordinates, the basis vectors are also nx1 or 1xn arrays of numbers, and everything is a two-dimensional array of numbers (some more two-dimensional than others, hehe). The problem is that the arrays of coordinates are not vectors but "contravectors", the basis vectors are not vectors but covectors, and a two-dimensional array of numbers has one contravariant and one covariant index!

OpenStudy (kainui):

"Can a vector have covariant "coordinates" and contravariant "basis", as in \(V^i\vec{Z}_i + V_k\vec{Z^k}\)? Are there any example of that? " Yeah, that would be an example of that, however it's unnecessary and confusing to do so. Each term is a scalar, summed on its own indices, so we can rewrite that as \(V^i\vec{Z}_i + V_i\vec{Z^i}\) if we want to, it's only the terms multiplied together that we're considering the sum on, although we can combine them. If we were to write it out, it'll look like this in 3D: \[ V^i\vec{Z}_i + V_k\vec{Z^k} = \sum_{i=1}^3 V^i\vec{Z}_i + \sum_{k=1}^3 V_k\vec{Z^k} \] exactly the same if we were to write it as: \[ V^i\vec{Z}_i + V_i\vec{Z^i} = \sum_{i=1}^3 V^i\vec{Z}_i + \sum_{i=1}^3 V_i\vec{Z^i} \] Since each summation is to itself, although this is less than ideal because you're mixing and matching for no real reason. The contravariant and covariant components don't contain more or less information, it just might appear different. The difference is the same kind of superficiality as choice of coordinates, once you contract away indices. I will explain more in this next bit. We can use the fact that the covariantant and contravariant metric tensors are inverses of each other, and introduce the identity matrix (Kronecker delta) between the to terms to flip them around just as well. I'll use parenthesis just to show more clearly what I'm looking at, and \(V_k \vec Z^k= V_\ell \delta_k^\ell \vec Z^k = V_\ell (Z^{\ell j}Z_{j k} ) \vec Z^k=(V_\ell Z^{\ell j})(Z_{j k} \vec Z^k)=V^j \vec Z_j=V^k\vec Z_k\) So now the point of this being, your example we can write without loss of generality: \[ V^i\vec{Z}_i + V_k\vec{Z^k} = V^i\vec{Z}_i + V^i\vec Z_i= \sum_{i=1}^3 V^i\vec{Z}_i + \sum_{i=1}^3 V^i\vec{Z}_i = 2\sum_{i=1}^3 V^i\vec{Z}_i \] "The only possible candidate I could think of are special relativity four-vectors." That's due to something else, it's because of the form of metric tensor in relativity. The components of 4-vectors are either ALL covariant or ALL contravariant depending on what representation you're looking at, but each way is related by the metric tensor like discussed above. "Furthermore, what is a vector in tensor terminology? Does a vector have to be invariant and hence have no indices? However, the term "basis vectors" are frequently used as well and "basis vectors" are certainly covectors." This is unfortunately a hard question to answer, since in general if you are considering curved spaces you run into troubles defining what a vector is and what a basis vector is. A vector without indices is like a scalar in a sense. Anything without indices is invariant if it's constructed appropriately, so like a a point in space has a wind velocity, that's some vector. But since space can be curved, they exist in something called the tangent space. At each point in space we assign a flat vector space called the tangent space, and it's constructed by looking at how the curvilinear coordinates change around that point. In fact, that's what the basis vectors are, they're really like operators on your manifold that tell you (intrinsically) how the space is changing... So if you wanna move stuff around you gotta do it by hopping it from one tangent space to the next, and so you need integrals to move stuff around with something called a connection (the \(\Gamma\) symbols from earlier actually) and this is called parallel transport if you do it along some special lines called geodesics. 
I think that's a bit ahead of what you can handle right now but I'm just trying to get you a feel for some of the main stuff that happens.
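To make the index gymnastics concrete, a tiny numeric sketch (illustrative values; the polar metric evaluated at r = 2) of lowering with \(Z_{ij}\) and raising back with \(Z^{ij}\):

```python
# Lowering an index with Z_ij and raising it back with Z^ij = (Z_ij)^{-1}.
import sympy as sp

Z_lo = sp.Matrix([[1, 0], [0, 4]])   # covariant metric Z_ij (polar, at r = 2)
Z_hi = Z_lo.inv()                    # contravariant metric Z^ij

V_up = sp.Matrix([3, 5])             # contravariant components V^i
V_dn = Z_lo * V_up                   # covariant components V_j = Z_ij V^i
assert Z_hi * V_dn == V_up           # raising undoes lowering

# Contracting covariant with contravariant gives the invariant V^i V_i:
print((V_up.T * V_dn)[0])            # 3*3 + 5*20 = 109
```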

OpenStudy (kainui):

"Everything gets conflated and convoluted in an elementary linear algebra course. The vectors in those courses are nx1 or 1xn arrays of coordinates, the basis vectors are also nx1 or 1xn arrays of numbers, everything is two dimensional array of numbers (some more two dimensional than others hehe). The problem is the arrays of coordinates are not vectors but "contravectors", the basis vectors are not vectors but covectors, and two dimensional array of numbers has one contravariant and one covariant component!" Well this isn't entirely true, for the most part linear algebra without thinking about tensors is just tensor calculus in flat space. In flat space your metric tensor becomes the identity matrix so it doesn't matter and you often see \(V^i = V_i\) because they're the same. They will call these vectors, but really they are the components of vectors, which you can then expand in a basis, for example like this. \[\vec V = V^1 \vec e_1 + V^2 \vec e_2\] But upper and lower indices realy are interchangeable at this level so it doesn't really matter you can make them all lower or whatever.

OpenStudy (thomas5267):

So we can have vectors with both contravariant and covariant components, but it is entirely stupid to do so because we can always contract with the metric tensor to make them all contravariant or all covariant?

What do the basis vectors have to do with the tangent space? Are the basis vectors the basis of the tangent space?

I looked up "parallel transport" on Wikipedia and I do not like what I saw. I saw the words "tangent bundle" and "vector bundle". I looked up fiber bundles a day or two ago, and I feel like I need to learn topology before proceeding...

The problem I have with the way they teach an elementary linear algebra course is that I have to unlearn a lot of things before learning tensor calculus, which I think is part of advanced linear algebra because tensors are multilinear maps. Learning is hard and unlearning is even harder. There is pretty much no point in introducing a completely specific notation (indices that only work in flat spaces) early on only to change it later.

OpenStudy (kainui):

Nah, you shouldn't feel that way at all; linear algebra is fundamental for understanding tensors. Tangent spaces ARE the flat spaces from linear algebra. Multilinear maps are just multiple linear maps, so if you don't know what a linear map is, you'll have a tough time learning multiple linear maps stuck together at the same time haha. An example of a multilinear map is the determinant, since it's linear in each of its columns; for instance, for a 2x2 matrix determinant: \[\det(\vec v ,a\vec u+b\vec w) = a\det(\vec v, \vec u) +b \det( \vec v, \vec w)\] But it's multilinear, so you could do this in the first column as well. Overall, tensor notation is not really anything fundamentally different; it's mostly just throwing away summation signs on matrix products. Linear algebra and calculus are fundamental, and vector calculus and tensors are where the two subjects become one.
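That multilinearity is easy to verify symbolically (illustrative SymPy sketch):

```python
# The 2x2 determinant is linear in its second column:
# det(v, a*u + b*w) = a*det(v, u) + b*det(v, w).
import sympy as sp

a, b = sp.symbols('a b')
v = sp.Matrix(sp.symbols('v1 v2'))
u = sp.Matrix(sp.symbols('u1 u2'))
w = sp.Matrix(sp.symbols('w1 w2'))

det2 = lambda c1, c2: sp.Matrix.hstack(c1, c2).det()
lhs = det2(v, a*u + b*w)
rhs = a*det2(v, u) + b*det2(v, w)
assert sp.expand(lhs - rhs) == 0
```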
