OpenStudy (theeric):

In thermodynamics, I have to use the partition function to find \(\bar\varepsilon\). I'm not comfortable with this material, though, so any help would be appreciated! I know that \(\bar E=N\bar\varepsilon\), so I guess \(\bar\varepsilon\) is the average energy of one particle and \(\bar E\) is the average energy of the system of particles. I think this is for 1D space. The partition function we were talking about for that is \(Z_1=V\dfrac{2\pi mk_BT}{h^2}\)

OpenStudy (anonymous):

That appears to be the partition function for a single particle confined to move in a plane. If memory serves. Do you have any information about the system you're talking about? If it were 1-D, then everything except V would be inside a square root.

OpenStudy (theeric):

Thanks! I don't think there was a square root on the board, and I don't think I would've missed that. It's more likely I missed it than the professor, though... I can't find that equation for \(Z_1\) in my book. Does the subscript \(1\) indicate one dimension? Because I'm sure that it was \(Z_1\) the whole time.

OpenStudy (anonymous):

No, it indicates a single particle.

OpenStudy (theeric):

Oh! Okay, thanks.

OpenStudy (anonymous):

Can you define the system you're trying to describe? Like, an ideal gas in 3D, or something like that.

OpenStudy (theeric):

Looking at my notes, that partition function is for an ideal gas, single particle in a 3D box. The questions he gave us were to find certain values.

OpenStudy (theeric):

I still can't find it in my book, sorry.

OpenStudy (theeric):

We found it with \(Z=I^3\), but I don't know what that means. That led to a Gaussian integral, and I don't know what that is. This lecture was hard to follow. My professor has a thick accent, and I'm weak in thermodynamics.

OpenStudy (anonymous):

In general, the partition function is \[ Z = \sum_{\text{states}} e^{-\epsilon/k_BT}\] where \(\epsilon\) is the energy of each state. You can also define \( \beta \equiv \frac{1}{k_B T}\) to make your calculations easier - in which case you have \[ Z = \sum_{\text{states}} e^{-\beta \epsilon} \] The probability that your system is in state \(i\) is \[ P_i = \frac{e^{-\beta \epsilon_i}}{Z} \] Do you follow so far?
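The discrete formulas above are easy to check numerically. A minimal sketch, assuming a made-up two-level system (the energies and the value of \(\beta\) below are illustrative choices, not from the thread):

```python
import math

# Toy example of Z = sum over states of exp(-beta * eps) and
# P_i = exp(-beta * eps_i) / Z, for a list of discrete state energies.
# The two-level energies [0, 1] and beta = 1 are arbitrary illustrative values.

def partition_function(energies, beta):
    return sum(math.exp(-beta * e) for e in energies)

def probabilities(energies, beta):
    Z = partition_function(energies, beta)
    return [math.exp(-beta * e) / Z for e in energies]

energies = [0.0, 1.0]
beta = 1.0
probs = probabilities(energies, beta)
# The probabilities sum to 1, and the lower-energy state is more likely.
```

Note that dividing by \(Z\) is exactly what normalizes the Boltzmann factors into probabilities.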

OpenStudy (theeric):

So is \(\epsilon\) the \(\varepsilon\) my professor asked for? And is it the energy of the particle? Thank you! :)

OpenStudy (anonymous):

No.

OpenStudy (anonymous):

This is gonna require your full attention, so please don't wander off while we're doing this or it's going to take forever and possibly not be helpful. A system (in your particular case, a single particle, but that is certainly not a requirement) can exist in a large number of different states. Each state is defined by a set of characteristics - namely, the location and momentum of each constituent of the system. In your case, with the single particle, each state is defined by the particle's position in space and by its momentum (vector). Do you understand so far?

OpenStudy (theeric):

And my class does use \(\beta\) a lot! I see what you wrote before, and I think I understand a little. There's also a little in my book on this, but I don't see it in the partition function. My book gives the case of the harmonic oscillator, where the energy levels are given by \(\epsilon_n=\left(n+\frac12\right)\hbar\omega\). So I see what you mean about the states. All this goes together in your last post, so thanks! And I'll try to pay more attention... I made the mistake of committing to another problem... But this is my main focus. Accepting that \[\large Z = \sum_{\text{states}} e^{-\epsilon/k_BT}\]the rest makes some sense. Should we use \(\epsilon_i\) in that probability equation, to account for the energy of different states? So \(\varepsilon\) would be the mean energy of the particle, considering all its states and their respective probabilities?
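The book's harmonic-oscillator levels make a concrete test of the sum over states. A sketch in units where \(\hbar\omega = 1\); the truncation at 200 terms and the closed form \(Z = e^{-\beta/2}/(1 - e^{-\beta})\) both follow from the geometric series:

```python
import math

# Quantum harmonic oscillator: eps_n = (n + 1/2) * hbar * omega.
# Working in units where hbar * omega = 1, the truncated sum over
# states converges quickly to the geometric-series closed form.

def Z_oscillator(beta, n_max=200):
    # Direct (truncated) sum over the discrete states
    return sum(math.exp(-beta * (n + 0.5)) for n in range(n_max))

def Z_closed_form(beta):
    # Geometric series: exp(-beta/2) / (1 - exp(-beta))
    return math.exp(-beta / 2) / (1 - math.exp(-beta))

# For beta around 1, the two agree to high precision.
```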

OpenStudy (anonymous):

The biggest trouble with thermodynamics is just hastily applying procedures when you don't actually understand them, so we're going from the ground up. Do you understand what I mean by "state" of the particle? The state of a particle is specified by its position and its momentum.

OpenStudy (theeric):

Thanks! And yep! The state is a set of values that wholly describe the particle. I think momentum and position is one, and I thought there was another way to look at it. But I know the Heisenberg Uncertainty Principle uses momentum and position.

OpenStudy (anonymous):

No, we're just sticking to position and momentum. Okay. So, the partition function is the sum over all possible states of the function \( e^{-\beta \epsilon} \), where \(\epsilon\) is the energy of the state. So now we ask- what is the average energy of the system? And the answer is: \[ \overline \epsilon = \frac{\sum_{\text{states}} \epsilon \cdot e^{-\beta \epsilon}}{Z}\] Which should be fairly clear - the average energy is equal to the sum of the energies of each state weighted by the probability that the system is in that state to begin with.

OpenStudy (anonymous):

But notice that the numerator of that is simply \[ -\frac{d}{d \beta} Z \] So we can write \[ \overline \epsilon = -\frac{dZ}{d\beta} / Z \] Which is also equal to \[ \overline \epsilon = -\frac{d}{d\beta} \ln(Z) \] Okay?
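The identity \(\bar\epsilon = -\frac{d}{d\beta}\ln Z\) can be sanity-checked numerically for any discrete energy list. A sketch, with arbitrary illustrative energies and the derivative approximated by a central difference:

```python
import math

# Check that the probability-weighted average energy equals
# -d/d(beta) ln(Z), approximated here by a central difference.
# The energy list is an arbitrary illustrative choice.

def Z(energies, beta):
    return sum(math.exp(-beta * e) for e in energies)

def mean_energy_direct(energies, beta):
    # sum of eps * exp(-beta*eps), normalized by Z
    return sum(e * math.exp(-beta * e) for e in energies) / Z(energies, beta)

def mean_energy_from_lnZ(energies, beta, h=1e-6):
    # central-difference approximation of -d/d(beta) ln(Z)
    return -(math.log(Z(energies, beta + h))
             - math.log(Z(energies, beta - h))) / (2 * h)

energies = [0.0, 1.0, 2.0, 3.0]
# Both routes give the same average energy.
```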

OpenStudy (theeric):

Cool! I see that, thanks. \(\overline \epsilon = \frac{\sum_{\text{states}} \epsilon \cdot e^{-\beta \epsilon}}{Z}\) makes complete sense because of the probability. I'm looking at the next part now :)

OpenStudy (anonymous):

So the answer seems simple - we need only calculate Z and we will be able to find the average energy by simple differentiation. But it is the calculation of Z that can be quite subtle, so this is the part that I need to explain to you.

OpenStudy (theeric):

I would've never seen that... But I know I need more practice in recognizing derivatives (and integrals). And I don't think I know that natural log relation, but I will try to remember that, thanks! Okay, I'm following.

OpenStudy (anonymous):

It's the chain rule. But okay. So we need to find a way to sum over all possible states. In quantum mechanics, the states are often discrete, so this is easy. In classical thermodynamics, however, the states are not discrete - they are continuous. This complicates the process significantly, but it can be overcome. First, I need to define "phase space". Phase space is the space you get when you plot position and momentum together. In 1D, you'd plot position on the horizontal axis and momentum on the vertical axis. A point \((x, p)\) corresponds to a particular location and a particular momentum - therefore specifying the state of the particle. In more than 1D, we cannot visualize phase space because it would be more than 3D. For example, in 2D we would need two momentum directions and two position directions, yielding a 4D plot. Likewise, in three dimensions, we would need a 6D phase space. It cannot be visualized, but it can be imagined - a point in phase space specifies the state of a particle.

OpenStudy (theeric):

If it helps, we're studying semi-classical physics.

OpenStudy (theeric):

In this section, I mean.

OpenStudy (anonymous):

But we cannot add up points, because there are an uncountable infinity of them. Instead, we need to break up phase space into "cells" - little squares, like a grid. We can then count each square as a separate state.

OpenStudy (anonymous):

It doesn't matter. This is necessary, and important.

OpenStudy (theeric):

I understand the phase space, thank you! My professor never mentioned it, but I will follow! This approach made sense. I think I read about it in my book recently. My professor didn't mention that, either, but I like to know these things. So this is an arbitrary approximation that is really exact, just like the limit of the Riemann sum for integration? The infinitesimal values that add up to be exact? Just my guesses. If not, please continue. Also, thanks for helping. This takes time, so thanks. And if you want to stop for any reason, it's perfectly fine.

OpenStudy (anonymous):

It will turn out that exactly what size we make our squares is completely irrelevant, but we'll save that for a moment. Imagine we're working in 1D, so we have a 2D phase space. What should the area of each of our squares be? It would have to be some small quantity with units of position times momentum - a natural choice is Planck's constant \( h\). Let's see how we should turn the summation into an integral (for continuous position and momentum). The total number of states accessible to the particle in the discrete case is just \[\sum_{\text{states}} 1 = N_{\text{states}} \] When we integrate, we need to integrate over all positions and momenta and then divide by the area of each square: \[\frac{1}{h} \int dx \int dp \space 1 = N_{\text{states}} \] So apparently we need to replace \( \sum_{\text{states}} \) with \(\frac{1}{h} \int dx \int dp \). This is the procedure. Do you understand?

OpenStudy (anonymous):

In 3D, by the way, it would simply become \[ \sum_{\text{states}} \rightarrow \frac{1}{h^3} \int d^3x \int d^3p \]
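The sum-to-integral replacement can be illustrated with a toy cell count. A sketch in made-up units where the cell area is 1, with an invented box length and momentum cutoff; both counting routes agree by construction:

```python
# Toy 1D phase-space cell count: a particle with 0 <= x <= L and
# |p| <= p_max.  The "integral" count is (1/h) * L * (2 * p_max);
# the "grid" count tiles phase space with cells of area dx * dp = h.
# All the numbers here (h_cell, L, p_max, dx) are illustrative.

h_cell = 1.0   # area of one cell, in toy units
L = 4.0        # box length
p_max = 2.0    # momentum cutoff

# Continuous route: (1/h) * integral dx dp of 1
n_states_integral = (1.0 / h_cell) * L * (2.0 * p_max)

# Discrete route: count cells directly
dx = 0.5
dp = h_cell / dx          # so each cell has area dx * dp = h_cell
nx = round(L / dx)        # cells along the position axis
n_p = round(2.0 * p_max / dp)  # cells along the momentum axis
n_states_grid = nx * n_p
# Both counts give 16 states for these numbers.
```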

OpenStudy (theeric):

I understand somewhat. The summation of 1 over all states is the number of states, that makes complete sense. I'm looking at the integrals. I'm trying to see why we divide by \(h\). Is the number of states going to be like the number of these cells?

OpenStudy (anonymous):

Yes, that's the point. We cannot consider each point a separate state, because then it would be impossible to count them. We need to break it up into a discrete grid.

OpenStudy (theeric):

Okay, thanks for the 3D version! I think I understand then, and connected some dots :)

OpenStudy (anonymous):

Another rationale, if you'd like, is that Z must obviously be dimensionless, but without dividing by h we would have a summation that inherently has units of length times momentum.

OpenStudy (theeric):

That makes sense. I've learned to check units whenever I go into theory. Not well enough, but I know how useful it is.

OpenStudy (theeric):

:)

OpenStudy (anonymous):

Okay, so now that we understand that part, we can calculate the single-particle partition function for an ideal gas in 3D. \[ Z_1 = \frac{1}{h^3} \int d^3x \int d^3p \space e^{-\beta \epsilon} \] In classical mechanics, we know that \( \epsilon = p^2/2m = p_x^2 / 2m + p_y^2/2m + p_z^2/2m\). Therefore, we have that \[ Z_1 = \frac{1}{h^3} \int d^3x \int dp_x\, dp_y\, dp_z\, e^{-\beta p_x^2/2m}e^{-\beta p_y^2/2m}e^{-\beta p_z^2/2m}\]

OpenStudy (anonymous):

Firstly, it's clear that nothing depends on the position at all, so we can simply perform the spatial integral to obtain the total volume of our system \(V\). Secondly, it's obvious that the three momentum integrals are exactly the same, so we can simplify this to \[ Z_1 = \frac{V}{h^3} \left( \int dp\, e^{-\beta p^2/2m} \right)^3 \]

OpenStudy (theeric):

Okay.

OpenStudy (anonymous):

This is where the Gaussian integral comes into play. As it happens, \[ \int_{-\infty}^\infty dx \space e^{-x^2} = \sqrt{\pi}\] Based on this and some simple substitution, you can derive the result \[ \int_{-\infty}^\infty dx \space e^{-a x^2} = \sqrt{\frac{\pi}{a}} \] That's all we need to perform the integral we arrived at: \[ \int dp \space e^{-\beta p^2/2m} = \sqrt{\frac{2m\pi }{\beta} } \] This gives us the final result: \[ Z_1 = \frac{V}{h^3} \left( \sqrt{\frac{ 2m \pi}{ \beta }}\right)^2 = \frac{V}{h^3} \left(\frac{ 2m\pi }{\beta} \right)^{3/2} \] We can even do a bit of rearranging to get \[ Z_1 = \left( \frac{ 2\pi m V^{2/3}}{\beta h^2 } \right) ^{3/2} =\left( \frac{ 2\pi m k_B T \cdot V^{2/3}}{ h^2 } \right) ^{3/2} \] But that's just some algebraic manipulation.
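The Gaussian-integral formula \(\int_{-\infty}^\infty e^{-ax^2}\,dx = \sqrt{\pi/a}\) is easy to verify numerically with a midpoint rule. A sketch, where \(a = 2\) and the grid parameters are arbitrary check values:

```python
import math

# Midpoint-rule check of the Gaussian integral: for a > 0,
# integral of exp(-a*x^2) over the real line equals sqrt(pi/a).
# The truncation at |x| <= x_max is safe because the integrand
# decays extremely fast.

def gaussian_integral_numeric(a, x_max=10.0, n=100_000):
    dx = 2 * x_max / n
    total = 0.0
    for i in range(n):
        x = -x_max + (i + 0.5) * dx  # midpoint of cell i
        total += math.exp(-a * x * x) * dx
    return total

a = 2.0
exact = math.sqrt(math.pi / a)
# gaussian_integral_numeric(a) agrees with exact to high precision.
```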

OpenStudy (anonymous):

I'll keep going - you can digest this and then ask questions at the end.

OpenStudy (theeric):

Haha, I follow! I'm impressed and thankful. I even caught a typo. \(Z_1 = \frac{V}{h^3} \left( \sqrt{\frac{ 2m \pi}{ \beta }}\right)^\color{blue}2 = \frac{V}{h^3} \left(\frac{ 2m\pi }{\beta} \right)^{3/2}\) The blue 2 would be 3, I think. I'm just saying this so that, if you see it, you don't need to say anything about it.

OpenStudy (anonymous):

Now I want to find the average energy, so I just take the log of this whole mess and then differentiate with respect to \( \beta\). In fact, this becomes enormously simple due to the properties of the logarithm. Note that \[ \ln(Z_1) = \ln \left[ \left(\frac{2\pi m V^{2/3}}{\beta h^2} \right)^{3/2} \right] = \frac{3}{2} \ln \left(\frac{2\pi m V^{2/3}}{\beta h^2} \right) \] \[= \frac{-3}{2}\ln(\beta) + (\text{ a bunch of stuff that does not depend on }\beta)\] Yes, you're right, good catch.

OpenStudy (anonymous):

So we finally arrive at our result: \[ \bar \epsilon = -\frac{d}{d\beta} \ln(Z_1) = \frac{3}{2\beta} = \frac{3}{2} k_B T \] Which, if you recall, I mentioned to you on a thread earlier today.
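The final step can be reproduced numerically: plugging made-up values of \(V\), \(m\), and \(h\) into \(Z_1\) and differentiating \(\ln Z_1\) by finite differences gives \(3/(2\beta)\) regardless of those constants. A sketch:

```python
import math

# Check that -d/d(beta) ln(Z_1) = 3/(2*beta) for the 3D ideal-gas
# single-particle partition function.  V, m, h below are arbitrary
# illustrative constants; they drop out of the derivative.

V, m, h = 3.0, 1.7, 0.5

def ln_Z1(beta):
    return math.log(V * (2 * math.pi * m / (beta * h**2)) ** 1.5)

def mean_energy(beta, eps=1e-6):
    # central-difference approximation of -d/d(beta) ln(Z_1)
    return -(ln_Z1(beta + eps) - ln_Z1(beta - eps)) / (2 * eps)

beta = 2.0
# mean_energy(beta) comes out close to 3 / (2 * beta) = 0.75
```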

OpenStudy (theeric):

Okay!

OpenStudy (theeric):

Haha, thank you! That's funny. It's "full circle."

OpenStudy (theeric):

So that demonstrates that the partition function that we used was derived considering a monatomic ideal gas, right? Because we didn't make that assumption through the rest of this process.

OpenStudy (anonymous):

So, we have a few things to recap. Procedure: Replace summation with the appropriate integrals. Plug in the appropriate energy, which in general may depend on position and momentum. Perform the integrations, which will give you your partition function. Differentiate the partition function (or the log of your partition function) as appropriate to get your average energy. Notice how the \(h^3\) plays no role whatsoever. So, if we had used \( h, \hbar, h/2, 25h, ...\) it would not matter at all. h is just a convenient choice, which also happens to agree with the fully quantum mechanical treatment using the infinite square well potential. Finally, notice that the 3 in our final answer can be traced back to having three identical momentum integrals - a consequence of occupying 3D space. At the same time, the 2 is a result of the quadratic relationship between energy and momentum. If you had asked me instead "what is the average energy of particles restricted to a plane", the 3 would become a 2 and the result would simply be \[ \bar \epsilon = k_B T\]
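The recap's point about dimensionality can be sketched in one step: with one Gaussian momentum integral per dimension, \(Z \propto \beta^{-d/2}\), so the average energy is \(d/(2\beta)\). The constant of proportionality drops out of the derivative, just as \(h\) did. A sketch:

```python
import math

# With Z proportional to beta**(-d/2), the average energy
# -d/d(beta) ln(Z) equals d/(2*beta): 3/(2*beta) in 3D,
# 1/beta for a particle restricted to a plane (d = 2).

def mean_energy(d, beta, eps=1e-6):
    def ln_Z(b):
        # any beta-independent prefactor would vanish under d/d(beta)
        return (-d / 2) * math.log(b)
    return -(ln_Z(beta + eps) - ln_Z(beta - eps)) / (2 * eps)

# mean_energy(3, beta) is close to 3/(2*beta);
# mean_energy(2, beta) is close to 1/beta.
```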

OpenStudy (anonymous):

As a footnote - classical statistical mechanics (which is what this discipline is really called) is applicable as long as the number of particles occupying each individual state is, on average, very very very small. Once you start having a macroscopic number of particles per state, quantum effects become important and you cannot rely on the classical methods anymore.

OpenStudy (theeric):

Awesome recap, thank you very much.

OpenStudy (anonymous):

To answer your question - the number of dimensions your system is occupying is used when you replace the summation by the appropriate integrals. Other than that, our process is completely general all the way up until we wrote \[ \epsilon = p^2/2m \] which is true only for an ideal gas.

OpenStudy (theeric):

That's interesting about the effects becoming important with the macroscopic number (on the order of Avogadro's number, right?) of particles occupying that state... And that's right! You told me that earlier today! And now it's really full circle.

OpenStudy (theeric):

Thank you very much!

OpenStudy (theeric):

That's the only kind of energy an ideal gas has :)

OpenStudy (anonymous):

No, on the order of 1 or 2 or 3. There are a VAST number of possible states to be occupied.

OpenStudy (theeric):

Energy a particle of an ideal gas has, I mean..

OpenStudy (theeric):

So by macroscopic number of particles per state, you mean 1, 2, or 3 per state. So, the reference to "macroscopic" indicates that having 1, 2, or 3 particles of the same state in a system would mean that it's probably macroscopic?

OpenStudy (anonymous):

No, not really. For instance, the Pauli exclusion principle is a quantum effect that prohibits fermions (like electrons) from occupying the exact same state at the same time. If the probability that two electrons are in the same state is exceedingly low, then you can safely ignore this and continue on in the classical regime. However, if the density of particles becomes such that there is a reasonable chance to have more than one particle per state, then you obviously have to take quantum effects into account.

OpenStudy (theeric):

Okay. Neat! :) My book mentions Fermi statistics. Is Fermi statistics designed for the Pauli exclusion principle, then?

OpenStudy (theeric):

Neat stuff.

OpenStudy (anonymous):

Yes. Fermi statistics is the proper way to treat systems of fermions, and it differs from the classical approach because of the Pauli exclusion principle.
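A small sketch of how the Fermi-Dirac occupancy differs from the classical Boltzmann factor; the chemical potential \(\mu\) and the sample energies are illustrative choices, not from the thread:

```python
import math

# Fermi-Dirac occupancy 1/(exp(beta*(e - mu)) + 1) versus the classical
# Boltzmann factor exp(-beta*(e - mu)).  Far above mu the two agree;
# near and below mu, Fermi-Dirac saturates below 1 (Pauli exclusion)
# while the classical form grows without bound.

def fermi_dirac(e, mu, beta):
    return 1.0 / (math.exp(beta * (e - mu)) + 1.0)

def boltzmann(e, mu, beta):
    return math.exp(-beta * (e - mu))

beta, mu = 1.0, 0.0
# At e = 10 (far above mu): both occupancies are tiny and nearly equal,
# which is the dilute regime where the classical treatment is safe.
# At e = -5 (below mu): fermi_dirac stays under 1, boltzmann exceeds 1.
```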

OpenStudy (theeric):

Cool, thank you very much. Take care!

OpenStudy (anonymous):

No problem.
