Problem set 2, Statistical Mechanics, Chem 358, due Friday, October 17

  1. Mathcad simulation questions based on the notes “Analyzing a simple model in Statistical Mechanics”.

In the next two questions we will perform the simulation experiments I discussed in class, and which are described to some extent in the first set of lecture notes. The first experiment is a “diffusion experiment”. Put a number of balls in a cyclically arranged set of N boxes, such that box N connects to box 1. Starting from an initial distribution we randomly move the balls around. We will find that the balls become evenly distributed and that the entropy tends to increase, until it reaches a maximum and simply fluctuates. In the second problem we arrange the boxes in a linear fashion, such that there is a first and a last box. Then we randomize by repeatedly picking two balls and moving one up one box and the other down one box, provided the move is allowed. Again the entropy increases, reaches a maximum and then fluctuates. The balls will now be distributed according to a Boltzmann distribution, $p_n \propto e^{-\beta n}$, for boxes $n = 1 \ldots N$. This problem is the same as equilibrating energy levels while preserving the total energy of the ensemble.

Problem 1. Simulations and Entropy. I. Diffusion.

Take a sample of M “balls” and N boxes. Use M=1000 and N=6 for example. As our simulation proceeds, we will assign to every element of the sample a particular box $b_i \in \{1, \ldots, N\}$. Initially put all of the balls in one box (or a few boxes), so we are far from equilibrium. From the sample we can count how many balls are in each box. This defines the populations $a_n$, and we can associate frequencies $p_n = a_n/M$ and an entropy function $S = -\sum_{n=1}^{N} p_n \ln p_n$. As we reach equilibrium we expect each of the frequencies to approach $1/N$. A random move consists (in analogy with the example below) of randomly choosing any two entries i and j in the sample, and moving the first one box up, and the second one box down. In addition we use the cyclic condition: moving ‘up’ from box N puts you in box 1; moving down from box 1 puts the ball in box N. At every step of the simulation you should keep track of the populations $a_n(s)$, where s indicates the simulation step. Of course the most economical way to do this is to keep track of the changes to the populations directly, as defined by moving a pair of balls up and down, rather than recounting the number of balls in each box. From the populations we can calculate the frequencies $p_n(s) = a_n(s)/M$ and the entropy $S(s) = -\sum_{n=1}^{N} p_n(s) \ln p_n(s)$, which are all functions of the simulation step.
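A minimal sketch of how such a simulation could be organized is given below (in Python/NumPy rather than Mathcad; boxes are labeled 0..N-1 instead of 1..N, and all variable names are illustrative, not prescribed by the notes):

```python
import numpy as np

M, N = 1000, 6                        # number of balls and number of boxes
rng = np.random.default_rng(0)

box = rng.integers(0, 3, size=M)      # initial sample: every ball in one of the three lowest boxes
pop = np.bincount(box, minlength=N)   # populations a_n, updated incrementally below

def entropy(pop):
    """S = -sum_n p_n ln p_n, with the convention 0 ln 0 = 0."""
    p = pop[pop > 0] / pop.sum()
    return -np.sum(p * np.log(p))

S_trace = []
for step in range(100_000):
    i, j = rng.integers(0, M, size=2)                                 # pick two balls at random
    pop[box[i]] -= 1; box[i] = (box[i] + 1) % N; pop[box[i]] += 1     # ball i: one box up (cyclic)
    pop[box[j]] -= 1; box[j] = (box[j] - 1) % N; pop[box[j]] += 1     # ball j: one box down (cyclic)
    S_trace.append(entropy(pop))
```

Keeping the population vector `pop` updated incrementally, as suggested above, avoids recounting all M balls at every step.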

Questions.

a) Prepare your initial sample by putting all of the balls randomly in boxes 1...3

b) Now make a large number of random moves (on the order of a million perhaps), and collect the population vectors $a_n(s)$. Plot the population vectors themselves.

c) Calculate the entropy $S(s)$ as a function of the simulation step, and discuss your results.

d) Perform a step average over 1000 simulation steps (starting from your equilibrated sample) and calculate the average values of $p_n$ and $S$. Also monitor the maximum value of S, and indicate the population vector(s) for which it reaches this maximum. (You might repeat the calculation for M=1002. Explain what is happening …)

e) Derive analytically the most likely value of the population vector. Also maximize S under suitable constraints and show that this yields the equal probability distribution. Provide analytical values for $p_n$ and the maximum entropy $S$.
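For part e), the constrained maximization has the following standard structure (a sketch, using a single Lagrange multiplier for the normalization constraint):

```latex
\max_{\{p_n\}} \Big[ -\sum_{n=1}^{N} p_n \ln p_n \Big]
\quad \text{subject to} \quad \sum_{n=1}^{N} p_n = 1
\;\;\Rightarrow\;\;
-\ln p_n - 1 - \lambda = 0
\;\;\Rightarrow\;\;
p_n = \frac{1}{N}, \qquad S_{\max} = \ln N .
```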

Problem 2. Simulations and Entropy. II. Boltzmann distributions.

The setup in this problem is very similar to that of problem 1. Take a sample of M elements (e.g. M=1000), distributed over N boxes (e.g. N=6). Initially randomly distribute the elements (see below). We define populations $a_n$, frequencies $p_n$ and the entropy function $S$ as before. The only difference is the definition of a random move, or simulation step. In this problem we define a move as: randomly pick elements i and j from the sample. Increase the box number $b_i$ of element i by 1, and decrease the box number $b_j$ of element j by 1, if this is possible. If $b_i = N$ or $b_j = 1$, the move is impossible, and you should randomly pick another pair of elements. We define a move such that one has to pick a valid pair i, j. Because of this definition of the move, you will see that the total $\sum_i b_i$ (and hence the average box number $\langle n \rangle$) cannot change by a random move; it is determined by the initial distribution. This reflects conservation of energy, if we associate box n with energy level n. Another (related) difference with the first problem is that we cannot go around and move a ball from box N to box 1 (again, this would violate conservation of energy).
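Continuing the Python sketch from problem 1, the constrained move might look as follows (boxes again labeled 0..N-1; the rejection loop simply redraws until a valid pair is found; all names are illustrative):

```python
import numpy as np

M, N = 1000, 6
rng = np.random.default_rng(1)
box = rng.integers(0, 3, size=M)   # initial sample: all balls in the three lowest boxes

def one_move(box):
    """One valid move: b_i -> b_i + 1 and b_j -> b_j - 1, so sum(box) is conserved."""
    while True:
        i, j = rng.integers(0, M, size=2)
        if i != j and box[i] < N - 1 and box[j] > 0:   # reject pairs that would leave the allowed range
            box[i] += 1
            box[j] -= 1
            return
```

Because every accepted move adds 1 to one entry and subtracts 1 from another, `box.sum()` (the total “energy”) is exactly conserved.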

Questions:

a) Prepare your initial sample by putting all of the balls randomly in boxes 1...3

b) Now make a large number of random moves (on the order of a million perhaps), and collect the population vectors $a_n(s)$, where s indicates the simulation step. Plot the population vectors themselves.

c) Calculate the entropy $S(s)$ as a function of the simulation step, and discuss your results.

d) Perform a step average over 1000 simulation steps (starting from your equilibrated sample) and calculate the average values of $p_n$ and $S$. Also monitor the maximum value of S, and indicate the population vector(s) for which it reaches this maximum.

e) From the averaged population vector $\langle a_n \rangle$, define relative probabilities $p_n = \langle a_n \rangle / M$. You can fit this against a distribution function of the form $p_n = C e^{-\beta n}$. Determine the fit parameter $\beta$. The quality of the fit is a “proof” that indeed we are finding a Boltzmann distribution (hopefully!).
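One simple way to do such a fit is sketched below (Python; `pop_avg` is a placeholder for the step-averaged population vector from part d), and the form $C e^{-\beta n}$ is the one assumed above):

```python
import numpy as np

def fit_beta(pop_avg):
    """Fit averaged populations to p_n ~ C*exp(-beta*n) via a linear fit of ln p_n versus n."""
    p = np.asarray(pop_avg, dtype=float)
    p = p / p.sum()                          # relative probabilities p_n (all boxes assumed occupied)
    n = np.arange(1, len(p) + 1)             # levels n = 1..N
    slope, intercept = np.polyfit(n, np.log(p), 1)
    return -slope                            # beta; C = exp(intercept)
```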

f) By defining $x = e^{-\beta}$, we can solve for $\beta$ from the relation

$\langle n \rangle = \dfrac{\sum_{n=1}^{N} n\, x^{n}}{\sum_{n=1}^{N} x^{n}}$.

Solve for $\beta$ in this fashion and compare the result to e).
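A sketch of a numerical solution of this relation (assuming the form written above; `n_avg` would be the conserved average box number from the simulation, and the root-search bracket is arbitrary):

```python
import numpy as np
from scipy.optimize import brentq

def solve_beta(n_avg, N=6):
    """Solve <n> = sum_n n exp(-beta n) / sum_n exp(-beta n), n = 1..N, for beta."""
    n = np.arange(1, N + 1)
    def mean_n(beta):
        w = np.exp(-beta * n)
        return np.sum(n * w) / np.sum(w)
    # mean_n decreases monotonically from N to 1 as beta runs from -inf to +inf,
    # so a sign change is guaranteed in the bracket for any 1 < n_avg < N.
    return brentq(lambda b: mean_n(b) - n_avg, -50.0, 50.0)
```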

g) We can repeat the experiment as we increase the average $\langle n \rangle$ in the system. For example, apply a small number of moves (e.g. 10) where you randomly choose an element i and increase $b_i$ by 1 (as long as it is smaller than the maximum level N). This will reduce $\beta$, leading to a more equal distribution. Moreover $\langle n \rangle$ and the average value of S will increase. This is related to the heat capacity of the system. You can repeat things, increasing $\langle n \rangle$ repeatedly.
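In the Python sketch above, such a “heating” step could look like this (it reuses `box`, `rng`, `M` and `N` from the earlier snippet; the loop count 10 mirrors the example in the text):

```python
# Inject energy: raise a few randomly chosen balls by one box, with no compensating down-move.
for _ in range(10):
    i = rng.integers(0, M)
    if box[i] < N - 1:      # only allowed below the top level
        box[i] += 1
```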

h) Let us also calculate the Boltzmann distribution analytically for this example. Maximize S under suitable constraints and show that this yields the Boltzmann probability distribution. Provide analytical values for $p_n$ and S as a function of $\beta$. Plot your results for S and $\langle n \rangle$ as a function of $\beta$.
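The constrained maximization for part h) has the familiar two-multiplier structure (a sketch; $\alpha$ enforces normalization and $\beta$ enforces the fixed average level $\langle n \rangle$):

```latex
\mathcal{L} = -\sum_{n=1}^{N} p_n \ln p_n
  - \alpha \Big( \sum_n p_n - 1 \Big)
  - \beta \Big( \sum_n n\, p_n - \langle n \rangle \Big),
\qquad
\frac{\partial \mathcal{L}}{\partial p_n} = -\ln p_n - 1 - \alpha - \beta n = 0
\;\Rightarrow\;
p_n = \frac{e^{-\beta n}}{\sum_{m=1}^{N} e^{-\beta m}},
\qquad
S = \ln\!\Big( \sum_{m=1}^{N} e^{-\beta m} \Big) + \beta \langle n \rangle .
```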

  2. Formal derivation questions.

3)

Follow the lecture notes on “Ensembles in statistical mechanics” and carry out in detail all the steps in the derivation for the grand canonical partition function, in which the energy and the number of particles can fluctuate but the volume is identical for every system in the ensemble. The purpose of this exercise is to make sure you understand each step in the derivation.

4)

Let us do it one more time, now for an ensemble that is not treated explicitly in the notes. Consider an ensemble in which each system has the same energy U and number of particles N, but a variable volume $V$. Derive the partition function for this problem and derive the characteristic thermodynamic function, using the usual relation between the partition function and the characteristic function. The natural variables for this function in statistical mechanics are U, p and N. What are the natural variables commonly used for this thermodynamic function? Can you derive the latter form from the statistical mechanical form?

5) Solve problem 5.4 in Metiu (use Mathcad once again).

6) Use both the sum-over-states and the sum-over-energy-levels expressions for the electronic (internal energy) partition function,

$q = \sum_{j} e^{-\epsilon_j/kT} = \sum_{\ell} g_\ell\, e^{-\epsilon_\ell/kT}$,

and, starting from the probabilities $p_j = e^{-\epsilon_j/kT}/q$, derive formal formulas for the entropy S, the energy U and the constant-volume heat capacity $C_V$, expressed in terms of the probabilities, the energies and, in the case of $C_V$, the energy fluctuation $\langle \epsilon^2 \rangle - \langle \epsilon \rangle^2$; see the last page of the “chapter 6 lecture notes”.
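For orientation, under the standard canonical conventions the results should take roughly the following form (a sketch of the target expressions; it matches the chapter 6 notes only up to notation):

```latex
U = \sum_j p_j\, \epsilon_j, \qquad
S = -k \sum_j p_j \ln p_j, \qquad
C_V = \left( \frac{\partial U}{\partial T} \right)_{V}
    = \frac{\langle \epsilon^2 \rangle - \langle \epsilon \rangle^2}{k T^{2}} .
```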