3.9 Entropy and thermodynamic probability

 

The physical meaning of entropy is of great interest. The greatest credit for elucidating this problem must be given to L. Boltzmann, who found that the magnitude of the entropy of a substance in a given state and the thermodynamic probability of this state are related unambiguously. Let us dwell at some length on this problem, beginning with a brief description of the concepts of mathematical probability and thermodynamic probability of state.

Mathematical probability, or mathematical expectation, is defined as the ratio of the number of favorable events to the number of equally possible events. Let us consider the following example. Assume that there are 20 balls in a box, 10 black and 10 red. Also assume that the probability of drawing any one of the 20 balls from the box is the same. In terms of probability theory this means that in drawing one ball from the box we have at our disposal 20 equally likely events. The question is: what is the mathematical probability that a red ball will be drawn from the box? Since 10 of the 20 balls kept in the box are red, and since drawing any one of the red balls fulfils the initial requirement, it is clear that we are confronted with 10 favorable events. Then, according to the definition of mathematical probability, the probability that a red ball will be drawn from the box is

                        Wr = 10/20 = 1/2

 

It should be noted that the mathematical probability never exceeds unity, since the number of favorable events can never be greater than the total number of equally possible events. If the number of favorable events is equal to the number of equally possible events, the event is said to be certain and is characterized by W = 1.

Let us examine further the example with the box and balls. Assume that 5 of the 10 red balls carry a band. Now the following question arises: what is the mathematical probability of a red ball with a band being drawn from the box?

This rather simple problem can be solved in different ways. It is possible, firstly, to determine the number of favorable events and, thereby, the unknown mathematical probability. The number of equally possible events is evidently equal, as above, to the total number of balls in the box, i.e. to 20, and the number of favorable events is equal to the number of red balls with a band, i.e. to 5. Then the probability of drawing a red ball with a band from the box is

                        Wr.b = 5/20 = 1/4

 

But another method of solution is also possible. First, the probability of drawing a red ball from the box is determined. This problem was already solved:

                        Wr = 10/20 = 1/2

 

Further, we determine the probability that the red ball drawn from the box has a band. Here the number of equally possible events is equal to the number of red balls, i.e. to 10, and the number of favorable events is equal to the number of red balls with a band, i.e. to 5. Consequently, the mathematical probability that the red ball drawn from the box will happen to have a band is

                        Wb = 5/10 = 1/2

 

Multiplying now the mathematical probabilities Wr and Wb, we obtain the unknown mathematical probability Wr.b:

                        Wr.b = WrWb = (1/2)(1/2) = 1/4

 

The example considered corroborates the general law of probability, according to which the mathematical probability of a complex event (in the example studied, the probability of a red ball with a band being drawn from the box) is equal to the product of the mathematical probabilities of the simple events (the probability of drawing a red ball from the box and the probability that the red ball drawn from the box will happen to be one of the red balls with a band) from which the complex event is composed.
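The two routes to Wr.b can be checked with a short script (a sketch; the counts mirror the box-and-balls example above, and the variable names are our own):

```python
from fractions import Fraction

# Box contents from the example: 20 balls, 10 of them red, 5 of the red banded.
total_balls, red_balls, banded_red = 20, 10, 5

# Direct count: favorable events over equally possible events.
w_rb_direct = Fraction(banded_red, total_balls)   # 5/20 = 1/4

# Product of the simple probabilities:
w_r = Fraction(red_balls, total_balls)            # 10/20 = 1/2
w_b = Fraction(banded_red, red_balls)             # 5/10  = 1/2

print(w_rb_direct == w_r * w_b)  # True: both routes give 1/4
```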

The theory of probability permits the solution of many interesting problems. For illustration consider the following problem. Three riflemen A, B, and C fire a shot simultaneously, and two punctures are detected in the target. The marksmanship of the rifleman A is characterized on the average by four hits out of five shots, that of the rifleman B by three hits out of four shots, and that of the rifleman C by two hits out of three shots. Determine the mathematical probability that the riflemen A and B made the hits and the rifleman C missed.

First assume that the result of the shooting is not known. Determine, proceeding from this assumption, the probability of the riflemen B and C making hits and the rifleman A failing to hit. On the basis of the conditions given and all that was said above, it is clear that the mathematical probability of such a complex event must be equal to

                        WA = (1 - 4/5)(3/4)(2/3) = 1/10

 

The mathematical probability that riflemen A and C will hit the mark and that the rifleman B will miss the mark will be

                        WB = (4/5)(1 - 3/4)(2/3) = 2/15

 

And finally, the mathematical probability of the event in which the riflemen A and B will hit the mark and the rifleman C will miss it is

                        WC = (4/5)(3/4)(1 - 2/3) = 1/5

 

Make use now of the initial condition, according to which after the salvo of the three riflemen A, B, and C only two punctures are detected in the target. From this condition it follows that the failure of one of the three riflemen to hit the mark is a certainty and, consequently, the event of all three riflemen hitting the mark is precluded. Denote by W'C the mathematical probability of the riflemen A and B hitting the mark and of the rifleman C missing, on condition that the miss of exactly one of the three riflemen is a certainty. The mathematical probability W'C, which is the aim of the problem, must be as many times greater than WC as unity is greater than the sum (WA + WB + WC), since the failure of one rifleman to hit is now a certainty and the mathematical probability of a certain event is unity. Hence, the following is true:

                        W'C/WC = 1/(WA + WB + WC)

 

whence

                        W'C = WC/(WA + WB + WC) = (1/5)/(1/10 + 2/15 + 1/5) = 6/13 ≈ 0.46

 
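The renormalization step can be verified numerically (a sketch using exact fractions; the variable names are our own):

```python
from fractions import Fraction

# Hit probabilities of the riflemen A, B, and C.
p_a, p_b, p_c = Fraction(4, 5), Fraction(3, 4), Fraction(2, 3)

# Unconditional probabilities of exactly one rifleman missing:
w_a = (1 - p_a) * p_b * p_c    # A misses, B and C hit: 1/10
w_b = p_a * (1 - p_b) * p_c    # B misses, A and C hit: 2/15
w_c = p_a * p_b * (1 - p_c)    # C misses, A and B hit: 1/5

# Two punctures were found, so exactly one miss is certain; renormalize:
w_c_given = w_c / (w_a + w_b + w_c)
print(w_c_given)  # 6/13
```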

Consider now an example directly associated with thermodynamics. Assume that a vessel, mentally divided into two equal parts, accommodates a molecule which is in random motion (Fig. 3.19). What is the mathematical probability that at a given moment of time the molecule will be positioned in the left-hand half of the vessel? Since the left- and right-hand parts of the vessel are of equal volume, the residence of the molecule at any point in the vessel is equally probable, and the mathematical expectation is

                        W1 = 1/2

 

Fig. 3.19

 

If the vessel contained two molecules, instead of one, the mathematical probability of the event in which both molecules will be positioned in the same, for instance, the left-hand half of the vessel would be equal to

                        W2 = (1/2)(1/2) = (1/2)^2 = 1/4

 

By analogy, for three molecules we would have W3 = (1/2)^3, and for four molecules, W4 = (1/2)^4. If there are N molecules in the vessel, the mathematical probability of all the molecules being located in one half of the vessel is WN = (1/2)^N. Since N is always a very large number, the mathematical probability WN will be negligibly small[1] and, consequently, the probability of a spontaneous increase in pressure in one of the two halves of the vessel is very low.
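The vanishing of WN for realistic N can be made vivid with a few lines of code (a sketch; for large N we work with log10 WN to avoid numerical underflow):

```python
import math

# W_N = (1/2)^N: probability that all N molecules sit in one half of the vessel.
for n in (2, 10, 100, 6.0e26):          # 6e26 ~ molecules in a kilomole
    log10_w = -n * math.log10(2)        # log10 of (1/2)^N
    print(f"N = {n:g}: log10(W_N) = {log10_w:.3g}")
```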

The last example is very instructive. As was mentioned in Sec. 3.3, one of the general statements of the second law of thermodynamics runs as follows: spontaneous processes are irreversible. From this statement it follows that processes that proceed “by themselves”, and among them the diffusion of gases, the flow of heat from a body with a higher temperature to a body with a lower temperature at a finite temperature difference, the expansion of a gas without the performance of external work, etc., are irreversible processes. In fact, it is common knowledge that the process of separation of a gas mixture (the process reverse to diffusion) never proceeds “by itself”, i.e. the process never develops without supplementary, compensating processes; it is highly improbable that air confined in a vessel would, due to some event, spontaneously separate into nitrogen and oxygen. Processes of spontaneous heat flow from a body at a lower temperature to a body at a higher temperature or spontaneous compression of a gas also seem improbable.

It was shown above that an irreversible process proceeding in an isolated system results in an increase in the system's entropy. If it proved possible to reverse an irreversible process, the entropy would diminish; but, as is known, this is practically impossible.

The mathematical probability of a spontaneous process proceeding in the direction of decreasing entropy of the isolated system considered is so small that processes of this kind are never realized "by themselves" in practice. We saw this when determining the mathematical probability of the spontaneous compression of a gas.

Thus, it can be assumed that entropy and probability are interrelated. Further on we shall not only be convinced of the validity of this assumption but shall also determine the nature of the relationship.

Before becoming concerned with that problem, however, we should become familiar with the concepts of the macroscopic and microscopic states and of the thermodynamic probability of state.

The macroscopic state of a system, or the macro state, is determined by the thermodynamic properties, or parameters, of the system: pressure, temperature, specific volume, internal energy, etc. Since to determine all properties of a system consisting of a pure substance it is sufficient to know, in principle, any two of the properties, the macroscopic state of a system is fully determined by any two thermodynamic properties, v and u, for instance. Consequently, in speaking about the thermodynamic state of a system, or simply the state of a system, we have had in mind the macro state.

The microscopic state of a system or the microstate is determined by the totality of properties determining the state of each molecule of the system: velocity, position, etc. It would, consequently, be wrong to conceive the microstate as the state of only one molecule. It must be stated once more that the microstate is determined by the totality of properties of all molecules of a system.

It can be easily shown that for one and the same macro state of a system there may be a rather large number of microstates. Consider a simple example. Let there be a system which is a gas confined in a vessel of constant volume. Assume that, as was stated above, the macro state of the system is determined by its volume v and internal energy u. But the constancy of the internal energy of the system does not yet specify the distribution of energy among individual molecules, i.e. a definite microstate. Indeed, for the given macro state there may exist a microstate in which all molecules of the system have equal amounts of energy, their total energy being equal to the internal energy of the system. But for the same macro state there may also exist other microstates. It can be assumed, for instance, that one half of the molecules possesses twice as much energy as the other half; if the molecules are properly intermixed and their total energy is equal, as before, to the internal energy of the system, then this new microstate will correspond to the same macro state.

Thus, proceeding only from the distribution of energy among individual molecules, one and the same macro state can be shown to correspond to an enormous number of microstates. It should be borne in mind that the difference between microstates is not always due to the different distribution of energy among the molecules. The difference between microstates can also be traced to other factors, for instance, to the distribution of molecules in space and to differences in the velocities of the molecules in magnitude and direction. It should also be noted that the invariability of a macro state in no way implies the invariability of a microstate. As a result of the chaotic motion of molecules and the continuous collisions between them, at each moment of time there is a definite distribution of energy among the molecules and, consequently, a definite microstate. And since no microstate has any advantage over another, a continuous change of microstates takes place. In principle, of course, it is possible that a microstate corresponding to a new macro state, different from the preceding one, may set in. For instance, a case is possible (at least in principle) in which molecules of greater energy concentrate in one half of the vessel, and molecules of lower energy in the other half. As a result we would have a new macro state in which one fraction of the gas would be at a higher temperature than the other.

It should not be thought that as a result of the continuous change of microstates a system (for instance, a gas in a vessel) must necessarily undergo a change of macro states. One macro state usually has such a large number of microstates realizing exactly this macro state that the system practically never leaves it. It would, therefore, seem to an outside observer, having an opportunity to observe only the changes of thermodynamic properties, that the state of the system does not change.

Let us now turn to the concept of the thermodynamic probability of the state of a system. The term thermodynamic probability, or statistical weight, of a macro state is the name given to the number of microstates corresponding to a given macro state.

As distinguished from mathematical probability, which is always expressed by a proper fraction, the thermodynamic probability is expressed by a whole, usually very large, number.

If an isolated system undergoes a spontaneous process resulting in a change of the macro state of the system, this indicates that the new macro state has a greater number of corresponding microstates than the preceding macro state. It is, therefore, clear that a spontaneous process results in an increase in the thermodynamic probability of a system. It was from this point of view that Boltzmann gave his statement of the second law of thermodynamics:

Nature tends from less probable states to more probable states.

The increase of the entropy of an isolated system undergoing a spontaneous irreversible process, and the simultaneous increase of the thermodynamic probability of the state of the system, give grounds for believing that entropy and thermodynamic probability are interrelated:

                        S = f(W)

 

To determine the relation between entropy and thermodynamic probability, proceed as follows.

Let there be two systems with entropies S1 and S2 and thermodynamic probabilities W1 and W2, respectively. Let the two systems form a new system with entropy S and thermodynamic probability W. Entropy, just as all caloric properties, possesses the property of additivity, whence it follows that

 

                        S = S1 + S2                                    (3.161)

 

But the thermodynamic probability of the total system is

 

                        W = W1W2                                       (3.162)

 

Equation (3.162) follows from the fact that each microstate of one of the component systems forms together with any microstate of the other system a microstate of the total system. The number of possible microstates of the total system, forming one and the same macro state of that system, i.e. its thermodynamic probability, will be equal to the number of all possible combinations, i.e. products of the thermodynamic probabilities of the two systems.
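The multiplication rule (3.162) can be illustrated by explicit enumeration (toy microstate counts of our own choosing):

```python
from itertools import product

# Toy example: system 1 has W1 = 3 microstates, system 2 has W2 = 4.
micro_1 = ["a1", "a2", "a3"]
micro_2 = ["b1", "b2", "b3", "b4"]

# Every microstate of system 1 pairs with every microstate of system 2,
# so the combined system has W = W1 * W2 microstates.
combined = list(product(micro_1, micro_2))
print(len(combined))  # 12 = 3 * 4
```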

Since one and the same functional dependence relates the entropy of each system with its thermodynamic probability: S1 = φ (W1), S2 = φ (W2), S = φ (W), the following equation is true:

 

                        φ(W1) + φ(W2) = φ(W1W2)                        (3.163)

 

Differentiating Eq. (3.163) with respect to W1, we obtain:

                        φ'(W1) = W2φ'(W1W2)

 

and differentiating with respect to W2, we get:

                        φ'(W1W2) + W1W2φ''(W1W2) = 0

 

or, taking into account that W1W2 = W,

                        φ'(W) + Wφ''(W) = 0

 

We have derived a differential equation of the second order. The solution presents no great difficulties. It is known that

                        d(ln φ')/dW = φ''/φ'

 

hence, this differential equation can be presented in the form

                        d(ln φ')/dW = -1/W,    or    d(ln φ') = -dW/W

 

Integrating this expression, we obtain:

                        ln φ' = -ln W + ln k

 

Taking antilogarithms, we get:

                        φ' = k/W

 

But

                        φ' = dφ/dW

 

and, consequently, the above relation reduces to

                        dφ/dW = k/W

 

or

                        dφ = k dW/W

 

Upon integration we obtain:

                        φ(W) = k ln W + k1

 

Since φ (W) = S, we can write

 

                        S = k ln W + k1                                (3.164)

 

One of the constants in the above equation, k1, can be determined immediately. Taking into account that S = S1 + S2 and W = W1W2, we can write

                        k ln W1W2 + k1 = (k ln W1 + k1) + (k ln W2 + k1)

 

from which it follows that k1 = 0. Thus we finally get:

 

                        S = k ln W                                     (3.165)

 

To determine the constant k, intricate calculations based on quantum statistics are required, which cannot be reproduced in this book. It will only be mentioned that when the constant k was calculated it turned out to be the already familiar Boltzmann constant, i.e. the gas constant referred to one molecule.
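With k known, Eq. (3.165) can be checked for consistency with the additivity requirement S = S1 + S2 (a numerical sketch; the probability values are arbitrary toy numbers):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def entropy(w):
    """S = k ln W, Eq. (3.165)."""
    return k * math.log(w)

w1, w2 = 1.0e10, 1.0e12          # toy thermodynamic probabilities
s_combined = entropy(w1 * w2)    # entropy of the combined system, W = W1*W2
print(math.isclose(s_combined, entropy(w1) + entropy(w2)))  # True
```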

Let us again consider a vessel (Fig. 3.19) and imagine it to be separated into two parts and filled with a gas possessing the properties of an ideal, or perfect, gas. As regards the distribution of the molecules of this gas between the two halves of the vessel, it can be stated[2] that with the greatest degree of probability we can expect that the number of molecules in the left-hand part of the vessel is equal to the number of molecules in the right-hand part of the vessel. Actually, however, the event in which the number of molecules in the two parts of the vessel is the same will be rather rarely realized. The number of molecules in the left-hand part of the vessel, for instance, will change continuously due to the chaotic (random) motion of the molecules, and it will be equal to one half of the total number of molecules in the vessel only on the average.

In considering this example it ought to be borne in mind that the number of molecules in the left-hand part of the vessel will always be close to one half of the total number of molecules. That is why the probability of the actual distribution of the molecules between the two parts of the vessel, though not equal to the maximum probability corresponding to the uniform distribution, will always be close to it.

The theory permits the deviation from the most probable state to be determined. The value of the deviation happens to obey the √N law, which when applied to the event considered means the following.

Let there be but 200 molecules in the vessel. One half of it must contain 100 molecules on the average. The deviation from this average number of molecules will be

                        ΔN = ±√100 = ±10 molecules

 

This means that, as a rule, the deviation does not exceed 10% of the average value. The great percentage deviation is due to the fact that the vessel was assumed to contain a very small number of molecules.

If we assume that the vessel contains not 200 but 10 000 molecules, ΔN will be equal to ±100 molecules, or 1%. But if there were 2 × 10^20 molecules in the vessel (this is still much less than the number of molecules contained in one mole), the deviation would not exceed 10^10 molecules, amounting to only 10^-8 %. The deviation from the uniform distribution would then be practically imperceptible.
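The square-root character of the fluctuations is easy to observe in a simulation (a sketch; the molecule and trial counts are our own choice, and the observed spread is of the order of √N rather than exactly √N):

```python
import math
import random

random.seed(1)

def left_half_count(n_molecules):
    """Place each molecule independently; left half with probability 1/2."""
    return sum(random.random() < 0.5 for _ in range(n_molecules))

n = 200
counts = [left_half_count(n) for _ in range(2000)]
mean = sum(counts) / len(counts)
spread = math.sqrt(sum((c - mean) ** 2 for c in counts) / len(counts))
# mean comes out close to 100; the spread is of the order of sqrt(100).
print(round(mean), round(spread))
```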

Assume that we have a gas which is absolutely isolated from its surroundings. The gas can then be considered as an absolutely isolated system. This closed system must eventually come into a state of equilibrium, with the entropy of the isolated system reaching a maximum value.

However, even in the case considered, there may be deviations from the state of equilibrium, the so-called fluctuations. As follows from Eq. (3.165), the maximum value of entropy corresponds to the maximum value of thermodynamic probability. It is, therefore, obvious that all states realized due to fluctuations must differ from the equilibrium state both in a smaller value of entropy and in a smaller value of thermodynamic probability. Each such state will be realized the sooner, the closer its thermodynamic probability is to the thermodynamic probability of the equilibrium state.

The relative magnitude of possible fluctuations is the greater, the smaller the size of the system considered. This can be substantiated convincingly by the following calculations. Consider a small cube of 1 cm^3 in the surrounding atmosphere. What time will elapse before a spontaneous change in air density of only 1 per cent due to fluctuations takes place in this space? Calculations show that an observer would probably notice this phenomenon only if the survey were conducted over an enormously long period of time, about 10^140 years. But if the volume of the imagined cube of air is decreased many times, to 0.008 of a cubic micron, the density of the air confined in this volume will change by 1 per cent very often, about 10^9 times per second.

Is it practically possible to observe the change in thermodynamic properties due to fluctuations?

From all that was said above a negative answer suggests itself, the validity of which can be proved by simple calculations. Denote by Sm and Wm respectively the entropy and the thermodynamic probability of an equilibrium state, and by S and W the entropy and the thermodynamic probability of the state reached due to fluctuations. It can, evidently, be written that

                        Sm - S = k ln (Wm/W)

 

whence it follows that

                        Wm/W = e^((Sm - S)/k)

 

Assume that it is just possible to measure a difference between system entropies equal to 10^-3 J/K.[3] Knowing that k = 1.38 × 10^-23 J/K, we obtain:

                        Wm/W = e^(10^-3/(1.38 × 10^-23)) ≈ e^(7 × 10^19)

 

A deviation from the equilibrium state with so large a ratio of probabilities is hardly probable, so that practical observation of it is impossible. This is all the more true because, even if the system did reach such a state, the residence time in this state would be so short that no instrument could record it.
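The size of the exponent can be evaluated directly (a sketch reproducing the estimate above):

```python
import math

k = 1.38e-23      # Boltzmann constant, J/K
delta_s = 1.0e-3  # smallest entropy difference assumed detectable, J/K

exponent = delta_s / k                 # (Sm - S)/k, of the order of 7e19
print(f"Wm/W = e^{exponent:.2e}"
      f" ~ 10^{exponent * math.log10(math.e):.2e}")
```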

Thus, a system in thermodynamic equilibrium continuously undergoes fluctuations from its equilibrium state. The magnitude of these fluctuations is so negligible that their practical observation is impossible. As regards harnessing fluctuations for engineering purposes, this too proves impossible, not only from the practical but also from the theoretical point of view.

The statistical interpretation of the nature of entropy and of the second law of thermodynamics marked a new step in the explanation of the physical meaning of natural phenomena. Based on the statistical explanation of the second law of thermodynamics, Boltzmann showed that, in principle, no system can rest in a state of complete equilibrium, since the system always fluctuates.

The second law of thermodynamics is applicable only to macroscopic systems, or systems consisting of a very great number of particles. But the principle of increasing entropy in irreversible processes is only true for isolated macroscopic systems. The extension of the principle of increasing entropy to cover systems other than isolated macro systems is in no way justified.

 

 

 



[1] It will be recalled that one kilomole of gas contains 6 × 10^26 molecules (Avogadro's number NA). Hence, the mathematical probability that these molecules will concentrate in one half of the confining volume is negligibly small:

                        WN = (1/2)^(6 × 10^26)

 

[2] It is assumed, of course, that the number of molecules is very great.

[3] Actually, it is impossible to determine so small a difference between entropies; the possible error is greater than the value to be determined.