The most probable state of a thermodynamic system. Entropy and thermodynamic probability

Intensity and Extensity Factors

In the course of a thermodynamic process, not only work but also other forms of energy can be represented as a product of two quantities: an intensity factor ("generalized force") and an extensity, or capacity, factor ("generalized coordinate").

As such factors one usually takes differences in the values of system parameters, which in turn are divided into extensive parameters (whose values depend on the amount of substance, e.g., volume and mass) and intensive parameters (whose values do not depend on the amount of substance, e.g., temperature, pressure, density, concentration).

The driving force of a process is the intensity factor, i.e., the difference in the values of some intensive parameter in different parts of the system (temperature difference, pressure difference, concentration difference, etc.). A spontaneous process can proceed only in the direction that evens out the intensive parameter.


The state of a thermodynamic system, as mentioned earlier, is characterized by definite values of density, pressure, temperature, and other quantities. These quantities determine the state of the system as a whole, that is, its macrostate. However, at the same density, temperature, etc., the particles that make up the system may occupy different places in its volume and have different values of energy or momentum. Each state of the system with a definite distribution of its particles over the possible classical or quantum states is called a microstate. The number of microstates that realize a given macrostate, in other words, the number of ways in which a given state of the system can be realized, is called the thermodynamic probability W. It follows from the definition that W ≥ 1. The thermodynamic probability can equal one only in a single case: when the temperature of the system is absolute zero and there is no thermal motion in it. Under ordinary conditions, in the systems one deals with in practice, which consist of a very large number of molecules and other particles, W is much greater than one.

For ideal gases the value of W can be calculated quite easily by the methods of statistical thermodynamics, but for liquids and solids such a calculation is much more complicated.

Spontaneous processes in a system proceed in the direction of increasing its thermodynamic probability. The value of W can therefore be regarded as one of the criteria for whether a given process can occur. However, even when the values of W can be calculated with sufficient accuracy, using them in practical calculations is difficult because of the truly "astronomical" numbers by which they are expressed.

STATISTICAL WEIGHT

The concept of "statistical weight" (the term "thermodynamic probability" is also used) is one of the central ones in statistical physics. To define it, we must first define the concepts of macrostate and microstate.

The same state of a macroscopic body can be characterized in different ways. If the state is characterized by specifying macroscopic state parameters (pressure, volume, temperature, density, etc.), then we call such a state a macrostate.

If the state is characterized by specifying the coordinates and velocities of all the molecules of the body, then we call such a state a microstate.

Obviously, the same macrostate can be realized in different ways, that is, by different microstates. The number of different microstates by which a given macrostate can be realized is called the statistical weight, or thermodynamic probability.

To clarify these concepts, consider a model(!): a vessel containing N molecules. Suppose the vessel is divided into two identical halves, and that different macrostates differ in the number of molecules in the left and right halves. Within the model, therefore, the state of a molecule is considered given if we know which half of the vessel it is in.

Different microstates differ in which particular molecules are on the right and on the left. Molecules 1, 2 on the left and 3, 4 on the right (as shown in figure 9.5) is one microstate; molecules 1, 3 on the left and 2, 4 on the right is another.

Each molecule can be found with equal probability either on the left or on the right. Therefore the probability for the i-th molecule to be, say, on the right equals 1/2. The appearance of one molecule in the left half of the vessel is statistically independent of the appearance of another, so the probability of finding two given molecules on the left is (1/2)·(1/2) = 1/4; three molecules, 1/8; four, 1/16, and so on. Therefore the probability of any particular arrangement (microstate) of the molecules equals (1/2)^N.

The assertion that the probabilities of all microstates are equal is called the ergodic hypothesis, and it underlies statistical physics.

Consider N = 4. Each arrangement of the molecules between the halves of the vessel is a specific microstate. The macrostates can then be summarized as follows (n is the number of molecules on the left):

n = 0: 1 microstate, statistical weight 1, probability 1/16

n = 1: 4 microstates, statistical weight 4, probability 4/16

n = 2: 6 microstates, statistical weight 6, probability 6/16

n = 3: 4 microstates, statistical weight 4, probability 4/16

n = 4: 1 microstate, statistical weight 1, probability 1/16

It is now evident that, once the ergodic hypothesis is adopted, the statistical weight is proportional to the probability of realization of the given macrostate.

If the vessel contains N molecules, it can be shown that the statistical weight of the macrostate in which n molecules are on the left and (N - n) on the right is

W = N! / (n! (N - n)!). (9.25)
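The statistical weight (9.25) is just the binomial coefficient, so within the two-halves model it can be checked in a few lines of Python; this sketch reproduces the N = 4 table given above:

```python
from math import comb

N = 4  # number of molecules in the vessel, as in the text

# Statistical weight of the macrostate "n molecules on the left",
# eq. (9.25): W = N! / (n! * (N - n)!)
total = 2 ** N  # total number of microstates, each with probability (1/2)^N
for n in range(N + 1):
    W = comb(N, n)
    print(f"n = {n}: W = {W}, probability = {W}/{total}")
```

The weights sum to 2^N, as they must, since every microstate belongs to exactly one macrostate.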

If for four molecules the probability of gathering in one half of the vessel is 1/16, that is, a quite tangible value, then for N = 24 this probability is (1/2)^24, about 6·10^-8.

Under normal conditions, 4 cm³ of air contains about 10^20 molecules. The probability of all of them gathering in one half of the vessel is of the order of 10^(-3·10^19).
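Such probabilities are too small to represent directly as floating-point numbers, but their decimal exponents are easy to compute; a small sketch of the estimate:

```python
from math import log10

# P(all N molecules in one half) = (1/2)^N.  For huge N the number itself
# underflows a float, so we compute its decimal exponent:
# log10(P) = -N * log10(2)
for N in (4, 24, 10 ** 20):
    print(f"N = {N}: log10 P = {-N * log10(2):.4g}")
```

For N = 10^20 the exponent is about -3·10^19, matching the estimate in the text.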

Thus, as the number of molecules in the system increases, the probability of significant deviations from equal numbers of molecules in the two parts of the vessel decreases very quickly. Correspondingly, the statistical weight of states with approximately equal numbers of molecules in the two halves is very large and falls off rapidly as the deviation from equality grows.

If the number N is not very large, then noticeable deviations of the number of molecules in one half from N/2 are observed over time. Random deviations of a physical quantity x from its mean value are called fluctuations:

Δx = x − ⟨x⟩. (9.26)

The arithmetic mean of the fluctuation itself equals zero, ⟨Δx⟩ = 0. Therefore, as a characteristic of fluctuations one usually considers the root-mean-square fluctuation √⟨(Δx)²⟩.

More convenient and more indicative is the relative fluctuation √⟨(Δx)²⟩ / ⟨x⟩.

Moreover, in statistical physics the following relation is proved:

√⟨(Δx)²⟩ / ⟨x⟩ ~ 1/√N, (9.28)

i.e., the magnitude of the relative fluctuation is inversely proportional to the square root of the number of particles in the system. This statement confirms our qualitative conclusion.
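The 1/√N scaling of (9.28) can be seen directly by simulation; a minimal sketch, using the same two-halves model (each molecule independently left or right with probability 1/2):

```python
import random

random.seed(1)

def relative_fluctuation(N, trials=2000):
    """Estimate the relative fluctuation of the number of molecules in the
    left half of the vessel for N molecules, each left/right with p = 1/2."""
    counts = [sum(random.random() < 0.5 for _ in range(N)) for _ in range(trials)]
    mean = sum(counts) / trials
    var = sum((c - mean) ** 2 for c in counts) / trials
    return var ** 0.5 / mean

# Theory, eq. (9.28): the relative fluctuation falls off as 1/sqrt(N)
for N in (16, 64, 256):
    print(f"N = {N}: simulated {relative_fluctuation(N):.3f}, theory {N ** -0.5:.3f}")
```

Quadrupling N roughly halves the relative fluctuation, as (9.28) predicts.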

Like the number of molecules in one half of the vessel, other macroscopic characteristics of the state (pressure, density, etc.) fluctuate near their mean values.

Let us consider the nature of equilibrium and non-equilibrium states and processes from the point of view of statistical physics. An equilibrium state, by definition, is a state that shows no tendency to change over time. Clearly this property belongs most of all to the most probable of all the macrostates of the system, that is, the state realized by the largest number of microstates and therefore having the greatest statistical weight. The equilibrium state can therefore be defined as the state whose statistical weight is maximal.

An example of a typical irreversible process is the spreading of gas molecules, initially concentrated in one half of a vessel, over its entire volume. The process is irreversible because the probability that, as a result of thermal motion, all the molecules will gather in one half of the vessel is vanishingly small. In general, a process is irreversible whenever its reverse is extremely improbable.


LECTURE № 10 STATISTICAL PHYSICS AND THERMODYNAMICS

10.1. ENTROPY

As we have established, the probability of a state of the system is proportional to its statistical weight, so the statistical weight W itself could be used to characterize the probability of the state. However, W is not an additive quantity. Therefore, to characterize the state of the system one uses the quantity

S = k ln W,

called the entropy of the system. Indeed, if we consider two systems with 4 molecules each, then the statistical weight of the state in which each subsystem has, for example, one molecule on the left equals 16, i.e. W = W1 · W2 = 4 · 4 = 16. This relation holds for any states; consequently, the statistical weight is multiplicative rather than additive. At the same time the entropy of the resulting system is S = k ln(W1 W2) = k ln W1 + k ln W2 = S1 + S2, i.e. entropy is an additive quantity.
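This bookkeeping (weights multiply, entropies add) can be checked numerically; a minimal Python sketch using the two 4-molecule subsystems from the text:

```python
from math import comb, log

k = 1.380649e-23  # Boltzmann constant, J/K

# Two independent subsystems of 4 molecules, each with one molecule on the left:
W1 = comb(4, 1)        # statistical weight of each subsystem: 4
W = W1 * W1            # weights of independent subsystems multiply: 4 * 4 = 16

S1 = k * log(W1)       # entropy of one subsystem
S = k * log(W)         # entropy of the combined system

print(W, S, 2 * S1)    # the logarithm turns the product into a sum
```

The logarithm is the only monotonic function that converts a multiplicative quantity into an additive one, which is why entropy is defined through ln W.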



Since when irreversible processes occur in an isolated system, it passes from less probable to more probable states, it can be argued that the entropy of an isolated system increases when irreversible processes occur in it .

The equilibrium state is the most probable state, which means that the entropy of a system that has passed into an equilibrium state is maximum.

Therefore, it can be argued that the entropy of an isolated system remains constant if it is in equilibrium, or increases if irreversible processes take place in it.

The assertion that the entropy of an isolated system does not decrease is called the second law of thermodynamics, or the law of increasing entropy.

Entropy is obviously a state function and should be determined by the state parameters. A monatomic ideal gas has the simplest properties: its states are completely determined by two parameters, for example temperature and volume. Accordingly, its entropy can be written as a function of temperature and volume, S = S(T, V). The corresponding calculations show that the entropy of one mole of an ideal gas is given by

S = C_V ln T + R ln V + const, (2)

where const is the constant up to which the entropy is determined.
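A minimal sketch, assuming the monatomic value C_V = (3/2)R, shows how an entropy difference can be computed from this formula (the undetermined constant cancels in the difference):

```python
from math import log

R = 8.314          # gas constant, J/(mol*K)
C_V = 1.5 * R      # molar heat capacity of a monatomic ideal gas

def delta_S(T1, V1, T2, V2):
    """Entropy change of one mole between states (T1, V1) and (T2, V2);
    the unknown constant in S = C_V ln T + R ln V + const cancels."""
    return C_V * log(T2 / T1) + R * log(V2 / V1)

# Isothermal doubling of the volume gives R*ln(2) per mole:
print(delta_S(300.0, 1.0, 300.0, 2.0))
```

Only ratios of T and V enter, so the units of volume drop out as long as they are used consistently.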

Now we can see how the entropy of a non-isolated system changes, for example when a certain amount of heat is imparted to it. Take the differential of (2) and multiply it by T:

T dS = C_V dT + RT dV / V. (3)

But C_V dT = dU is the increment of the internal energy of the gas, and by the equation of state of one mole of an ideal gas RT / V = p. Then (3) is converted to the form

T dS = dU + p dV = δQ, i.e. dS = δQ / T. (4)

The quantities entering (4) are additive, and therefore (4) is valid for any mass of gas.

Thermodynamic Probability

S = k ln W

is the Boltzmann formula, where:

S is the entropy, a measure of the disorder of the system;

k is Boltzmann's constant;

W is the thermodynamic probability of the macrostate of the system, i.e., the number of microstates of the given system by which the given macrostate (P, T, V) can be realized.

If W = 1, then S = 0; at the temperature of absolute zero (-273 °C) all forms of motion cease.

Thermodynamic Probability is the number of ways in which atoms and molecules can be distributed in a volume.

Carnot cycle

The Carnot cycle is a circular thermal process in which a certain amount of heat is transferred in a thermodynamically reversible way from a hot body to a cold one. The process must be carried out so that the bodies between which energy is exchanged directly remain at constant temperature; that is, both the hot and the cold body are regarded as thermal reservoirs so large that the temperature of the first does not change perceptibly when the given amount of heat is withdrawn, nor that of the second when it is added. This requires a "working fluid". The working fluid in this cycle is 1 mole of an ideal gas. All the processes making up the Carnot cycle are reversible. Let us consider them. Figure 9 shows:

AB: isothermal expansion of the gas from V1 to V2 at temperature T1; the amount of heat Q1 is absorbed;

BC: adiabatic expansion from V2 to V3; the temperature drops from T1 to T2;

CD: isothermal compression from V3 to V4, carried out at temperature T2; the amount of heat Q2 is given off;

DA: adiabatic compression from V4 to V1; the temperature rises from T2 to T1.

Let us analyze the cycle in detail. The process requires a "working fluid", which first, at the higher temperature T1, is brought into contact with the hot body and isothermally receives from it the amount of heat Q1. It then expands adiabatically, cooling to the temperature T2, gives off heat at this temperature to the cold body of temperature T2, and finally returns adiabatically to its initial state. In the Carnot cycle ΔU = 0. Over the cycle the working fluid received the amount of heat Q1 − Q2 and did the work A, equal to the area of the cycle. Thus, by the first law of thermodynamics, Q1 − Q2 = A, and for the efficiency we get η = A/Q1 = (Q1 − Q2)/Q1 = (T1 − T2)/T1.
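The energy bookkeeping of the cycle can be sketched numerically (the reservoir temperatures and Q1 below are hypothetical values chosen for illustration):

```python
def carnot_efficiency(T1, T2):
    """Efficiency of a reversible Carnot cycle between reservoirs at
    T1 (hot) and T2 (cold), both in kelvin: eta = 1 - T2/T1."""
    return 1.0 - T2 / T1

T1, T2 = 600.0, 300.0   # hypothetical reservoir temperatures, K
Q1 = 1000.0             # heat taken from the hot reservoir per cycle, J

eta = carnot_efficiency(T1, T2)
A = eta * Q1            # work per cycle, A = Q1 - Q2 (first law)
Q2 = Q1 - A             # heat given off to the cold reservoir

print(eta, A, Q2)  # 0.5 500.0 500.0
```

Halving the absolute temperature between the reservoirs means only half of Q1 can be converted into work, however the cycle is engineered.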

Definition 1

Thermodynamic probability is the number of ways in which a given state of a macroscopic physical system can be realized.

Figure 1. Entropy and probability.

In thermodynamics, the state of a system is characterized by specific values of density, temperature, pressure, and other measurable quantities. These parameters determine the state of the system as a whole, but at the same density the particles can be located at different places in its volume and have quite different values of momentum or energy.

Definition 2

Each state of a thermodynamic system with a definite distribution of its particles over the possible quantum or classical states is called a microstate in physics.

The thermodynamic probability equals the number of microstates that realize the given macrostate. It is not a probability in the mathematical sense; it is used in statistical physics to determine the properties of a system in thermodynamic equilibrium.

For an accurate calculation of the thermodynamic probability it matters whether identical elements of the system are considered distinguishable or indistinguishable. Quantum and classical mechanics therefore lead to quite different expressions for the thermodynamic probability.

Features of probability in thermodynamics

Figure 2. Thermodynamic probability.

Remark 1

The main advantage of thermodynamics is that it allows one to consider the general properties of systems at equilibrium and the general laws governing them, and to obtain important information about a substance without full knowledge of its internal structure.

Its laws and methods are applicable to any material body and to any system that includes magnetic and electric fields, so they have become foundational in such areas as:

  • gases and condensed media;
  • chemistry and technology;
  • the physics of the Universe and geophysics;
  • biology and the control of physical processes.

Boltzmann considered the atomistic theory well founded. An enormous number of particles makes a purely mechanical description impossible and demands a statistical one. The mathematical tool of such statistics is the calculus of probabilities. Boltzmann argued that since thermodynamic processes rest on reversible kinetic processes, the irreversibility measured by thermodynamic entropy cannot be absolute. Entropy should therefore be directly related to the probability of realization of a given macrostate.

Boltzmann used the concept of probability, implicitly applied by Maxwell, to overcome difficulties connected with understanding the second law of thermodynamics and the theory of the "heat death of the Universe". The pinnacle of Boltzmann's scientific work was establishing the relation between thermodynamic probability and entropy. Planck made this connection explicit by introducing the constant k = R/N_A (R is the gas constant, N_A Avogadro's number), which is called the Boltzmann constant.

Thus, an irreversible physical process is a transition from a less probable state to a more probable one, and the logarithm of the thermodynamic probability of a state, up to a constant factor, coincides with its entropy. Boltzmann demonstrated this for an ideal gas.

The greater the disorder in the velocities and coordinates of the particles of the system, the greater the probability that the system will be in a disordered state. The Boltzmann formula can be regarded as the basic definition of entropy.

Probability calculation in systems

If the system is very large and its initial state is not too close to equilibrium, then transitions to noticeably less probable states are practically impossible and have no significance in practice. The law of increasing entropy is then confirmed experimentally with virtual certainty.

Let us calculate the probability of such processes. Let there be only one molecule in a vessel divided into parts 1 and 2. In the absence of external force fields, the molecule ends up with equal probability in either part, so the probabilities are

p1 = p2 = 1/2.

After a second molecule enters the vessel, the positions of the molecules are independent events, since the molecules of an ideal gas do not interact with one another. If the distribution of the molecules in the vessel is photographed repeatedly at equal time intervals for a long time, then on average about one frame in every 1000 will show all the molecules only in part 1 of the vessel. The same holds for part 2.

By the rule of addition of probabilities, we get on average 2 frames per thousand with all the molecules concentrated in one part of the vessel. This is not only possible in principle but actually accessible to ordinary observation. However, if the number of molecules is of the order of Avogadro's number, the corresponding probability is so small that such fluctuations and the states corresponding to them can be completely disregarded.

The difference between thermodynamic and mathematical probability

Two main kinds of probability are distinguished in thermodynamics:

  • mathematical;
  • thermodynamic.

Thermodynamic probability is the number of microstates through which the required macrostate of a system can be realized. To find the thermodynamic probability of a state, one should count the number of combinations that realize a given spatial distribution of the particles.

The mathematical probability of a state equals the ratio of the thermodynamic probability to the total number of possible microstates. Mathematical probability is always less than one, while thermodynamic probability is expressed by large numbers. Mathematical probability is not additive and is related not to the thermal properties of the system but to the mechanical ones, for example to the motion of the molecules and their velocities.
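The relationship between the two can be illustrated with the two-halves model from earlier (a hypothetical N = 10 molecules is used here purely for illustration):

```python
from math import comb

N = 10  # molecules distributed between the two halves of a vessel

for n in (0, 3, 5):
    W = comb(N, n)        # thermodynamic probability: a (possibly large) integer >= 1
    P = W / 2 ** N        # mathematical probability: always <= 1
    print(f"n = {n}: W = {W}, P = {P:.4f}")
```

Dividing the statistical weight by the total number of microstates, 2^N, converts the thermodynamic probability into the mathematical one.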

One and the same macrostate can correspond to very many microstates. According to L. Boltzmann, the more microstates by which a given macrostate can be realized, the more probable it is in practice. The thermodynamic probability of a state of a system is the number of microstates that realize the macrostate.

When using these methods, one must bear in mind that conclusions based on them indicate only the most probable behavior, i.e., only the possibility or impossibility of a given physical process. Under real conditions small deviations from these conclusions are not excluded, and the phenomena that occur may in individual cases differ from those expected on the basis of general thermodynamic considerations.



The thermodynamic probability of a state W and the entropy of an isolated system S are different measures of the system's tendency toward equilibrium. Both quantities increase during irreversible processes that bring the system closer to equilibrium, and both reach a maximum in the equilibrium state of the system. There is a quantitative relationship between the values of W and S. The general form of this relationship is easy to establish if we take into account the additivity of entropy, which is the sum of the entropies of the individual parts of an equilibrium system, and the multiplicativity of the probability of a complex event, which is the product of the probabilities of the individual independent events.


The thermodynamic probability of a state is the number of microstates of the system corresponding to a given macrostate. For a chemically homogeneous system, the value of W shows in how many ways a given quantitative distribution of particles over the cells of phase space can be realized, regardless of which cell any particular particle occupies.

The thermodynamic probability of a state of a system is the number of microstates through which the given state can be realized. Applying probability theory, whose laws combined with the laws of mechanics form statistical mechanics, one can, on the one hand, determine the relationship between thermodynamic probability and entropy and, on the other, determine the thermodynamic probability of a state.

Let us determine the thermodynamic probability W of the state of a system of 3N oscillators (N atoms) that have received a total of n energy quanta. These n quanta can be distributed among the 3N degrees of freedom in different ways.
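The count in question is the standard stars-and-bars result (not written out in the text): n indistinguishable quanta can be distributed among M oscillators in C(n + M − 1, n) ways. A quick check under that assumption:

```python
from math import comb

def W_quanta(n_quanta, n_oscillators):
    """Number of ways to distribute n indistinguishable quanta among
    M oscillators (stars and bars): C(n + M - 1, n)."""
    return comb(n_quanta + n_oscillators - 1, n_quanta)

# 3 quanta among the 3N = 6 oscillators of N = 2 atoms:
print(W_quanta(3, 6))  # C(8, 3) = 56
```

With zero quanta there is exactly one way (all oscillators in the ground state), a useful sanity check on the formula.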

By the thermodynamic probability of a state is meant the numerator of the fraction expressing the probability of that state in the usual sense.

A quantitative measure of the thermodynamic probability of a state W is the number of different microstates by which one can realize a macrostate characterized by given thermodynamic parameters.

What is called the thermodynamic probability of a state, and how is it related to entropy?

The initial concept is the thermodynamic probability of the state of the system W .

Let us now consider the connection between the thermodynamic probability of the state of the system and entropy.

Boltzmann; W is the thermodynamic probability of a state, determined by the number of microstates realizing the given macrostate. Relation (3.49) expresses the Boltzmann principle. The one-sided nature of the change of entropy in a closed system is determined by the transition of the system from a less probable state to a more probable one.


The entropy S is related to the thermodynamic probability of the state W by the well-known relation S = k ln W, where k is Boltzmann's constant.

The statistical weight Ω, or thermodynamic probability of a state of a thermodynamic system, is the number of microstates by which a given macrostate is realized.