How to read the second law of thermodynamics. Entropy

23.09.2019

§6 Entropy

Typically, a process in which a system passes from one state to another cannot be carried out in the opposite direction in such a way that the system passes through the same intermediate states without any changes occurring in the surrounding bodies. This is because part of the energy is dissipated in the process, for example through friction, radiation, and so on. Thus almost all processes in nature are irreversible: in any process some energy is lost. To characterize this dissipation of energy, the concept of entropy is introduced. (The entropy characterizes the thermal state of the system and determines the probability that a given state of the body is realized: the more probable a given state, the greater the entropy.) All natural processes are accompanied by an increase in entropy. Entropy remains constant only in the case of an idealized reversible process occurring in a closed system, that is, in a system that does not exchange energy with bodies external to it.

Entropy and its thermodynamic meaning:

Entropy is a function of the state of the system whose infinitesimal change in a reversible process equals the ratio of the infinitesimal amount of heat supplied in that process to the temperature at which it was supplied.

In a finite reversible process the change in entropy can be calculated from the formula

ΔS = S2 − S1 = ∫₁² δQ/T,

where the integral is taken from the initial state 1 of the system to the final state 2.

Since entropy is a function of state, the integral does not depend on the shape of the path along which it is calculated; it is determined only by the initial and final states of the system.
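
As a minimal numerical sketch (our own illustration, not part of the source text), the entropy change for reversibly heating a hypothetical 1 kg sample of water can be evaluated both in closed form and by direct summation of δQ/T; the mass and specific heat below are assumed values.

```python
# Entropy change of an assumed 1 kg sample of water heated reversibly from
# 293 K to 353 K, with a constant specific heat c ~= 4186 J/(kg*K).
import math

m = 1.0                 # kg, assumed sample mass
c = 4186.0              # J/(kg*K), approximate specific heat of water
T1, T2 = 293.0, 353.0   # K

# Closed form: dS = dQ/T = m*c*dT/T  =>  delta_S = m*c*ln(T2/T1)
dS_exact = m * c * math.log(T2 / T1)

# Direct numerical evaluation of the integral of dQ/T over small steps
steps = 100_000
dT = (T2 - T1) / steps
dS_numeric = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

print(f"closed form : {dS_exact:.3f} J/K")
print(f"numerical   : {dS_numeric:.3f} J/K")  # agrees: S depends only on the end states
```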

  • In any reversible cycle the change in entropy is zero:

ΔS = 0. (1)

  • In thermodynamics it is proved that the entropy S of a system undergoing an irreversible cycle increases:

ΔS > 0. (2)

Expressions (1) and (2) apply only to closed systems; if the system exchanges heat with the external environment, its entropy S can behave in any way.

Relations (1) and (2) can be represented as the Clausius inequality

ΔS ≥ 0

i.e., the entropy of a closed system can either increase (in the case of irreversible processes) or remain constant (in the case of reversible processes).

If the system makes an equilibrium transition from state 1 to state 2, then the entropy change is

ΔS1→2 = S2 − S1 = ∫₁² δQ/T = ∫₁² (dU + δA)/T,

where dU and δA are written for the specific process. According to this formula, ΔS is determined only up to an additive constant: it is not entropy itself that has physical meaning, but the difference of entropies. Let us find the change in entropy in ideal gas processes.

For an ideal gas dU = (m/M)Cv dT and δA = p dV = (m/M)RT dV/V, so

ΔS1→2 = (m/M)[Cv ln(T2/T1) + R ln(V2/V1)],

i.e. the entropy change ΔS1→2 of an ideal gas during its transition from state 1 to state 2 does not depend on the type of process.

Because for an adiabatic process δQ = 0, we have ΔS = 0 and hence S = const; that is, a reversible adiabatic process occurs at constant entropy. That is why it is called isentropic.

In an isothermal process (T = const, T1 = T2):  ΔS = (m/M)R ln(V2/V1).

In an isochoric process (V = const, V1 = V2):  ΔS = (m/M)Cv ln(T2/T1).
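
The path independence can be checked numerically. The following sketch (our own, with an assumed monatomic ideal gas and arbitrarily chosen end states) compares the direct formula with an isothermal step followed by an isochoric step.

```python
# One mole of a monatomic ideal gas taken from (T1, V1) to (T2, V2):
# entropy change via the general formula, and via a two-step path.
import math

R = 8.314               # J/(mol*K)
Cv = 1.5 * R            # monatomic ideal gas, assumed
T1, V1 = 300.0, 0.010   # K, m^3 (assumed initial state)
T2, V2 = 450.0, 0.025   # K, m^3 (assumed final state)

# General result per mole: dS = Cv*ln(T2/T1) + R*ln(V2/V1)
dS_direct = Cv * math.log(T2 / T1) + R * math.log(V2 / V1)

# Path: isothermal expansion V1 -> V2 at T1, then isochoric heating T1 -> T2
dS_isothermal = R * math.log(V2 / V1)
dS_isochoric = Cv * math.log(T2 / T1)

print(dS_direct, dS_isothermal + dS_isochoric)  # equal: S is a state function
```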

Entropy has the property of additivity: the entropy of a system is equal to the sum of the entropies of the bodies included in the system, S = S1 + S2 + S3 + ... The qualitative difference between the thermal motion of molecules and other forms of motion is its randomness and disorder. Therefore, to characterize thermal motion it is necessary to introduce a quantitative measure of the degree of molecular disorder. If we consider any given macroscopic state of a body with certain average values of its parameters, then it is nothing other than a continuous succession of close microstates that differ from one another in the distribution of molecules over different parts of the volume and in the distribution of energy among the molecules. The number of these continuously changing microstates characterizes the degree of disorder of the macroscopic state of the whole system and is called the thermodynamic probability w of the given macrostate. The thermodynamic probability w of a state of a system is the number of ways in which a given state of a macroscopic system can be realized, that is, the number of microstates that realize a given macrostate (w ≥ 1, whereas a mathematical probability is ≤ 1).
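
A toy illustration of thermodynamic probability (our own example, not from the source): for N distinguishable molecules distributed between the two halves of a vessel, the number of microstates realizing the macrostate "n molecules on the left" is the binomial coefficient C(N, n).

```python
# Thermodynamic probability w of the macrostate "n molecules in the left half"
# for N distinguishable molecules: w = C(N, n).
from math import comb

N = 20
for n in range(N + 1):
    w = comb(N, n)
    print(f"n = {n:2d}   w = {w}")

# w is largest for the even split n = N/2: the most disordered (equilibrium)
# macrostate is realized by the greatest number of microstates.
```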

As a measure of the surprise of an event, it is agreed to take the logarithm of its probability taken with a minus sign: the surprise of a state is equal to −log P, where P is the mathematical probability of the state.

According to Boltzmann, the entropy S of a system and the thermodynamic probability w are related as follows:

S = k ln w,

where k is the Boltzmann constant (k = 1.38·10⁻²³ J/K). Thus, entropy is determined by the logarithm of the number of microstates with the help of which a given macrostate can be realized. Entropy can therefore be considered a measure of the probability of the state of a thermodynamic system. Boltzmann's formula allows us to give entropy the following statistical interpretation: entropy is a measure of the disorder of a system. Indeed, the greater the number of microstates realizing a given macrostate, the greater the entropy. In the equilibrium state of the system, the most probable state, the number of microstates is maximal, and the entropy is also maximal.
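
As a worked illustration (ours, not in the original), consider the free expansion of ν moles of an ideal gas into a volume twice as large. Assuming each of the N molecules independently gains twice as many accessible positions, so that W2/W1 = 2^N, and using Nk = νR, Boltzmann's formula reproduces the thermodynamic result:

```latex
\Delta S = k\ln\frac{W_2}{W_1} = k\ln 2^{N} = N k\ln 2 = \nu R\ln 2
         = \nu R\ln\frac{V_2}{V_1}, \qquad V_2 = 2V_1 .
```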

Because real processes are irreversible, it can be argued that all processes in a closed system lead to an increase in its entropy; this is the principle of increasing entropy. In the statistical interpretation of entropy this means that processes in a closed system proceed in the direction of an increasing number of microstates, in other words, from less probable states to more probable ones, until the probability of the state becomes maximal.

§7 Second law of thermodynamics

The first law of thermodynamics, which expresses the law of conservation and transformation of energy, does not allow us to establish the direction in which thermodynamic processes proceed. In addition, one can imagine many processes that do not contradict the first law, in which energy is conserved, but which are not realized in nature. Possible formulations of the second law of thermodynamics:

1) The law of increasing entropy of a closed system during irreversible processes: any irreversible process in a closed system occurs in such a way that the entropy of the system increases, ΔS > 0 (irreversible process); in general ΔS ≥ 0, with ΔS = 0 for a reversible and ΔS > 0 for an irreversible process.

In processes occurring in a closed system, entropy does not decrease.

2) From Boltzmann's formula S = k ln w it follows that an increase in entropy means a transition of the system from a less probable state to a more probable one.

3) According to Kelvin: a circular process whose only result is the conversion of heat received from a heater into an equivalent amount of work is impossible.

4) According to Clausius: a circular process whose only result is the transfer of heat from a less heated body to a more heated one is impossible.

To describe thermodynamic systems near 0 K, the Nernst-Planck theorem (the third law of thermodynamics) is used: the entropy of all bodies in a state of equilibrium tends to zero as the temperature approaches 0 K.

From the Nernst-Planck theorem it follows that Cp = Cv = 0 at T = 0 K.

§8 Heat and refrigeration machines.

Carnot cycle and its efficiency

From the Kelvin formulation of the second law of thermodynamics it follows that a perpetual motion machine of the second kind is impossible. (A perpetual motion machine of the second kind is a periodically operating engine that performs work by cooling a single heat source.)

A thermostat is a thermodynamic system that can exchange heat with other bodies without changing its temperature.

Operating principle of a heat engine: from a thermostat with temperature T1 (the heater) an amount of heat Q1 is taken per cycle, and to a thermostat with temperature T2 (T2 < T1), the refrigerator, an amount of heat Q2 is transferred per cycle, while the work A = Q1 − Q2 is done.

A circular process, or cycle, is a process in which a system, having passed through a series of states, returns to its original state. In a state diagram a cycle is depicted as a closed curve. The cycle performed by an ideal gas can be divided into a process of expansion (1-2) and a process of compression (2-1); the work of expansion is positive, A1-2 > 0, because V2 > V1, and the work of compression is negative, A2-1 < 0, because V2 < V1. Consequently, the work done by the gas per cycle is determined by the area enclosed by the closed curve 1-2-1. If positive work is done during the cycle (the cycle runs clockwise), the cycle is called a forward cycle; if the cycle runs counterclockwise, it is a reverse cycle.

The forward cycle is used in heat engines, periodically operating engines that perform work using heat received from outside. The reverse cycle is used in refrigeration machines, periodically operating installations in which, due to the work of external forces, heat is transferred to a body with a higher temperature.

As a result of a circular process the system returns to its original state, and therefore the total change in internal energy is zero. The first law of thermodynamics for a circular process therefore gives

Q = ΔU + A = A,

That is, the work done per cycle is equal to the amount of heat received from outside, but

Q= Q 1 - Q 2

Q1 is the amount of heat received by the system,

Q2 is the amount of heat given off by the system.

The thermal efficiency of a circular process is equal to the ratio of the work done by the system to the amount of heat supplied to it:

η = A/Q1 = (Q1 − Q2)/Q1.

For η = 1 the condition Q2 = 0 would have to be satisfied, i.e. the heat engine would have to have a single heat source Q1, but this contradicts the second law of thermodynamics.
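
A small sketch with made-up per-cycle heats (not taken from the source) illustrates this bookkeeping:

```python
# Efficiency of a heat engine from the heats exchanged per cycle (assumed values).
Q1 = 1000.0   # J received from the heater per cycle
Q2 = 650.0    # J given to the refrigerator per cycle

A = Q1 - Q2                 # work done per cycle
eta = A / Q1                # thermal efficiency
print(A, eta)               # 350.0 J, 0.35; eta = 1 would require Q2 = 0, which the second law forbids
```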

The process that is the reverse of the one occurring in a heat engine is used in a refrigeration machine.

From the thermostat with temperature T2 an amount of heat Q2 is taken and transferred to the thermostat with temperature T1, which receives the amount of heat Q1.

Q = Q2 − Q1 < 0, therefore A < 0.

Without work being done, it is impossible to take heat from a less heated body and give it to a more heated one.

Based on the second law of thermodynamics, Carnot derived the following theorem.

Carnot's theorem: of all periodically operating heat engines having the same heater temperature (T1) and refrigerator temperature (T2), reversible machines have the highest efficiency. The efficiencies of reversible machines with equal T1 and T2 are equal and do not depend on the nature of the working fluid.

A working body is a body that performs a circular process and exchanges energy with other bodies.

The Carnot cycle is a reversible cycle, the most economical one, consisting of two isotherms and two adiabats.

1-2: isothermal expansion at the heater temperature T1; heat Q1 is supplied to the gas and work A1-2 is done.

2-3: adiabatic expansion; the gas does work A2-3 > 0 on external bodies.

3-4: isothermal compression at the refrigerator temperature T2; heat Q2 is removed from the gas and the work of compression A3-4 is done;

4-1: adiabatic compression; the work A4-1 < 0 is done on the gas by external bodies.

In the isothermal process U = const, so

Q1 = A1-2 = (m/M)RT1 ln(V2/V1). (1)

During the adiabatic expansion Q2-3 = 0, and the work of the gas A2-3 is done at the expense of its internal energy: A2-3 = −ΔU = (m/M)Cv(T1 − T2).

The amount of heat Q2 given by the gas to the refrigerator during the isothermal compression is equal in magnitude to the work of compression A3-4:

Q2 = (m/M)RT2 ln(V3/V4). (2)

The work of adiabatic compression is A4-1 = −(m/M)Cv(T1 − T2) = −A2-3.

Work done as a result of a circular process

A = A1-2 + A2-3 + A3-4 + A4-1 = Q1 + A2-3 − Q2 − A2-3 = Q1 − Q2

and is equal to the area enclosed by the curve 1-2-3-4-1.

The thermal efficiency of the Carnot cycle is

η = (Q1 − Q2)/Q1 = [T1 ln(V2/V1) − T2 ln(V3/V4)] / [T1 ln(V2/V1)].

From the adiabatic equations for processes 2-3 and 4-1 we obtain

T1 V2^(γ−1) = T2 V3^(γ−1)  and  T1 V1^(γ−1) = T2 V4^(γ−1),  so that  V2/V1 = V3/V4.

Then

η = (T1 − T2)/T1,

i.e. the efficiency of the Carnot cycle is determined only by the temperatures of the heater and the refrigerator. To increase the efficiency, the difference T1 − T2 must be increased.
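
For example, a short sketch with hypothetical temperatures (our own numbers) shows how the Carnot efficiency depends only on T1 and T2:

```python
# Carnot efficiency for a heater at T1 and a refrigerator at T2 (kelvin).
def carnot_efficiency(T1, T2):
    """eta = (T1 - T2) / T1."""
    return (T1 - T2) / T1

print(carnot_efficiency(600.0, 300.0))  # 0.5
print(carnot_efficiency(600.0, 400.0))  # ~0.333: a smaller T1 - T2 lowers the attainable efficiency
```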

************************************************************

There are several formulations of the second law of thermodynamics, two of which are given below:

· heat cannot by itself pass from a body with a lower temperature to a body with a higher temperature (formulation by R. Clausius);

· a perpetual motion machine of the second kind is impossible, that is, such a periodic process, the only result of which would be the conversion of heat into work due to the cooling of one body (Thomson’s formulation).

The second law of thermodynamics indicates the inequality of two forms of energy transfer - work and heat. This law takes into account the fact that the process of transition of the energy of ordered motion of a body as a whole (mechanical energy) into the energy of disordered motion of its particles (thermal energy) is irreversible. For example, mechanical energy during friction is converted into heat without any additional processes. The transition of the energy of disordered particle motion (internal energy) into work is possible only if it is accompanied by some additional process. Thus, a heat engine operating in a direct cycle produces work only due to the heat supplied from the heater, but at the same time part of the received heat is transferred to the refrigerator.

Entropy. In addition to the internal energy U, which is a single-valued function of the state parameters of the system, other state functions are widely used in thermodynamics (free energy, enthalpy and entropy).

The concept of entropy was introduced in 1865 by Rudolf Clausius. The word comes from the Greek entropia and literally means turn, transformation. In thermodynamics the term is used to describe the transformation of various types of energy (mechanical, electrical, light, chemical) into heat, that is, into the random, chaotic motion of molecules. This energy cannot be collected and transformed back into the forms from which it was obtained.

It was to define a measure of the irreversible scattering, or dissipation, of energy that this concept was introduced. Entropy S is a function of state. It stands out among other thermodynamic functions in that it has a statistical, that is, probabilistic, nature.



If a process involving the receipt or release of heat occurs in a thermodynamic system, this leads to a change in the entropy of the system, which can either increase or decrease. During an irreversible cycle the entropy of an isolated system increases:

dS > 0. (3.4)

This means that irreversible energy dissipation occurs in the system.

If a reversible process occurs in a closed system, the entropy remains unchanged

dS = 0. (3.5)

The change in entropy of a system to which an infinitesimal amount of heat δQ is imparted reversibly is determined by the relation

dS = δQ/T. (3.6)

This relationship is valid for a reversible process. For an irreversible process occurring in a closed system we have

dS > δQ/T.

If the system is not closed, its entropy can either increase or decrease, depending on the heat it exchanges with the surroundings. The state function whose differential is δQ/T is the entropy; the quantity δQ/T itself is called the reduced heat.

Thus, in all processes occurring in a closed system, entropy increases during irreversible processes and remains unchanged during reversible processes. Consequently, formulas (3.4) and (3.5) can be combined and presented in the form

dS ≥ 0.

This is the statistical formulation of the second law of thermodynamics.

If the system makes an equilibrium transition from state 1 to state 2, then according to equation (3.6) the entropy change is

ΔS1-2 = S2 − S1 = ∫₁² δQ/T.

It is not entropy itself that has a physical meaning, but the difference in entropies.

Let us find the change in entropy in ideal gas processes. Since

δQ = dU + p dV,   dU = (m/M) Cv dT,   p V = (m/M) R T,

the entropy change is

ΔS1-2 = ∫₁² δQ/T = (m/M) ∫₁² (Cv dT/T + R dV/V),

or

ΔS1-2 = (m/M) [Cv ln(T2/T1) + R ln(V2/V1)]. (3.7)

This shows that the change in the entropy of an ideal gas during the transition from state 1 to state 2 does not depend on the type of the transition process 1 → 2.
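
Formula (3.7) can also be checked by integrating dS along an arbitrary path between the same end states. The following sketch (our own, with an assumed diatomic ideal gas and arbitrarily chosen states) integrates along a straight line in the (T, V) plane:

```python
# Numerical check that the line integral of dS = (m/M)(Cv dT/T + R dV/V)
# along an arbitrary path reproduces formula (3.7); one mole assumed.
import math

R = 8.314
Cv = 2.5 * R                    # diatomic ideal gas, assumed
T1, V1 = 300.0, 0.010
T2, V2 = 500.0, 0.030

# Formula (3.7), per mole
dS_formula = Cv * math.log(T2 / T1) + R * math.log(V2 / V1)

# Line integral along T(s) = T1 + s*(T2 - T1), V(s) = V1 + s*(V2 - V1)
steps = 200_000
dS_path = 0.0
for i in range(steps):
    s = (i + 0.5) / steps
    T = T1 + s * (T2 - T1)
    V = V1 + s * (V2 - V1)
    dS_path += Cv * (T2 - T1) / steps / T + R * (V2 - V1) / steps / V

print(dS_formula, dS_path)      # agree to within the discretization error
```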

From formula (3.7) it follows that in an isothermal process (T1 = T2):

ΔS = (m/M) R ln(V2/V1).

In an isochoric process the change in entropy is

ΔS = (m/M) Cv ln(T2/T1).
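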

Since for an adiabatic process δQ = 0, we have ΔS = 0; therefore, a reversible adiabatic process occurs at constant entropy. That is why it is called an isentropic process.

The entropy of a system has the property of additivity, which means that the entropy of the system is equal to the sum of the entropies of all bodies that are included in the system.

The meaning of entropy becomes clearer if we turn to statistical physics, in which entropy is associated with the thermodynamic probability of the state of the system. The thermodynamic probability W of the state of a system is equal to the number of all possible microdistributions of particles over coordinates and velocities that realize the given macrostate; W is always ≥ 1, that is, the thermodynamic probability is not a probability in the mathematical sense.

L. Boltzmann (1872) showed that the entropy of a system is equal to the product of Boltzmann's constant k and the logarithm of the thermodynamic probability W of the given state:

S = k ln W. (3.8)

Consequently, entropy can be given the following statistical interpretation: entropy is a measure of the disorder of a system. From formula (3.8) it is clear that the greater the number of microstates realizing a given macrostate, the greater the entropy. The most probable state of a system is the equilibrium state: the number of microstates is then maximal and, consequently, the entropy is maximal.

Since all real processes are irreversible, it can be argued that all processes in a closed system lead to an increase in entropy - the principle of increasing entropy.

In the statistical interpretation of entropy, this means that processes in a closed system proceed in the direction from less probable states to more probable states until the probability of states becomes maximum.

Let us explain with an example. Imagine a vessel divided by a partition into two equal parts A and B. Part A contains gas, and B is a vacuum. If a hole is made in the partition, the gas will immediately begin to expand "by itself" and after some time will be distributed evenly throughout the entire volume of the vessel; this is the most probable state of the system. The least probable state is one in which most of the gas molecules suddenly, spontaneously, fill one of the halves of the vessel. One can wait for this phenomenon as long as one likes, but the gas will not reassemble itself in part A. To achieve that, some work has to be done on the gas: for example, moving the right wall of part B like a piston. Thus, any physical system tends to pass from a less probable state to a more probable one. The equilibrium state of the system is the more probable one.
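
The following sketch (our own numbers) estimates the probability P = (1/2)^N that all N molecules are found in half A at the same moment:

```python
# Probability that all N molecules happen to occupy one half of the vessel.
import math

for N in (5, 10, 100, 6.0e23):          # from a handful of molecules to ~1 mole
    log10_P = -N * math.log10(2.0)
    print(f"N = {N:.0e}:  P = 10^{log10_P:.3g}")

# For a few molecules the event is quite observable (P ~ 3% at N = 5),
# but for macroscopic N it is unimaginably improbable.
```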

Using the concept of entropy and R. Clausius’ inequality, second law of thermodynamics can be formulated as the law of increasing entropy of a closed system during irreversible processes:

any irreversible process in a closed system occurs in such a way that the system is more likely to enter a state with higher entropy, reaching a maximum in a state of equilibrium. Or else:

in processes occurring in closed systems, entropy does not decrease.

Please note that we are talking only about closed systems.

So, the second law of thermodynamics is a statistical law. It expresses the necessary regularities of the chaotic motion of the large number of particles that make up an isolated system. Statistical methods, however, are applicable only when the number of particles in the system is huge. For a small number of particles (5-10) this approach is not applicable; in that case the probability of all the particles gathering in one half of the volume is no longer negligible, in other words, such an event can actually occur.

Heat Death of the Universe. R. Clausius, considering the Universe as a closed system, and applying the second law of thermodynamics to it, reduced everything to the statement that the entropy of the Universe must reach its maximum. This means that all forms of motion must turn into thermal motion, as a result of which the temperature of all bodies in the Universe will become equal over time, complete thermal equilibrium will occur, and all processes will simply stop: the thermal death of the Universe will occur.

The basic equation of thermodynamics. This equation combines the formulas of the first and second laws of thermodynamics. The first law gives

δQ = dU + p dV, (3.9)

and the second law gives, for the heat received by the system,

δQ ≤ T dS. (3.10)

Substituting relation (3.10), which expresses the second law of thermodynamics, into equality (3.9), we obtain

T dS ≥ dU + p dV.

This is the fundamental equation of thermodynamics.

In conclusion, we note once again that if the first law of thermodynamics contains the energy balance of the process, then the second law shows its possible direction.

Third law of thermodynamics

Another law of thermodynamics was established in 1906 by W. Nernst in the course of studying the entropy changes of chemical reactions. It is called Nernst's theorem, or the third law of thermodynamics, and concerns the behavior of the heat capacity of substances as the temperature approaches absolute zero.

Nernst's theorem states that as absolute zero is approached, the entropy of the system also tends to zero, regardless of the values taken by all the other parameters of the system's state:

S → 0 as T → 0.

Since the entropy S = ∫₀ᵀ (C/T) dT, and the temperature T tends to zero, the heat capacity of the substance must also tend to zero, and faster than T. From this follows the unattainability of absolute zero temperature by a finite sequence of thermodynamic processes, that is, by a finite number of operations, the operating cycles of a refrigeration machine (the second formulation of the third law of thermodynamics).

Real gases

Van der Waals equation

The change in the state of rarefied gases at sufficiently high temperatures and low pressures is described by the ideal gas laws. However, as the pressure increases and the temperature of a real gas decreases, significant deviations from these laws are observed, due to significant differences between the behavior of real gases and the behavior that is attributed to particles of an ideal gas.

The equation of state of real gases must take into account:

· the finite value of the molecules’ own volume;

· mutual attraction of molecules to each other.

To this end, J. van der Waals proposed including in the equation of state not the volume of the vessel, as in the Clapeyron-Mendeleev equation (pVm = RT), but the volume of a mole of gas not occupied by the molecules, that is, the quantity (Vm − b), where Vm is the molar volume. To take into account the forces of attraction between molecules, J. van der Waals introduced a correction to the pressure entering the equation of state.

By introducing into the Clapeyron-Mendeleev equation the corrections that take into account the intrinsic volume of the molecules (repulsive forces) and the attractive forces, we obtain the equation of state of a mole of real gas in the form

(p + a/Vm²)(Vm − b) = RT.

This is the van der Waals equation, in which the constants a and b have different values for different gases.
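
A brief sketch comparing the two equations of state (our own illustration; the constants a and b are approximate literature values for CO2, and the state is chosen arbitrarily):

```python
# Ideal-gas vs van der Waals pressure for one mole of CO2.
R = 8.314            # J/(mol*K)
a = 0.364            # Pa*m^6/mol^2, approximate value for CO2
b = 4.27e-5          # m^3/mol, approximate value for CO2

Vm = 2.0e-4          # m^3/mol (a fairly dense gas, assumed)
T = 300.0            # K

p_ideal = R * T / Vm
p_vdw = R * T / (Vm - b) - a / Vm**2      # (p + a/Vm^2)(Vm - b) = R*T solved for p

print(p_ideal / 1e5, p_vdw / 1e5)   # pressures in bar; the corrections are clearly visible
```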

Laboratory work

The second law of thermodynamics determines the direction of real thermal processes occurring at a finite speed.

The second law of thermodynamics has several formulations. For example: any action related to the conversion of energy (that is, to the transition of energy from one form to another) cannot occur without part of it being lost as heat dissipated in the environment. In a more general form this means that processes of transformation of energy can occur spontaneously only if the energy passes from a concentrated (ordered) form to a dispersed (disordered) form.

Another definition of the second law of thermodynamics is directly related to the Clausius principle: a process in which no change occurs other than the transfer of heat from a hot body to a cold one is irreversible; that is, heat cannot pass spontaneously from a colder body to a hotter one. Such a redistribution of energy in the system is characterized by a quantity called entropy, which, as a function of the state of a thermodynamic system (a function having a total differential), was first introduced in 1865, precisely by Clausius. Entropy is a measure of the irreversible dissipation of energy. The greater the amount of energy irreversibly dissipated as heat, the greater the entropy.

Thus, from these formulations of the second law of thermodynamics we may conclude that any system whose properties change over time strives toward an equilibrium state in which the entropy of the system takes its maximum value. Because of this, the second law of thermodynamics is often called the law of increasing entropy, and entropy itself (as a physical quantity or as a physical concept) is regarded as a measure of the internal disorder of a physicochemical system.

In other words, entropy is a state function characterizing the direction of spontaneous processes in a closed thermodynamic system. In the state of equilibrium the entropy of a closed system reaches a maximum, and no macroscopic processes are possible in such a system. Maximum entropy corresponds to complete chaos.

Most often, the transition of a system from one state to another is characterized not by the absolute value of the entropy S but by its change ΔS, which is equal to the ratio of the change in the amount of heat (imparted to the system or removed from it) to the absolute temperature of the system: ΔS = Q/T, measured in J/K. This is the so-called thermodynamic entropy.
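
A minimal sketch of this formula (our own example with approximate handbook data): the entropy change when 0.5 kg of ice melts at 273 K.

```python
# Entropy change for melting ice at constant temperature, delta_S = Q / T.
m = 0.5            # kg of ice, assumed
L = 3.34e5         # J/kg, approximate latent heat of fusion of ice
T = 273.0          # K, melting temperature at normal pressure

Q = m * L
delta_S = Q / T
print(delta_S)     # ~612 J/K: heat absorbed at constant T increases the entropy
```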

In addition, entropy also has a statistical meaning. In the transition from one macrostate to another, the statistical entropy also increases, since such a transition is accompanied by a transition to a macrostate realized by a larger number of microstates, and the equilibrium state (toward which the system tends) is characterized by the maximum number of microstates.

In connection with the concept of entropy in thermodynamics, the concept of time takes on a new meaning. In classical mechanics, the direction of time is not taken into account and the state of a mechanical system can be determined both in the past and in the future. In thermodynamics, time appears in the form of an irreversible process of increasing entropy in a system. That is, the greater the entropy, the longer the time period the system has passed in its development.

Besides, to understand the physical meaning of entropy one must keep in mind that there are four classes of thermodynamic systems in nature:

a) isolated (closed) systems (in the transition of such systems from one state to another there is no transfer of energy, matter or information across the boundaries of the system);

b) adiabatic systems (there is no heat exchange with the environment);

c) closed systems (they exchange energy, but not matter, with neighboring systems, for example a spaceship);

d) open systems (they exchange matter, energy and information with the environment). In these systems, owing to the arrival of energy from outside, dissipative structures with much lower entropy can arise.

In open systems, entropy can decrease. This applies above all to biological systems, that is, to living organisms, which are open nonequilibrium systems. Such systems are characterized by gradients of concentration of chemical substances, of temperature, of pressure and of other physicochemical quantities. The use of the concepts of modern, that is, nonequilibrium, thermodynamics makes it possible to describe the behavior of open, that is, real, systems. Such systems always exchange energy, matter and information with their environment. Moreover, such exchange processes are characteristic not only of physical or biological systems but also of socio-economic, cultural, historical and humanitarian systems, since the processes occurring in them are, as a rule, irreversible.

The third law of thermodynamics is associated with the concept of "absolute zero". The physical meaning of this law, expressed in the heat theorem of W. Nernst (a German physicist), is the fundamental impossibility of reaching absolute zero (−273.15 °C), at which the translational thermal motion of molecules should cease and entropy would cease to depend on the parameters of the physical state of the system (in particular, on changes in thermal energy). Nernst's theorem applies only to thermodynamically equilibrium states of systems.

In other words, Nernst's theorem can be given the following formulation: as absolute zero is approached, the entropy increment ΔS tends to a well-defined finite limit, independent of the values taken by all the parameters characterizing the state of the system (for example, volume, pressure, state of aggregation, etc.).

The essence of Nernst's theorem can be understood from the following example. As the temperature of a gas decreases, it condenses and the entropy of the system decreases, since the molecules become arranged in a more orderly way. With a further decrease in temperature the liquid crystallizes, which is accompanied by greater order in the arrangement of the molecules and, consequently, by an even greater decrease in entropy. At absolute zero all thermal motion ceases, disorder disappears, the number of possible microstates decreases to one, and the entropy approaches zero.

4. The concept of self-organization. Self-organization in open systems.

The term "synergetics" was proposed in 1973 by the German physicist Hermann Haken to designate a field of research devoted to the general laws of self-organization, the phenomenon of coordinated action of the elements of a complex system without an external controlling influence. Synergetics (translated from Greek: joint, coordinated action) is a scientific field studying the connections between the elements of a structure (subsystems) that form in open systems (biological, physico-chemical, geological-geographical, etc.) thanks to an intensive (flow-through) exchange of matter, energy and information with the environment under nonequilibrium conditions. In such systems coordinated behavior of the subsystems is observed, as a result of which the degree of order increases (the entropy decreases), that is, a process of self-organization develops.

Equilibrium is a state of rest and symmetry, while asymmetry leads to motion and to a nonequilibrium state.

A significant contribution to the theory of self-organization of systems was made by the Belgian physicist of Russian origin I. R. Prigogine (1917-2003). He showed that in dissipative systems (systems in which energy dissipation takes place), in the course of irreversible nonequilibrium processes, ordered formations arise, which he called dissipative structures.

Self-organization is the process of spontaneous emergence of order and organization from disorder (chaos) in open nonequilibrium systems. Random deviations of the system's parameters from equilibrium (fluctuations) play a very important role in the functioning and existence of the system. Owing to the growth of fluctuations as energy is absorbed from the environment, the system reaches some critical state and passes into a new steady state with a higher level of complexity and order compared with the previous one. The system, self-organizing in the new stationary state, reduces its entropy; it "dumps" into the environment the excess entropy that grows because of internal processes.

The ordered structure that emerges from chaos (an attractor, or dissipative structure) is the result of competition among the set of all possible states embedded in the system. As a result of this competition, the structure most adapted to the current conditions is spontaneously selected.

Synergetics is based on the thermodynamics of nonequilibrium processes, the theory of random processes, the theory of nonlinear oscillations and waves.

Synergetics considers the emergence and development of systems. Three types of systems are distinguished: 1) isolated systems, which exchange neither matter, nor energy, nor information with neighboring systems (or with the environment); 2) closed systems, which exchange energy, but not matter, with neighboring systems (for example, a spacecraft); 3) open systems, which exchange both matter and energy with neighboring systems. Almost all natural (ecological) systems are of the open type.

The existence of systems is unthinkable without connections. The latter are divided into direct connections and feedback. A direct connection is one in which one element (A) acts on another (B) without a response. With feedback, element B responds to the action of element A. Feedback can be positive or negative.

Positive feedback intensifies a process in one direction. An example of its action is the waterlogging of an area (for example, after deforestation). The process begins to act in one direction: increased moisture leads to oxygen depletion, which slows the decomposition of plant residues, which leads to the accumulation of peat and hence to further waterlogging.

Negative feedback acts in such a way that, in response to an increased action of element A, the opposing force of element B increases. Such a connection allows the system to remain in a state of stable dynamic equilibrium. This is the most common and important type of connection in natural systems; negative feedbacks above all underlie the sustainability and stability of ecosystems.
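
A toy model (our own, not from the text) contrasts the two kinds of feedback: a quantity x starts slightly away from its reference value x0 and is repeatedly updated.

```python
# Simple discrete-time feedback model: x is nudged in proportion to its
# deviation from the reference value x0.
def simulate(gain, steps=50, x0=1.0, x=1.1, dt=0.1):
    for _ in range(steps):
        x += gain * (x - x0) * dt      # gain < 0: negative feedback, gain > 0: positive
    return x

print(simulate(gain=-1.0))   # relaxes back toward x0 = 1.0 (stable dynamic equilibrium)
print(simulate(gain=+1.0))   # the deviation keeps growing (process reinforced in one direction)
```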

An important property of systems is emergence (from the English "emergence", the appearance of something new). This property consists in the fact that the properties of the system as a whole are not a simple sum of the properties of its constituent parts or elements; rather, the interrelations of the various links of the system give it a new quality.

The synergetic approach to systems is based on three concepts: nonequilibrium, openness and nonlinearity.

Nonequilibrium (instability) is a state of the system in which its macroscopic parameters, that is, its composition, structure and behavior, change.

Openness is the ability of a system to constantly exchange matter, energy and information with the environment and to have both "sources", zones where energy is replenished from the environment, and zones of dissipation, "sinks".

Nonlinearity is the property of a system to remain in different stationary states corresponding to different admissible laws of behavior of that system.

In nonlinear systems development proceeds according to nonlinear laws, leading to a multivariate choice of paths and alternatives for exiting a state of instability. In nonlinear systems processes can have a sharply threshold character: with a gradual change in external conditions, an abrupt transition to another quality is observed. In this case old structures are destroyed and qualitatively new structures arise.




Thermodynamics (Greek θέρμη - “heat”, δύναμις - “force”) is a branch of physics that studies the most general properties of macroscopic systems and methods of transfer and transformation of energy in such systems.

Thermodynamics studies states and processes for whose description the concept of temperature can be introduced. It is a phenomenological science based on generalizations of experimental facts. The processes occurring in thermodynamic systems are described by macroscopic quantities (temperature, pressure, concentrations of components), which are introduced to describe systems consisting of a large number of particles and are not applicable to individual molecules and atoms, unlike, for example, the quantities introduced in mechanics or electrodynamics.

Modern phenomenological thermodynamics is a rigorous theory developed on the basis of several postulates. However, the connection of these postulates with the properties and laws of interaction of particles from which thermodynamic systems are built is given by statistical physics. Statistical physics also makes it possible to clarify the limits of applicability of thermodynamics.

The laws of thermodynamics are general in nature and do not depend on specific details of the structure of matter at the atomic level. Therefore, thermodynamics is successfully applied in a wide range of science and technology issues, such as energy, heat engineering, phase transitions, chemical reactions, transport phenomena and even black holes. Thermodynamics is important for a wide variety of areas of physics and chemistry, chemical technology, aerospace engineering, mechanical engineering, cell biology, biomedical engineering, materials science, and even finds its application in areas such as economics.

Important years in the history of thermodynamics

  • The origin of thermodynamics as a science is associated with the name of G. Galilei, who introduced the concept of temperature and designed the first device that responded to changes in ambient temperature (1597).
  • Soon G. D. Fahrenheit (1714), R. Reaumur (1730) and A. Celsius (1742) created temperature scales in accordance with this principle.
  • J. Black introduced the concepts of latent heat of fusion (1757) and heat capacity (1770), and J. Wilcke (1772) introduced the definition of the calorie as the amount of heat required to heat 1 g of water by 1 °C.
  • A. Lavoisier and P. Laplace in 1780 designed a calorimeter (see Calorimetry) and for the first time experimentally determined the specific heat capacities of a number of substances.
  • In 1824, S. Carnot (N. L. S. Carnot) published a work devoted to the study of the principles of operation of heat engines.
  • B. Clapeyron introduced a graphical representation of thermodynamic processes and developed the method of infinitesimal cycles (1834).
  • G. Helmholtz noted the universal nature of the law of conservation of energy (1847). Subsequently, R. Clausius and W. Thomson (Kelvin; W. Thomson) systematically developed the theoretical apparatus of thermodynamics, which is based on the first law of thermodynamics and the second law of thermodynamics.
  • The development of the 2nd principle led Clausius to the definition of entropy (1854) and the formulation of the law of increasing entropy (1865).
  • Beginning with the work of J. W. Gibbs (1873), who proposed the method of thermodynamic potentials, the theory of thermodynamic equilibrium has been developed.
  • In the second half of the 19th century, studies of real gases were carried out. A special role was played by the experiments of T. Andrews, who first discovered the critical point of the liquid-vapor system (1861); its existence had been predicted by D. I. Mendeleev (1860).
  • By the end of the 19th century. great strides were made in obtaining low temperatures, as a result of which O2, N2 and H2 were liquefied.
  • In 1902, Gibbs published a work in which all the basic thermodynamic relations were obtained within the framework of statistical physics.
  • The connection between the kinetic properties of a body and its thermodynamic characteristics was established by L. Onsager (1931).
  • In the 20th century the thermodynamics of solids was intensively studied, as well as that of quantum liquids and liquid crystals, in which diverse phase transitions take place.
  • L. D. Landau (1935-37) developed a general theory of phase transitions based on the concept of spontaneous symmetry breaking.

Sections of thermodynamics

Modern phenomenological thermodynamics is usually divided into equilibrium (or classical) thermodynamics, which studies equilibrium thermodynamic systems and processes in such systems, and nonequilibrium thermodynamics, which studies nonequilibrium processes in systems in which the deviation from thermodynamic equilibrium is relatively small and still allows for thermodynamic description.

Equilibrium (or classical) thermodynamics

In equilibrium thermodynamics, variables such as internal energy, temperature, entropy, and chemical potential are introduced. All of them are called thermodynamic parameters (quantities). Classical thermodynamics studies the relationships of thermodynamic parameters with each other and with physical quantities introduced into consideration in other branches of physics, for example, with the gravitational or electromagnetic field acting on the system. Chemical reactions and phase transitions are also included in the study of classical thermodynamics. However, the study of thermodynamic systems in which chemical transformations play a significant role is the subject of chemical thermodynamics, and thermal engineering deals with technical applications.

Classical thermodynamics includes the following sections:

  • principles of thermodynamics (sometimes also called laws or axioms)
  • equations of state and properties of simple thermodynamic systems (ideal gas, real gas, dielectrics and magnets, etc.)
  • equilibrium processes with simple systems, thermodynamic cycles
  • nonequilibrium processes and the law of non-decreasing entropy
  • thermodynamic phases and phase transitions

In addition, modern thermodynamics also includes the following areas:

  • a rigorous mathematical formulation of thermodynamics based on convex analysis
  • non-extensive thermodynamics

In systems that are not in a state of thermodynamic equilibrium, for example, in a moving gas, the local equilibrium approximation can be used, in which it is assumed that the equilibrium thermodynamic relations are satisfied locally at each point of the system.

Nonequilibrium thermodynamics

In nonequilibrium thermodynamics the variables are regarded as local not only in space but also in time, that is, time can enter its formulas explicitly. Note that Fourier's classic work "Analytical Theory of Heat" (1822), devoted to heat conduction, preceded not only the emergence of nonequilibrium thermodynamics but also Carnot's work "Reflections on the Motive Power of Fire and on Machines Fitted to Develop that Power" (1824), which is generally considered the starting point in the history of classical thermodynamics.

Basic concepts of thermodynamics

Thermodynamic system- a body or group of bodies in interaction, mentally or actually isolated from the environment.

Homogeneous system– a system within which there are no surfaces separating parts of the system (phases) that differ in properties.

Heterogeneous system– a system within which there are surfaces separating parts of the system that differ in properties.

Phase– a set of homogeneous parts of a heterogeneous system, identical in physical and chemical properties, separated from other parts of the system by visible interfaces.

Isolated system- a system that does not exchange either matter or energy with the environment.

Closed system- a system that exchanges energy with the environment, but does not exchange matter.

Open system- a system that exchanges both matter and energy with the environment.

The totality of all the physical and chemical properties of a system characterizes its thermodynamic state. All quantities characterizing any macroscopic property of the system under consideration are state parameters. It has been established experimentally that, to characterize a given system unambiguously, a certain number of parameters, called independent, must be used; all other parameters are regarded as functions of the independent parameters. Parameters that can be measured directly, such as temperature, pressure and concentration, are usually chosen as the independent state parameters. Any change in the thermodynamic state of a system (a change in at least one state parameter) is a thermodynamic process.

Reversible process- a process that allows the system to return to its original state without any changes remaining in the environment.

Equilibrium process– a process in which a system passes through a continuous series of equilibrium states.

Energy is a measure of a system's ability to do work; it is a general quantitative measure of the motion and interaction of matter. Energy is an inherent property of matter. A distinction is made between potential energy, due to the position of a body in a field of certain forces, and kinetic energy, due to the motion of the body in space.

Internal energy of the system– the sum of the kinetic and potential energy of all particles that make up the system. You can also define the internal energy of a system as its total energy minus the kinetic and potential energy of the system as a whole.

Forms of energy transition

Forms of energy transfer from one system to another can be divided into two groups.

  1. The first group includes only one form of transition of motion through chaotic collisions of molecules of two contacting bodies, i.e. by thermal conduction (and at the same time by radiation). The measure of the movement transmitted in this way is heat. Heat is a form of energy transfer through the disordered movement of molecules.
  2. The second group includes various forms of transition of motion, the common feature of which is the movement of masses covering very large numbers of molecules (i.e., macroscopic masses) under the influence of any forces. These are the lifting of bodies in a gravitational field, the transition of a certain amount of electricity from a higher electrostatic potential to a smaller one, the expansion of a gas under pressure, etc. The general measure of motion transmitted by such methods is work - a form of energy transfer through the ordered movement of particles.

Heat and work characterize qualitatively and quantitatively two different forms of transfer of motion from a given part of the material world to another. Heat and work cannot be contained in the body. Heat and work arise only when a process occurs, and characterize only the process. Under static conditions, heat and work do not exist. The difference between heat and work, accepted by thermodynamics as the starting point, and the opposition of heat to work makes sense only for bodies consisting of many molecules, because for one molecule or for a collection of few molecules, the concepts of heat and work lose their meaning. Therefore, thermodynamics considers only bodies consisting of a large number of molecules, i.e. so-called macroscopic systems.

Three principles of thermodynamics

The principles of thermodynamics are a set of postulates underlying thermodynamics. These provisions were established as a result of scientific research and were proven experimentally. They are accepted as postulates so that thermodynamics can be constructed axiomatically.

The need for the principles of thermodynamics is due to the fact that thermodynamics describes the macroscopic parameters of systems without specific assumptions regarding their microscopic structure. Statistical physics deals with issues of internal structure.

The principles of thermodynamics are independent, that is, none of them can be derived from the other principles. Analogues of Newton's three laws in mechanics are the three principles in thermodynamics, which connect the concepts of “heat” and “work”:

  • The zero law of thermodynamics speaks of thermodynamic equilibrium.
  • The first law of thermodynamics is about the conservation of energy.
  • The second law of thermodynamics is about heat flows.
  • The third law of thermodynamics is about the unattainability of absolute zero.

General (zero) law of thermodynamics

The general (zero) law of thermodynamics states that two bodies are in a state of thermal equilibrium if they can transfer heat to each other, but this does not happen.

It is not difficult to guess that two bodies do not transfer heat to each other if their temperatures are equal. For example, if you measure the temperature of a human body with a thermometer (at the end of the measurement the temperature of the person and the temperature of the thermometer are equal), and then with the same thermometer measure the temperature of the water in a bath and find that both temperatures coincide (there is thermal equilibrium of the person with the thermometer and of the thermometer with the water), then you can say that the person is in thermal equilibrium with the water in the bath.

From the above, we can formulate the zero law of thermodynamics as follows: two bodies that are in thermal equilibrium with a third are also in thermal equilibrium with each other.

From a physical point of view, the zero law of thermodynamics sets the reference point, since there is no heat flow between two bodies that have the same temperature. In other words, we can say that temperature is nothing more than an indicator of thermal equilibrium.

First law of thermodynamics

The first law of thermodynamics is the law of conservation of energy applied to thermal processes: energy does not disappear without leaving a trace.

The system can either absorb or release thermal energy Q, while the system performs work W on the surrounding bodies (or the surrounding bodies perform work on the system); the internal energy of the system, which had the initial value U_initial, then takes the value U_final:

U_final − U_initial = ΔU = Q − W

Thermal energy, work and internal energy determine the total energy of the system, which is a constant value. If a certain amount of thermal energy Q is transferred to (taken away) from the system, in the absence of work, the amount of internal energy of the system U will increase (decrease) by Q.
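
A tiny sketch with made-up numbers for this bookkeeping:

```python
# First-law bookkeeping: U_final - U_initial = Q - W (assumed values).
Q = 500.0    # J of heat absorbed by the system
W = 180.0    # J of work done by the system on the surroundings

delta_U = Q - W
print(delta_U)   # 320.0 J increase in internal energy; with W = 0 it would be the full 500 J
```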

Second law of thermodynamics

The second law of thermodynamics states that thermal energy can move only in one direction - from a body with a higher temperature to a body with a lower temperature, but not vice versa.

Third law of thermodynamics

The third law of thermodynamics states that no process consisting of a finite number of stages makes it possible to reach the temperature of absolute zero (although it can be approached arbitrarily closely).



The essence of the second law of thermodynamics is to a certain extent contained in the facts described in the two previous paragraphs. It is obvious that they are not based on abstract ideas or theoretical conclusions, but on the results of direct experience. The task is to generalize them and draw possibly far-reaching conclusions from such a generalization.  

The essence of the second law of thermodynamics lies in the fact that it formulates the conditions under which the transformation of heat into mechanical energy occurs. The second law of thermodynamics makes sense only within a limited domain: all the conclusions of thermodynamics, like all its basic concepts (heat transfer, temperature), make sense only when a certain range of phenomena is considered.

Briefly summarizing the essence of the second law of thermodynamics, we can say that uncompensated transfer of heat into work is impossible. From the impossibility of one process - the process of uncompensated transfer of heat into work - follows the impossibility of countless processes; All those processes are impossible, an integral part of which would be the uncompensated transfer of heat into work.  

As was clarified above, the essence of the second law of thermodynamics is that the number of equilibrium states is overwhelmingly large compared to the number of nonequilibrium distributions. However, for a universe consisting of an infinitely large number of particles, this statement loses its meaning. Indeed, both the number of equilibrium states and the number of nonequilibrium states become infinitely large.  


It is known that from a pedagogical point of view, a strict presentation of the essence of the second law of thermodynamics and its immediate consequences is far from easy. These difficulties in presenting the second principle would not exist if the second principle determined, as is sometimes thought, the convertibility of one type of energy into another. In fact, the second principle in a certain way limits the transformation of one form of energy transfer - heat - into another form of energy transfer - work.  

A little later we will show that the idea of ​​entropy reflects the essence of the second law of thermodynamics, just as the idea of ​​internal energy reflects the essence of the first law.  

We will be guided further by the ideas about the two types of regularities discussed here in the study of all statistical physics, and also, in particular, in clarifying the essence of the second law of thermodynamics, which, as will be shown, is a statistical law. The relationship between statistical physics and ordinary thermodynamics is based on the acceptance of a statistical law.  

Carnot's work contributed to the establishment of a principle that made it possible to determine the highest possible efficiency of a heat engine. The essence of the second law of thermodynamics, according to Clausius, is that heat cannot by itself move from a colder body to a warmer one.  

Processes are reversible and irreversible. Briefly summarizing the essence of the second law of thermodynamics, we can say that uncompensated transfer of heat into work is impossible. Here compensation should be understood as a change in the thermodynamic state of a body or several bodies; in this case, the inevitable change in state (cooling) of the heat-releasing body is not taken into account.  

A complete understanding of the essence of the second law of thermodynamics and, at the same time, a solution to the problem of heat death came along the path of deep penetration into the essence of the concept of heat, along the path of clarifying the foundations and development of the molecular kinetic theory.  

So, if we wanted to take heat away from a colder body and transfer it to a hotter one, we would have to spend some additional energy on this. This position constitutes the essence of the second law of thermodynamics, which is formulated as follows: spontaneous transfer of heat from a colder body to a warmer body is impossible.  

The concept of so-called absolute temperature plays a particularly important role in thermodynamics. This concept is closely related to the essence of the second law of thermodynamics.  

Consequently, the equation for the element of heat is always (for any number of arguments) holonomic. If desired, we can assume that the essence of the second law of thermodynamics lies precisely in the fact that between the coefficients of the equation for the element of heat there is always a relationship that ensures the holonomy of this equation.  

Only after the research and reflection of Mayer, Joule and Helmholtz, who established the law of equivalence of heat and work, did the German physicist Rudolf Clausius (1822 - 1888) come to the second law of thermodynamics and formulate it mathematically. Clausius introduced entropy into consideration and showed that the essence of the second law of thermodynamics comes down to the inevitable growth of entropy in all real processes.  


