Quantum theory

This apparently measurement-induced collapse of the wave function has been the source of many conceptual difficulties in quantum mechanics. Before the collapse, there is no way to tell for sure where the photon will end up; it can be anywhere with a non-zero probability. There is no way to trace the path of a photon from the source to the detector. The photon is not real in the sense that a plane flying from San Francisco to New York is real.

Werner Heisenberg, among others, interpreted this mathematics to mean that reality does not exist until it is observed. “The idea of an objective real world, the smallest particles of which exist objectively in the same sense that stones or trees exist, whether we observe them or not, is impossible,” he wrote. John Wheeler also used a variant of the double slit experiment to state that "no elementary quantum phenomenon is a phenomenon until it is a registered ('observable', 'certainly recorded') phenomenon."

But quantum theory gives absolutely no clue as to what counts as a "measurement." It simply postulates that the measuring device must be classical, without specifying where the line between classical and quantum lies, and this leaves the door open for those who believe that human consciousness causes the collapse. Last May, Henry Stapp and his colleagues stated that the double slit experiment and its modern variants suggest that "a conscious observer may be necessary" to make sense of the quantum realm, and that the material world is based on a transpersonal mind.

But these experiments are not empirical proof of such claims. In the double slit experiment performed with single photons, one can only test the probabilistic predictions of the mathematics. If those probabilities are borne out by sending tens of thousands of identical photons through the double slit, the theory says that each photon's wavefunction has collapsed - thanks to a vaguely defined process called measurement. That's all.

In addition, there are other interpretations of the double slit experiment. Take, for example, the de Broglie-Bohm theory, which states that reality is both wave and particle. A photon heads toward the double slit with a definite position at every moment and passes through one slit or the other, so each photon has a trajectory. It is accompanied by a pilot wave, which enters through both slits, interferes, and then directs the photon to a site of constructive interference.

In 1979, Chris Dewdney and his colleagues at Birkbeck College, London, modeled this theory's prediction for the trajectories of particles passing through the double slit. Over the past decade, experimenters have confirmed that such trajectories exist, albeit using the controversial technique of so-called weak measurements. Controversy notwithstanding, the experiments show that the de Broglie-Bohm theory is still able to explain the behavior of the quantum world.

More importantly, this theory doesn't need observers, or measurements, or immaterial consciousness.

Neither do the so-called collapse theories, according to which wavefunctions collapse spontaneously: the greater the number of particles in a quantum system, the more likely the collapse. Observers simply record the result. Markus Arndt's team at the University of Vienna in Austria has tested these theories by sending larger and larger molecules through the double slit. Collapse theories predict that when matter particles become more massive than a certain threshold, they can no longer remain in a quantum superposition that passes through both slits at once, and this destroys the interference pattern. Arndt's team sent an 800-atom molecule through the double slit and still saw the interference. The search for the threshold continues.

Roger Penrose has his own version of collapse theory, in which the higher the mass of an object in superposition, the faster it collapses to one state or another because of gravitational instabilities. Again, this theory does not require an observer or any consciousness. Dirk Bouwmeester of the University of California at Santa Barbara is testing Penrose's idea with a version of the double slit experiment.

Conceptually, the idea is not just to put a photon into a superposition of passing through two slits at once, but also to put one of the slits into a superposition, making it be in two places at the same time. According to Penrose, the displaced slit will either remain in superposition or collapse while the photon is in flight, resulting in different interference patterns. The collapse will depend on the mass of the slits. Bouwmeester has been working on this experiment for ten years and may soon confirm or refute Penrose's claims.

In any case, these experiments show that we cannot yet make any statements about the nature of reality, even if those statements are well supported mathematically or philosophically. And given that neuroscientists and philosophers of mind cannot agree on the nature of consciousness, the claim that consciousness causes wavefunction collapse is premature at best and misleading at worst.


Physics is the most mysterious of all sciences: it gives us an understanding of the world around us. The laws of physics are absolute and apply to everyone without exception, regardless of person or social status.


Fundamental discoveries in quantum physics

Isaac Newton, Nikola Tesla, Albert Einstein and many others are among mankind's great guides through the wonderful world of physics, who, like prophets, revealed to humanity the greatest secrets of the universe and the ability to control physical phenomena. Their brilliant minds cut through the darkness of ignorance and, like guiding stars, showed humanity the way through the darkness of night. One of these guides in the world of physics was Max Planck, the father of quantum physics.

Max Planck is not only the founder of quantum physics but also the author of the world-famous quantum theory. Quantum theory is the most important component of quantum physics. In simple terms, it describes the motion, behavior and interaction of microparticles. The founder of quantum physics also left us many other scientific works that have become cornerstones of modern physics:

  • theory of thermal radiation;
  • special theory of relativity;
  • research in the field of thermodynamics;
  • research in the field of optics.

The theory of quantum physics about the behavior and interaction of microparticles became the basis for condensed matter physics, elementary particle physics and high-energy physics. Quantum theory explains the essence of many phenomena of our world - from the functioning of electronic computers to the structure and behavior of celestial bodies. Max Planck, the creator of this theory, allowed us through his discovery to comprehend the true essence of many things at the level of elementary particles. But the creation of this theory is far from the scientist's only merit. He was also the author of one of the first systematic treatises on a fundamental law of the universe - the law of conservation of energy. Max Planck's contribution to science is difficult to overestimate. In short, his discoveries are priceless for physics, chemistry, history, methodology and philosophy.

Quantum field theory

In a nutshell, quantum field theory is a theory describing microparticles, their behavior in space, their interactions with one another, and their mutual transformations. It studies the behavior of quantum systems in terms of so-called degrees of freedom. This beautiful and romantic name says nothing to many of us. For dummies, degrees of freedom are the number of independent coordinates needed to specify the motion of a mechanical system. In simple terms, degrees of freedom are characteristics of motion. Interesting discoveries in the field of the interaction of elementary particles were made by Steven Weinberg, whose electroweak theory predicted the so-called neutral currents - a form of interaction between quarks and leptons - and who shared the Nobel Prize for it in 1979.
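As a rough illustration of counting degrees of freedom (the examples below are standard textbook ones, not taken from this article): a free point particle needs three coordinates, N free particles need 3N, and a rigid body needs six.

```python
# Counting degrees of freedom: a minimal sketch with textbook examples.

def dof_point_particles(n_particles, dim=3):
    """Independent coordinates needed for n free point particles in `dim` dimensions."""
    return n_particles * dim

def dof_rigid_body():
    """A rigid body in 3D: 3 coordinates for position + 3 angles for orientation."""
    return 3 + 3

print(dof_point_particles(1))   # 3  - a single free particle
print(dof_point_particles(10))  # 30 - ten free particles
print(dof_rigid_body())         # 6  - e.g. a spinning top
```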

The Quantum Theory of Max Planck

In the 1890s, the German physicist Max Planck took up the study of thermal radiation and eventually obtained a formula for the distribution of energy. The quantum hypothesis born in the course of these studies marked the beginning of quantum physics, and later of quantum field theory. Planck's quantum theory holds that in thermal radiation energy is emitted and absorbed not continuously but in discrete portions - quanta. The year 1900, thanks to this discovery by Max Planck, became the year of birth of quantum mechanics. It is also worth mentioning Planck's formula, whose essence, in short, is that it relates the temperature of a body to the spectrum of its radiation.
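A minimal sketch of that formula - Planck's law for the spectral radiance of a black body (the constants are standard values; the variable names, units and sample temperature are my own illustrative choices):

```python
import math

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(nu, T):
    """Spectral radiance B(nu, T) of a black body, in W / (sr * m^2 * Hz)."""
    return (2 * h * nu**3 / c**2) / math.expm1(h * nu / (k * T))

# radiance of a 5800 K body (roughly the Sun's surface) at a visible frequency
print(planck_radiance(5.5e14, 5800.0))
```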

Quantum-mechanical theory of the structure of the atom

The quantum mechanical theory of the structure of the atom is one of the basic theories of quantum physics, and indeed of physics in general. This theory allows us to understand the structure of everything material and lifts the veil of secrecy over what things actually consist of. And the conclusions based on this theory are very unexpected. Consider the structure of the atom briefly. So what is an atom really made of? An atom consists of a nucleus and a cloud of electrons. The basis of the atom, its nucleus, contains almost the entire mass of the atom - more than 99 percent. The nucleus always has a positive charge, and it determines which chemical element the atom belongs to. The most interesting thing about the nucleus is that, while it contains almost the entire mass of the atom, its diameter is only about one ten-thousandth to one hundred-thousandth of the atom's diameter, so it occupies an utterly negligible fraction of the atom's volume. What follows from this? The conclusion is very unexpected: dense matter makes up only a vanishing fraction of the atom. And what about everything else? Everything else in the atom is the electron cloud.
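The arithmetic behind that claim, with rough order-of-magnitude radii (typical textbook values, not figures from this article):

```python
r_nucleus = 1e-15  # typical nuclear radius, m (order of magnitude)
r_atom = 1e-10     # typical atomic radius, m (order of magnitude)

diameter_ratio = r_nucleus / r_atom          # ~1e-5; varies ~1e-4..1e-5 across elements
volume_fraction = (r_nucleus / r_atom) ** 3  # volumes scale as the cube of the radius

print(f"diameter ratio:  {diameter_ratio:.0e}")   # ~ 1e-05
print(f"volume fraction: {volume_fraction:.0e}")  # ~ 1e-15
```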

The electron cloud is not a permanent or, strictly speaking, even a material substance; it is just the probability of electrons appearing at various places in the atom. That is, the nucleus occupies only a minute fraction of the atom, and everything else is emptiness. And if we take into account that all the objects around us, from dust particles to celestial bodies, planets and stars, are made of atoms, it turns out that everything material is in fact overwhelmingly emptiness. This theory seems completely unbelievable, and its author might seem, to put it mildly, delusional, because the things around us have a solid consistency, have weight and can be felt. How can they consist of emptiness? Has a mistake crept into this theory of the structure of matter? But there is no error here.

All material things appear dense only because of the interaction between atoms. Things have a solid and dense consistency only because of attraction and repulsion between atoms, which ensure the density and hardness of the crystal lattices of the chemical substances of which everything material consists. But - an interesting point - when, for example, the temperature of the environment changes, the bonds between atoms, that is, their attraction and repulsion, can weaken, which leads to a weakening of the crystal lattice and even to its destruction. This explains the change in the physical properties of substances when they are heated. For example, when iron is heated enough, it melts and can be given any shape. And when ice melts, the destruction of the crystal lattice changes the state of the matter from solid to liquid. These are clear examples of the weakening of bonds between atoms and, as a result, the weakening or destruction of the crystal lattice, which allows a substance to become amorphous. And the reason for such mysterious metamorphoses is precisely that dense matter makes up only a vanishing fraction of any substance, and everything else is emptiness.

Substances seem solid only because of the strong bonds between atoms; when these bonds weaken, the substance changes. Thus, the quantum theory of the structure of the atom allows us to take a completely different look at the world around us.

The founder of the theory of the atom, Niels Bohr, put forward the interesting concept that the electrons in an atom do not radiate energy constantly, but only at the moment of transition between the trajectories of their motion. Bohr's theory helped explain many intra-atomic processes and also made a breakthrough in chemistry by explaining the boundary of the table created by Mendeleev. According to it, the last element able to exist in time and space has atomic number one hundred and thirty-seven, and elements from the one hundred and thirty-eighth onward cannot exist, since their existence would contradict the theory of relativity. Bohr's theory also explained the nature of such a physical phenomenon as atomic spectra.

These are the emission spectra of free atoms, arising when atoms radiate energy. Such phenomena are typical of gaseous and vaporous substances and of substances in the plasma state. Thus, quantum theory made a revolution in the world of physics and allowed scientists to advance not only in this science but also in many related sciences: chemistry, thermodynamics, optics and philosophy. It also allowed humanity to penetrate the secrets of the nature of things.
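A rough illustration of the element-137 limit mentioned above (a common back-of-the-envelope argument, not spelled out in this article): in the simple Bohr model the innermost electron of element Z moves at speed v = Zαc, where α ≈ 1/137 is the fine-structure constant, so around Z = 137 it would reach the speed of light.

```python
alpha = 1 / 137.035999  # fine-structure constant (dimensionless)

def innermost_electron_speed_fraction(Z):
    """v/c of the 1s electron in the naive (non-relativistic) Bohr model."""
    return Z * alpha

for Z in (1, 79, 137, 138):
    print(Z, round(innermost_electron_speed_fraction(Z), 4))
# Z = 138 gives v/c > 1, which relativity forbids - the naive
# model breaks down around element 137
```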

Humanity still has much work to do before it comprehends the nature of atoms and understands the principles of their behavior and interaction. Having understood this, we will be able to understand the nature of the world around us, because everything that surrounds us, from dust particles to the Sun itself, and we ourselves - everything consists of atoms, whose nature is mysterious, amazing and fraught with many secrets.

QUANTUM THEORY

theory, the foundations of which were laid in 1900 by the physicist Max Planck. According to this theory, atoms always emit or absorb radiant energy only in portions, discontinuously - in definite quanta (energy quanta), whose energy equals the oscillation frequency (the speed of light divided by the wavelength) of the corresponding type of radiation, multiplied by Planck's quantum of action (see Planck constant, Microphysics, as well as Quantum mechanics). The quantum was placed (chiefly by Einstein) at the basis of the quantum theory of light (the corpuscular theory of light), according to which light itself also consists of quanta moving at the speed of light (light quanta, photons).

Philosophical Encyclopedic Dictionary, 2010.
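In modern notation, the relation described in words in the entry above reads:

$$E = h\nu = \frac{hc}{\lambda},$$

where $h$ is Planck's constant, $\nu$ the frequency and $\lambda$ the wavelength of the radiation.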


See what "QUANTUM THEORY" is in other dictionaries:

    It has the following subsections (the list is incomplete): Quantum mechanics Algebraic quantum theory Quantum field theory Quantum electrodynamics Quantum chromodynamics Quantum thermodynamics Quantum gravity Superstring theory See also ... ... Wikipedia

    QUANTUM THEORY, a theory that, in combination with the theory of RELATIVITY, formed the basis for the development of physics throughout the entire 20th century. It describes the relationship between SUBSTANCE and ENERGY at the level of ELEMENTARY or subatomic PARTICLES, as well as ... ... Scientific and technical encyclopedic dictionary

    quantum theory- Another way of research is the study of the interaction of matter and radiation. The term "quantum" is associated with the name of M. Planck (1858 1947). This is the “black body” problem (an abstract mathematical concept for an object that accumulates all energy ... Western philosophy from its origins to the present day

    Combines quantum mechanics, quantum statistics and quantum field theory... Big Encyclopedic Dictionary

    Combines quantum mechanics, quantum statistics and quantum field theory. * * * QUANTUM THEORY QUANTUM THEORY combines quantum mechanics (see QUANTUM MECHANICS), quantum statistics (see QUANTUM STATISTICS) and quantum field theory ... ... encyclopedic Dictionary

    quantum theory- kvantinė teorija statusas T sritis fizika atitikmenys: angl. quantum theory vok. Quantentheorie, f rus. quantum theory, fpranc. theorie des quanta, f; theorie quantique, f … Fizikos terminų žodynas

    Phys. a theory that combines quantum mechanics, quantum statistics and quantum field theory. This is based on the idea of ​​a discrete (discontinuous) structure of radiation. According to K. t., any atomic system can be in certain, ... ... Natural science. encyclopedic Dictionary

    Quantum field theory is the quantum theory of systems with an infinite number of degrees of freedom (physical fields). Quantum mechanics, which arose as a generalization of quantum mechanics (See Quantum mechanics) in connection with the problem of description ... ... Great Soviet Encyclopedia

    - (KFT), relativistic quantum. theory of physics. systems with an infinite number of degrees of freedom. An example of such an email system. magn. field, for a complete description of the horn at any time, the assignment of electric strengths is required. and magn. fields at each point ... Physical Encyclopedia

    QUANTUM FIELD THEORY. Contents:1. Quantum fields .................. 3002. Free fields and wave-particle duality .................. 3013. Interaction fields.........3024. Perturbation theory .............. 3035. Divergences and ... ... Physical Encyclopedia


a) Background of quantum theory

At the end of the 19th century, the failure of attempts to create a theory of black-body radiation based on the laws of classical physics became evident. It followed from the laws of classical physics that a substance should emit electromagnetic waves at any temperature, lose energy, and lower its temperature to absolute zero. In other words, thermal equilibrium between matter and radiation was impossible. But this was at odds with everyday experience.

This can be explained in more detail as follows. There is the concept of a perfectly black body - a body that absorbs electromagnetic radiation of any wavelength. Its emission spectrum is determined by its temperature. There are no perfectly black bodies in nature; the closest realization is a closed opaque hollow body with a small hole. Any piece of matter glows when heated, and with a further increase in temperature it becomes first red and then white. The color hardly depends on the substance; for a perfectly black body it is determined solely by its temperature. Imagine such a closed cavity, maintained at a constant temperature, containing material bodies capable of emitting and absorbing radiation. If the temperature of these bodies at the initial moment differs from the temperature of the cavity, then over time the system (cavity plus bodies) tends to thermodynamic equilibrium, characterized by a balance between the energy absorbed and emitted per unit time. G. Kirchhoff established that this equilibrium state is characterized by a certain spectral distribution of the energy density of the radiation in the cavity, and also that the function determining this spectral distribution (the Kirchhoff function) depends on the temperature of the cavity and does not depend on the size or shape of the cavity, nor on the properties of the material bodies placed in it. Since the Kirchhoff function is universal, i.e. the same for any black body, the assumption arose that its form is determined by some provisions of thermodynamics and electrodynamics. However, such attempts proved untenable. It followed from Rayleigh's law that the spectral density of the radiation energy should increase monotonically with increasing frequency, but experiment testified otherwise: at first the spectral density increased with frequency, and then fell. Solving the problem of black-body radiation required a fundamentally new approach. It was found by M. Planck.
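A minimal numerical sketch of that discrepancy (standard formulas for the spectral energy density; the constants and sample frequencies are my own illustrative choices):

```python
import math

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def rayleigh_jeans(nu, T):
    """Classical spectral energy density: grows without bound with frequency."""
    return 8 * math.pi * nu**2 * k * T / c**3

def planck(nu, T):
    """Planck's spectral energy density: rises, peaks, then falls."""
    return (8 * math.pi * h * nu**3 / c**3) / math.expm1(h * nu / (k * T))

T = 5000.0  # kelvin
for nu in (1e13, 1e14, 1e15):
    print(f"{nu:.0e} Hz  classical: {rayleigh_jeans(nu, T):.2e}  Planck: {planck(nu, T):.2e}")
# at high frequencies the classical value keeps growing (the
# "ultraviolet catastrophe"), while Planck's curve falls off
```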

Planck in 1900 formulated a postulate according to which a substance can emit radiation energy only in finite portions proportional to the frequency of this radiation (see the section "The Emergence of Atomic and Nuclear Physics"). This concept led to a change in the traditional propositions underlying classical physics. The existence of a discrete quantum of action indicated a relationship between the localization of an object in space and time and its dynamic state. L. de Broglie emphasized that "from the point of view of classical physics, this connection seems completely inexplicable and, in terms of the consequences to which it leads, far more incomprehensible than the connection between the space and time variables established by the theory of relativity." The quantum concept was destined to play a huge role in the development of physics.

The next step in the development of the quantum concept was the extension of Planck's hypothesis by A. Einstein, which allowed him to explain the regularities of the photoelectric effect that did not fit into the framework of the classical theory. The essence of the photoelectric effect is the emission of fast electrons by a substance under the influence of electromagnetic radiation. The energy of the emitted electrons does not depend on the intensity of the absorbed radiation and is determined by its frequency and the properties of the substance, while the number of emitted electrons does depend on the intensity of the radiation. It was not possible to explain the mechanism of the electrons' release, since, in accordance with the wave theory, a light wave incident on an electron continuously transfers energy to it, and the amount transferred per unit time should be proportional to the intensity of the wave. Einstein in 1905 suggested that the photoelectric effect testifies to the discrete structure of light, i.e. that radiated electromagnetic energy propagates and is absorbed like a particle (later called a photon). The intensity of the incident light is then determined by the number of light quanta falling on one square centimeter of the illuminated plane per second; hence the number of electrons ejected from a unit surface per unit time should be proportional to the light intensity. Repeated experiments confirmed this explanation of Einstein's, not only with light but also with X-rays and gamma rays. The Compton effect, discovered in 1923, gave new evidence for the existence of photons: elastic scattering of electromagnetic radiation of short wavelengths (X-ray and gamma radiation) on free electrons was discovered, accompanied by an increase in wavelength. According to the classical theory, the wavelength should not change in such scattering. The Compton effect confirmed the correctness of quantum ideas about electromagnetic radiation as a stream of photons: it can be considered as an elastic collision of a photon and an electron, in which the photon transfers part of its energy to the electron, so that its frequency decreases and its wavelength increases.
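Both effects reduce to short formulas; here is a hedged sketch of the two (standard constants; the sodium-like work function and the 60-degree scattering angle are my own example values):

```python
import math

h = 6.626e-34      # Planck constant, J*s
c = 2.998e8        # speed of light, m/s
m_e = 9.109e-31    # electron mass, kg
eV = 1.602e-19     # joules per electronvolt

def photoelectron_energy(nu, work_function_eV):
    """Einstein's photoelectric equation K = h*nu - W, in eV (None below threshold)."""
    K = h * nu / eV - work_function_eV
    return K if K > 0 else None

def compton_shift(theta):
    """Compton wavelength shift in meters: (h / m_e c) * (1 - cos theta)."""
    return (h / (m_e * c)) * (1 - math.cos(theta))

print(photoelectron_energy(1e15, 2.3))  # ~1.8 eV above a 2.3 eV work function
print(compton_shift(math.pi / 3))       # ~1.2e-12 m at a 60 degree angle
```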

There were other confirmations of the photon concept. The theory of the atom by N. Bohr (1913) turned out to be especially fruitful, revealing the connection between the structure of matter and the existence of quanta and establishing that the energy of intra-atomic motions can also change only abruptly. Thus, the recognition of the discrete nature of light took place. But in essence it was a revival of the previously rejected corpuscular concept of light. Therefore, problems arose quite naturally: how to combine the discreteness of the structure of light with the wave theory (especially since the wave theory of light was confirmed by a number of experiments), how to combine the existence of a light quantum with the phenomenon of interference, how to explain the phenomena of interference from the standpoint of the quantum concept? Thus, a need arose for a concept that would link the corpuscular and wave aspects of radiation.

b) The correspondence principle

To eliminate the difficulty that arose when using classical physics to justify the stability of atoms (recall that the loss of energy by an electron leads to its fall into the nucleus), Bohr assumed that an atom in a stationary state does not radiate (see the previous section). This meant that the electromagnetic theory of radiation was not suitable for describing electrons moving along stable orbits. But the quantum concept of the atom, having abandoned the electromagnetic concept, could not explain the properties of radiation. The task arose: to try to establish a certain correspondence between quantum phenomena and the equations of electrodynamics in order to understand why the classical electromagnetic theory gives a correct description of large-scale phenomena. In the classical theory, an electron moving in an atom emits continuously and simultaneously light of different frequencies. In quantum theory, on the contrary, an electron located inside an atom in a stationary orbit does not radiate - the radiation of a quantum occurs only at the moment of transition from one orbit to another, i.e. the emission of spectral lines of a certain element is a discrete process. Thus, there are two completely different views. Can they be harmonized, and if so, in what form?

It is obvious that correspondence with the classical picture is possible only if all spectral lines are emitted simultaneously. At the same time, it is obvious that from the quantum point of view the emission of each quantum is an individual act, and therefore, in order to obtain the simultaneous emission of all spectral lines, one must consider a whole large ensemble of atoms of the same nature, in which various individual transitions occur, leading to the emission of the various spectral lines of a particular element. In this case, the concept of the intensity of the various lines of the spectrum must be represented statistically. To determine the intensity of an individual quantum emission, it is necessary to consider an ensemble of a large number of identical atoms. The electromagnetic theory makes it possible to describe macroscopic phenomena, and the quantum theory those phenomena in which many quanta play an important role. Therefore, it is quite probable that the results obtained by quantum theory will tend to the classical ones in the region of many quanta. The agreement between classical and quantum theories is to be sought in this region. To compare the classical and quantum frequencies, it is necessary to find out whether these frequencies coincide for stationary states corresponding to large quantum numbers. Bohr suggested that for an approximate calculation of the real intensity and polarization one can use the classical estimates of intensities and polarizations, extrapolating to the region of small quantum numbers the correspondence established for large quantum numbers. This correspondence principle has been confirmed: the physical results of quantum theory at large quantum numbers should coincide with the results of classical mechanics, just as relativistic mechanics at low speeds passes into classical mechanics. A generalized formulation of the correspondence principle can be expressed as the statement that a new theory which claims a wider range of applicability than the old one should include the latter as a special case. The use of the correspondence principle, and the process of giving it a more precise form, contributed to the creation of quantum and wave mechanics.
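A small numerical check of this statement for the hydrogen atom (a standard exercise, not the author's; the Rydberg frequency is the usual textbook constant): in the Bohr model the frequency of the photon emitted in the n → n−1 transition approaches the classical orbital frequency of the electron as n grows.

```python
R = 3.2898e15  # Rydberg frequency, Hz (approximate CODATA value)

def bohr_transition_freq(n):
    """Frequency of the photon emitted in the n -> n-1 transition of hydrogen."""
    return R * (1.0 / (n - 1) ** 2 - 1.0 / n ** 2)

def classical_orbit_freq(n):
    """Classical orbital frequency of the electron in the n-th Bohr orbit."""
    return 2.0 * R / n ** 3

for n in (2, 10, 100, 1000):
    print(n, round(bohr_transition_freq(n) / classical_orbit_freq(n), 4))
# the ratio tends to 1 as n grows: quantum predictions merge
# with classical ones at large quantum numbers
```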

By the end of the first quarter of the 20th century, two concepts had emerged in studies of the nature of light - wave and corpuscular - which remained unable to overcome the gap separating them. There was an urgent need to create a new concept in which quantum ideas would form the basis rather than act as a kind of "appendage." This need was met by the creation of wave mechanics and quantum mechanics, which essentially constituted a single new quantum theory - the difference lay in the mathematical languages used. Quantum theory, as a non-relativistic theory of the motion of microparticles, was the deepest and broadest physical concept explaining the properties of macroscopic bodies. It was based on the idea of Planck-Einstein-Bohr quantization and on de Broglie's hypothesis of matter waves.

c) Wave mechanics

Its main ideas appeared in 1923-1924, when L. de Broglie, inspired by the analogy with light, expressed the idea that the electron must also have wave properties. By this time, ideas about the discrete nature of radiation and the existence of photons had already become sufficiently strong, and so, to describe the properties of radiation fully, one had to represent it alternately as a particle or as a wave. And since Einstein had already shown that the dualism of radiation is connected with the existence of quanta, it was natural to raise the question of the possibility of discovering such dualism in the behavior of the electron (and of material particles in general). De Broglie's hypothesis of matter waves was confirmed by the phenomenon of electron diffraction discovered in 1927: it turned out that an electron beam gives a diffraction pattern. (Later, diffraction would also be found for molecules.)
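A hedged sketch of de Broglie's relation λ = h/p for an electron accelerated through a potential difference (the 100 V example is my own illustrative choice):

```python
import math

h = 6.626e-34     # Planck constant, J*s
m_e = 9.109e-31   # electron mass, kg
e = 1.602e-19     # elementary charge, C

def de_broglie_wavelength(voltage):
    """Non-relativistic de Broglie wavelength (m) of an electron accelerated through `voltage` volts."""
    p = math.sqrt(2 * m_e * e * voltage)  # momentum from kinetic energy e*V
    return h / p

print(de_broglie_wavelength(100.0))  # ~1.2e-10 m, comparable to atomic
# spacings - which is why electron beams diffract off crystals
```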

Based on de Broglie's idea of matter waves, E. Schrödinger in 1926 derived the basic equation of wave mechanics (which he called the wave equation), which makes it possible to determine the possible states of a quantum system and their change in time. The equation contained the so-called wave function ψ (the psi-function) describing the wave (in an abstract configuration space). Schrödinger gave a general rule for converting the classical equations into wave equations, which refer to a multidimensional configuration space rather than to the real three-dimensional one. The psi-function determines the probability density of finding a particle at a given point. Within the framework of wave mechanics, an atom can be represented as a nucleus surrounded by a peculiar cloud of probability: using the psi-function, one determines the probability of the electron's presence in a certain region of space.
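In modern notation (a standard form, not written out in the text above), the equation and the probabilistic meaning of ψ read:

$$i\hbar\,\frac{\partial\psi}{\partial t} = \hat{H}\psi, \qquad \rho(\mathbf{r},t) = |\psi(\mathbf{r},t)|^2,$$

where $\hat{H}$ is the Hamiltonian (energy) operator of the system and $\rho$ is the probability density of finding the particle at point $\mathbf{r}$ at time $t$.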

d) Quantum (matrix) mechanics.

Uncertainty principle

In 1925, W. Heisenberg developed his version of quantum theory in the form of matrix mechanics, starting from the correspondence principle. Faced with the fact that in the transition from the classical point of view to the quantum one all physical quantities must be decomposed and reduced to a set of individual elements corresponding to the various possible transitions of a quantum atom, he came to represent each physical characteristic of a quantum system by a table of numbers (a matrix). In doing so, he was consciously guided by the goal of constructing a phenomenological theory that excluded everything that cannot be observed directly. In this case, there is no need to introduce into the theory the position, velocity or trajectory of the electrons in the atom, since we can neither measure nor observe these characteristics; only those quantities should enter the calculations that are associated with actually observed stationary states, with the transitions between them, and with the radiation accompanying those transitions. In the matrices, the elements are arranged in rows and columns, each with two indices, one corresponding to the column number and the other to the row number. The diagonal elements (i.e., the elements whose indices coincide) describe a stationary state, and the off-diagonal elements (elements with different indices) describe transitions from one stationary state to another. The values of these elements are associated with the quantities characterizing the radiation during these transitions, obtained using the correspondence principle. In this way Heisenberg built a matrix theory in which all quantities should describe only observable phenomena. And although the presence in the apparatus of his theory of matrices representing the coordinates and momenta of electrons in atoms leaves doubts about the complete exclusion of unobservable quantities, Heisenberg managed to create a new quantum concept, which constituted a new step in the development of quantum theory; its essence is the replacement of the physical quantities of atomic theory by matrices - tables of numbers. The results obtained by the methods of wave and matrix mechanics turned out to be the same, so the two concepts enter the unified quantum theory as equivalent. The methods of matrix mechanics, due to their greater compactness, often lead to the desired results faster; the methods of wave mechanics are considered to agree better with the way physicists think and with their intuition. Most physicists use the wave method in their calculations and work with wave functions.
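A small numerical sketch of what "physical quantities as matrices" means, using the harmonic oscillator as the standard example (the choice of units with ħ = m = ω = 1 and the truncation size are mine; the matrices themselves are the textbook ones, not anything from this article):

```python
import numpy as np

N = 8  # truncate the infinite matrices to N x N

# annihilation operator in the energy (number) basis: a|n> = sqrt(n)|n-1>
a = np.diag(np.sqrt(np.arange(1, N)), k=1)
x = (a + a.T) / np.sqrt(2)       # position matrix (hbar = m = omega = 1)
p = 1j * (a.T - a) / np.sqrt(2)  # momentum matrix

comm = x @ p - p @ x
print(np.round(comm, 10))
# prints i on the diagonal, i.e. [x, p] = i*hbar - except the last
# diagonal element, which is an artifact of truncating the matrices
```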

Heisenberg formulated the uncertainty principle, according to which the coordinate and momentum of a particle cannot simultaneously take exact values. To predict the future position and speed of a particle, one must be able to measure its present position and speed accurately; but the more accurately the position of the particle (its coordinates) is measured, the less accurate the measurement of its velocity turns out to be.
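In its standard quantitative form (not given in the text above) the principle reads:

$$\Delta x \, \Delta p \ge \frac{\hbar}{2},$$

where $\Delta x$ and $\Delta p$ are the standard deviations of position and momentum and $\hbar$ is the reduced Planck constant.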

Although light radiation consists of waves, nevertheless, in accordance with Planck's idea, light behaves like a particle, because its emission and absorption occur in the form of quanta. The uncertainty principle, in turn, indicates that particles can behave like waves - they are, as it were, "smeared" in space, so one can speak not of their exact coordinates but only of the probability of detecting them in a certain region. Thus, quantum mechanics fixes wave-particle duality: in some cases it is more convenient to consider particles as waves, in others, on the contrary, waves as particles. Interference can be observed between two particle waves: if the crests of one wave coincide with the troughs of the other, they cancel each other out, and if the crests of one wave coincide with the crests of the other, they reinforce each other.

e) Interpretations of quantum theory.

Complementarity principle

The emergence and development of quantum theory led to a change in the classical ideas about the structure of matter, motion, causality, space, time, the nature of cognition, and so on, which contributed to a radical transformation of the picture of the world. The classical understanding of a material particle was characterized by its sharp separation from the environment, by its possession of its own motion and of a location in space. In quantum theory, a particle began to be represented as a functional part of the system in which it is included, not possessing both a coordinate and a momentum simultaneously. In the classical theory, motion was considered as the transfer of a particle, remaining identical to itself, along a certain trajectory; the dual nature of the motion of the particle necessitated the rejection of such a representation. Classical (dynamic) determinism gave way to probabilistic (statistical) determinism. If earlier the whole was understood as the sum of its constituent parts, quantum theory revealed the dependence of a particle's properties on the system in which it is included. The classical understanding of the cognitive process was associated with the knowledge of a material object as existing in itself. Quantum theory demonstrated the dependence of knowledge about an object on the research procedures. If the classical theory claimed to be complete, quantum theory from the very beginning unfolded as incomplete, based on a number of hypotheses whose meaning was far from clear at first, and therefore its main provisions received different interpretations.

Disagreements emerged primarily over the physical meaning of the duality of microparticles. De Broglie first put forward the concept of a pilot wave, according to which wave and particle coexist: the wave leads the particle. The real material formation that retains its stability is the particle, since it is precisely the particle that has energy and momentum. The wave carrying the particle controls the character of the particle's motion, and the amplitude of the wave at each point in space determines the probability of the particle's localization near that point. Schrödinger essentially solved the problem of the duality of the particle by removing it: for him the particle is a purely wave formation; in other words, the particle is that place in the wave where the greatest energy of the wave is concentrated. The interpretations of de Broglie and Schrödinger were essentially attempts to create visual models in the spirit of classical physics. However, this turned out to be impossible.

Heisenberg proposed an interpretation of quantum theory, proceeding (as shown earlier) from the fact that physics should use only concepts and quantities based on measurements. Heisenberg therefore abandoned the visual representation of the motion of an electron in an atom. Macro devices cannot give a description of the motion of a particle with simultaneous fixation of the momentum and coordinates (i.e. in the classical sense) due to the fundamentally incomplete controllability of the interaction of the device with the particle - due to the uncertainty relation, the measurement of the momentum does not make it possible to determine the coordinates and vice versa. In other words, due to the fundamental inaccuracy of measurements, the predictions of the theory can only be of a probabilistic nature, and the probability is a consequence of the fundamental incompleteness of information about the motion of a particle. This circumstance led to the conclusion about the collapse of the principle of causality in the classical sense, which assumed the prediction of exact values ​​of momentum and position. In the framework of quantum theory, therefore, we are not talking about errors in observation or experiment, but about a fundamental lack of knowledge, which is expressed using a probability function.

Heisenberg's interpretation of quantum theory was developed by Bohr and became known as the Copenhagen interpretation. Within this interpretation, the main provision of quantum theory is the principle of complementarity, meaning the requirement to use mutually exclusive classes of concepts, instruments and research procedures, each applied in its specific conditions and complementing the others, in order to obtain a holistic picture of the object under study in the process of cognition. This principle is reminiscent of the Heisenberg uncertainty relation: if we speak of the determination of momentum and coordinate as mutually exclusive and complementary research procedures, there are grounds for identifying the two principles. However, the meaning of the complementarity principle is wider than that of the uncertainty relations. In order to explain the stability of the atom, Bohr combined classical and quantum ideas about the motion of the electron in one model. The principle of complementarity thus allowed classical representations to be supplemented with quantum ones. Having revealed the opposition of the wave and corpuscular properties of light and not finding their unity, Bohr leaned towards the idea of two mutually equivalent methods of description - wave and corpuscular - with their subsequent combination. So it is more accurate to say that the principle of complementarity is a development of the uncertainty relation, which expresses the relationship of coordinate and momentum.

A number of scientists interpreted the violation of the principle of classical determinism within the framework of quantum theory in favor of indeterminism. In fact, the principle of determinism here changed its form. Within the framework of classical physics, if at the initial moment of time the positions and states of motion of the elements of a system are known, it is possible to predict its position completely at any future moment of time. All macroscopic systems were subject to this principle. Even in those cases where it was necessary to introduce probabilities, it was always assumed that all elementary processes are strictly deterministic and that only their large number and disorderly behavior force one to resort to statistical methods. In quantum theory, the situation is fundamentally different. To implement the principle of determinism, one would need to know the coordinates and momenta, and this is forbidden by the uncertainty relation. The use of probability here has a different meaning compared with statistical mechanics: if in statistical mechanics probabilities were used to describe large-scale phenomena, in quantum theory probabilities are, on the contrary, introduced to describe the elementary processes themselves. All this means that in the world of large-scale bodies the dynamical principle of causality operates, and in the microcosm the probabilistic principle of causality.

The Copenhagen interpretation presupposes, on the one hand, the description of experiments in terms of classical physics and, on the other, the recognition that these concepts do not correspond exactly to the actual state of affairs. It is this inconsistency that gives rise to the probabilistic character of quantum theory. The concepts of classical physics form an important part of natural language: if we do not use these concepts to describe our experiments, we will not be able to understand one another.

The ideal of classical physics is the complete objectivity of knowledge. But in cognition we use instruments, and thus, as Heisenberg says, a subjective element is introduced into the description of atomic processes, since the instrument is created by the observer. "We must remember that what we observe is not nature itself, but nature exposed to our way of asking questions. Scientific work in physics consists in asking questions about nature in the language we use and trying to get an answer in an experiment carried out with the means at our disposal. This brings to mind Bohr's words about quantum theory: if we are looking for harmony in life, we must never forget that in the game of life we are both spectators and participants. It is clear that in our scientific attitude to nature, our own activity becomes important where we have to deal with areas of nature that can be penetrated only by the most sophisticated technical means."

Classical representations of space and time also proved impossible to use for describing atomic phenomena. Here is what another creator of quantum theory wrote about this: "The existence of the quantum of action revealed a completely unforeseen connection between geometry and dynamics: it turns out that the possibility of localizing physical processes in geometric space depends on their dynamic state. The general theory of relativity has already taught us to consider the local properties of space-time as depending on the distribution of matter in the universe. However, the existence of quanta requires a much deeper transformation and no longer allows us to represent the motion of a physical object along a certain line in space-time (the world line). Now it is impossible to determine the state of motion from a curve depicting the successive positions of an object in space over time. Now we need to consider the dynamic state not as a consequence of spatio-temporal localization, but as an independent and additional aspect of physical reality."

Discussions on the problem of interpretation of quantum theory have exposed the question of the very status of quantum theory - whether it is a complete theory of the motion of a microparticle. The question was first formulated in this way by Einstein. His position was expressed in the concept of hidden parameters. Einstein proceeded from the understanding of quantum theory as a statistical theory that describes the patterns related to the behavior of not a single particle, but their ensemble. Each particle is always strictly localized and simultaneously has certain values ​​of momentum and position. The uncertainty relation does not reflect the real structure of reality at the level of microprocesses, but the incompleteness of quantum theory - it’s just that at its level we are not able to simultaneously measure momentum and coordinate, although they actually exist, but as hidden parameters (hidden within the framework of quantum theory). Einstein considered the description of the state of a particle with the help of the wave function to be incomplete, and therefore he presented the quantum theory as an incomplete theory of the motion of a microparticle.

Bohr took the opposite position in this discussion, proceeding from the recognition of the objective uncertainty of the dynamic parameters of a microparticle as the reason for the statistical nature of quantum theory. In his opinion, Einstein's denial of the existence of objectively uncertain quantities leaves unexplained the wave features inherent in a microparticle. A return to the classical concepts of microparticle motion was considered impossible by Bohr.

In the 1950s, D. Bohm returned to de Broglie's pilot-wave concept, presenting the psi-wave as a real field associated with the particle. Supporters of the Copenhagen interpretation of quantum theory, and even some of its opponents, did not support Bohm's position; nevertheless, it contributed to a more in-depth study of de Broglie's concept: the particle began to be considered as a special formation that arises and moves in the psi-field but retains its individuality. The works of J.-P. Vigier and L. Jánossy, who developed this concept, were regarded by many physicists as too "classical".

In Russian philosophical literature of the Soviet period, the Copenhagen interpretation of quantum theory was criticized for "adherence to positivist attitudes" in the interpretation of the process of cognition. However, a number of authors defended the validity of the Copenhagen interpretation of quantum theory. The replacement of the classical ideal of scientific cognition with a non-classical one was accompanied by the understanding that the observer, trying to build a picture of an object, cannot be distracted from the measurement procedure, i.e. the researcher is unable to measure the parameters of the object under study as they were before the measurement procedure. W. Heisenberg, E. Schrödinger and P. Dirac put the principle of uncertainty at the basis of quantum theory, in which particles no longer had definite and mutually independent momentum and coordinates. Quantum theory thus introduced an element of unpredictability and randomness into science. And although Einstein could not agree with this, quantum mechanics was consistent with experiment, and therefore became the basis of many areas of knowledge.

f) Quantum statistics

Simultaneously with the development of wave and quantum mechanics, another component of quantum theory developed - quantum statistics, or the statistical physics of quantum systems consisting of a large number of particles. On the basis of the classical laws of motion of individual particles, a theory of the behavior of their aggregates - classical statistics - had been created. Similarly, on the basis of the quantum laws of particle motion, quantum statistics was created, describing the behavior of macro-objects in cases where the laws of classical mechanics are not applicable to the motion of their constituent microparticles; in this case, quantum properties show up in the properties of the macro-objects. It is important to keep in mind that a system here means only particles that interact with each other. A quantum system cannot be considered as a collection of particles that retain their individuality; in other words, quantum statistics requires the rejection of the notion of the distinguishability of particles - this is called the principle of identity. In atomic physics, two particles of the same nature had been considered identical, but this identity was not recognized as absolute: two particles of the same nature could be distinguished at least mentally.

In quantum statistics, the ability to distinguish between two particles of the same nature is completely absent. Quantum statistics proceeds from the fact that two states of a system, which differ from each other only by a permutation of two particles of the same nature, are identical and indistinguishable. Thus, the main position of quantum statistics is the principle of identity of identical particles included in a quantum system. This is where quantum systems differ from classical systems.

An important role in the interaction of microparticles belongs to spin - the intrinsic angular momentum of the microparticle. (In 1925, G. Uhlenbeck and S. Goudsmit first discovered the existence of electron spin.) The spin of electrons, protons, neutrons, neutrinos and other particles is expressed by a half-integer value; for photons and pi-mesons, by an integer value (1 or 0). Depending on its spin, a microparticle obeys one of two different types of statistics. Systems of identical particles with integer spin (bosons) obey Bose-Einstein quantum statistics, whose characteristic feature is that an arbitrary number of particles can occupy each quantum state. (This type of statistics was proposed in 1924 by S. Bose and then improved by Einstein.) In 1925, for particles with half-integer spin (fermions), E. Fermi and P. Dirac independently proposed another type of quantum statistics, named Fermi-Dirac statistics. Its characteristic feature is that no more than one particle can occupy each quantum state. This requirement is called the Pauli exclusion principle, which was discovered in 1925. Statistics of the first type are confirmed in the study of such objects as the black body; of the second type, in the electron gas in metals, nucleons in atomic nuclei, and so on.
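A hedged sketch of the two occupation-number formulas just described (the standard textbook expressions; the choice of units with k = 1, chemical potential μ = 0, and the sample energies are mine):

```python
import numpy as np

def bose_einstein(E, mu=0.0, kT=1.0):
    """Mean number of bosons in a state of energy E: can be arbitrarily large."""
    return 1.0 / (np.exp((E - mu) / kT) - 1.0)

def fermi_dirac(E, mu=0.0, kT=1.0):
    """Mean number of fermions in a state of energy E: never exceeds 1
    (the Pauli exclusion principle)."""
    return 1.0 / (np.exp((E - mu) / kT) + 1.0)

E = np.linspace(0.5, 3.0, 6)  # sample energies in units of kT
print(np.round(bose_einstein(E), 3))  # exceeds 1 at low energies
print(np.round(fermi_dirac(E), 3))    # always between 0 and 1
```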

The Pauli principle made it possible to explain the regularities in the filling of electron shells in many-electron atoms and to give a rationale for Mendeleev's periodic system of the elements. This principle expresses a specific property of the particles that obey it. Even now it is difficult to understand why two identical particles mutually forbid each other from occupying the same state: no such interaction exists in classical mechanics. What its physical nature is, what the physical sources of the prohibition are - this is a problem still awaiting resolution. One thing is clear today: a physical interpretation of the exclusion principle within the framework of classical physics is impossible.

An important conclusion of quantum statistics is the proposition that a particle included in some system is not identical to the same particle included in a system of a different type, or to a free one. Hence the importance of the task of identifying the specific material carrier of a given property of a system.

g) Quantum field theory

Quantum field theory is the extension of quantum principles to the description of physical fields in their interactions and mutual transformations. Quantum mechanics deals with the description of relatively low-energy interactions in which the number of interacting particles is conserved. At high interaction energies of the simplest particles (electrons, protons, etc.), their interconversion occurs, i.e. some particles disappear, others are born, and their number changes. Most elementary particles are unstable and decay spontaneously until stable particles are formed - protons, electrons, photons and neutrinos. In collisions of elementary particles, if the energy of the interacting particles is large enough, a multiple production of particles of the most varied kinds takes place. Since quantum field theory is intended to describe processes at high energies, it must satisfy the requirements of the theory of relativity.

Modern quantum field theory includes three types of interaction of elementary particles: weak interactions, which mainly determine the decay of unstable particles, strong and electromagnetic, responsible for the transformation of particles during their collision.

Quantum field theory, which describes the transformations of elementary particles (unlike quantum mechanics, which describes their motion), is not consistent and complete: it is full of difficulties and contradictions. The most radical way to overcome them is the creation of a unified field theory, which should be based on a single law of interaction of primary matter: the spectrum of masses and spins of all elementary particles, as well as the values of particle charges, should be derived from one general equation. Thus, it can be said that quantum field theory sets itself the task of achieving a deeper understanding of the elementary particle as arising from the field of a system of other elementary particles.

The interaction of an electromagnetic field with charged particles (mainly electrons, positrons, muons) is studied by quantum electrodynamics, which is based on the concept of the discreteness of electromagnetic radiation. The electromagnetic field consists of photons with corpuscular-wave properties. The interaction of electromagnetic radiation with charged particles is considered by quantum electrodynamics as the absorption and emission of photons by particles. A particle can emit photons and then absorb them.

So, the departure of quantum physics from classical physics consists in abandoning the description of individual events in space and time and in using the statistical method with its probability waves. The goal of classical physics is to describe objects in space and time and to formulate the laws that govern the change of these objects in time. Quantum physics, dealing with radioactive decay, diffraction, the emission of spectral lines and the like, cannot be satisfied with the classical approach. A judgment of the form "such and such an object has such and such a property," characteristic of classical mechanics, is replaced in quantum physics by a judgment of the form "such and such an object has such and such a property with such and such a probability." Thus, in quantum physics there are laws governing changes of probability in time, while in classical physics we deal with laws governing changes of an individual object in time. Different realities obey different laws.

Quantum physics occupies a special place in the development of physical ideas and the style of thinking in general. Among the greatest creations of the human mind is undoubtedly the theory of relativity - special and general, which is a new system of ideas that united mechanics, electrodynamics and the theory of gravity and gave a new understanding of space and time. But it was a theory which, in a certain sense, was the completion and synthesis of nineteenth-century physics, i.e. it did not mean a complete break with classical theories. Quantum theory, on the other hand, broke with classical traditions, it created a new language and a new style of thinking that allows one to penetrate into the microcosm with its discrete energy states and describe it by introducing characteristics that were absent in classical physics, which ultimately made it possible to understand the essence of atomic processes. But at the same time, quantum theory introduced an element of unpredictability and randomness into science, which is how it differed from classical science.

QUANTUM FIELD THEORY.

1. Quantum fields

2. Free fields and wave-particle duality

3. Interaction of fields

4. Perturbation theory

5. Divergences and renormalizations

6. UV asymptotics and the renormalization group

7. Gauge fields

8. The big picture

9. Prospects and problems

Quantum field theory (QFT) is the quantum theory of relativistic systems with an infinitely large number of degrees of freedom (relativistic fields); it serves as the theoretical basis for the description of microparticles, their interactions, and their transformations.

1. Quantum fields A quantum (or quantized) field is a kind of synthesis of the concepts of the classical field of the electromagnetic type and the probability field of quantum mechanics. According to modern notions, the quantum field is the most fundamental and universal form of matter, underlying all its concrete manifestations. The idea of the classical field arose in the depths of the Faraday-Maxwell theory of electromagnetism and finally crystallized in the course of the creation of the special theory of relativity, which required the abandonment of the ether as the material carrier of electromagnetic processes. The field thereby had to be considered not a form of motion of some medium but a specific form of matter with very unusual properties. Unlike particles, the classical field is continuously created and destroyed (emitted and absorbed by charges), possesses an infinite number of degrees of freedom, and is not localized at definite points of space-time but can propagate through it, transmitting a signal (interaction) from one particle to another with a finite speed not exceeding c. The emergence of quantum ideas led to a revision of the classical notions about the continuity of the mechanism of emission and absorption, and to the conclusion that these processes occur discretely, by the emission and absorption of quanta of the electromagnetic field, photons. The picture that arose, contradictory from the point of view of classical physics (photons set against the field, some phenomena interpretable only in terms of waves and others only with the help of the notion of quanta), was called wave-particle duality. This contradiction was resolved by the consistent application of the ideas of quantum mechanics to the field. The dynamical variables of the electromagnetic field, the potentials A, φ and the strengths of the electric and magnetic fields E, H, became quantum operators, subject to definite commutation relations and acting on the wave function (amplitude, or state vector) of the system. A new physical object thus arose: a quantum field that satisfies the equations of classical electrodynamics but has quantum-mechanical operators as its values. The second source of the general concept of the quantum field was the wave function of a particle ψ(x, t), which is not an independent physical quantity but the amplitude of the state of the particle: the probabilities of all physical quantities relating to the particle are expressed through quantities bilinear in ψ. Thus in quantum mechanics a new field, the field of probability amplitudes, turned out to be associated with each material particle. The relativistic generalization of the ψ-function led P. A. M. Dirac to the four-component wave function of the electron ψ_a (a = 1, 2, 3, 4), transforming according to the spinor representation of the Lorentz group. It was soon realized that, in general, each relativistic microparticle should be associated with a local field realizing a certain representation of the Lorentz group and having the physical meaning of a probability amplitude. Generalization to the case of many particles showed that if the particles satisfy the principle of indistinguishability (the identity principle), then one field in four-dimensional space-time, operator-valued in the quantum-mechanical sense, suffices to describe all of them. This is achieved by passing to a new quantum-mechanical representation, the occupation-number representation (or the representation of second quantization).
The operator field introduced in this way turns out to be completely analogous to the quantized electromagnetic field, differing from it only in the choice of the representation of the Lorentz group and, possibly, in the method of quantization. Like the electromagnetic field, one such field corresponds to the entire set of identical particles of a given type; for example, one operator Dirac field describes all the electrons (and positrons!) of the Universe. Thus arises a universal picture of the uniform structure of all matter. In place of the fields and particles of classical physics come unified physical objects, quantum fields in four-dimensional space-time, one for each kind of particle or (classical) field. The elementary act of any interaction becomes the interaction of several fields at one point of space-time or, in corpuscular language, the local and instantaneous transformation of some particles into others. The classical interaction, in the form of forces acting between particles, turns out to be a secondary effect resulting from the exchange of quanta of the field that transmits the interaction.
2. Free fields and wave-particle duality In accordance with the general physical picture outlined above, a systematic presentation of QFT can start from either the field or the corpuscular representation. In the field approach one must first construct the theory of the corresponding classical field, then subject it to quantization [in the manner of the quantization of the electromagnetic field by W. Heisenberg and W. Pauli], and finally develop a corpuscular interpretation of the resulting quantized field. The main initial concept here is the field u_a(x) (the index a enumerates the field components), defined at each space-time point x = (ct, x) and realizing some sufficiently simple representation of the Lorentz group. The further theory is most simply constructed with the help of the Lagrangian formalism; one chooses a local [i.e., depending only on the field components u_a(x) and their first derivatives ∂_μu_a(x) = ∂u_a/∂x^μ (μ = 0, 1, 2, 3) at a single point x] quadratic Poincaré-invariant (see Poincaré group) Lagrangian L(x) = L(u_a, ∂_μu_b) and obtains the equations of motion from the least-action principle. For a quadratic Lagrangian these equations are linear: free fields satisfy the superposition principle. By virtue of the Noether theorem, the invariance of the action S with respect to each one-parameter group entails the conservation (time-independence) of one integral function of u_a and ∂_μu_b, indicated explicitly by the theorem. Since the Poincaré group itself is 10-parametric, QFT necessarily conserves ten quantities, sometimes called the fundamental dynamical quantities: invariance with respect to the four translations in four-dimensional space-time entails the conservation of the four components of the energy-momentum vector P_μ, while invariance with respect to spatial rotations and Lorentz boosts entails the conservation of the three components of the angular momentum M_i = ½ ε_ijk M_jk and of the three so-called boosts N_i = c⁻¹M_{0i} (i, j, k = 1, 2, 3; ε_ijk is the unit fully antisymmetric tensor; doubly occurring indices imply summation). From the mathematical point of view, the ten fundamental quantities P_μ, M_i, N_i are the generators of the Poincaré group. If the action remains invariant also under certain other continuous transformations of the field under consideration that are not contained in the Poincaré group (transformations of internal symmetry), the Noether theorem entails the existence of new conserved dynamical quantities. Thus one often assumes the field functions to be complex, imposes on the Lagrangian the condition of Hermiticity (cf. Hermitian operator), and requires the invariance of the action with respect to the global gauge transformation (the phase α does not depend on x) u_a(x) → e^{iα}u_a(x), u*_a(x) → e^{−iα}u*_a(x). Then it turns out (as a consequence of the Noether theorem) that a charge Q is conserved.
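
The explicit form of this charge, in one standard set of conventions (sign and normalization differ from text to text), written through the canonical momenta p_a = ∂L/∂(∂₀u_a), is

Q = i ∫ d³x [ p_a(x, t) u_a(x, t) − p*_a(x, t) u*_a(x, t) ],   dQ/dt = 0.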

Therefore complex functions u_a can be used to describe charged fields. The same goal can be achieved by expanding the range of values run through by the index a, so that it also indicates a direction in isotopic space, and requiring the action to be invariant under rotations in that space. Note that the charge Q is not necessarily the electric charge; it can be any conserved characteristic of the field not related to the Poincaré group, for example lepton number, strangeness, or baryon number. Canonical quantization, according to the general principles of quantum mechanics, consists in declaring the generalized coordinates [i.e., the (infinite) set of values of all field components u₁, …, u_N at all points x of space at some instant of time t (in a more sophisticated presentation, at all points of some spacelike hypersurface σ)] and the generalized momenta p_b(x, t) = ∂L/∂(∂₀u_b(x, t)) to be operators acting on the state amplitude (state vector) of the system, with commutation relations imposed on them:

[u_a(x, t), p_b(y, t)]_± = i δ_ab δ(x − y),   [u_a(x, t), u_b(y, t)]_± = [p_a(x, t), p_b(y, t)]_± = 0,   (1)

where the sign "+" corresponds to Fermi-Dirac and "−" to Bose-Einstein quantization (see below); δ_ab is the Kronecker symbol and δ(x − y) the Dirac delta function. Owing to the distinguished role of time and the inevitable recourse to a specific frame of reference, the commutation relations (1) violate the explicit symmetry between space and time, and the preservation of relativistic invariance requires a special proof. Moreover, relations (1) say nothing about the commutation properties of the fields at timelike pairs of space-time points: the values of the fields at such points are causally dependent, and their commutators can be determined only by solving the equations of motion together with (1). For free fields, whose equations of motion are linear, this problem is solvable in general form and allows one to establish, in a relativistically symmetric form moreover, the commutation relations of the fields at two arbitrary points x and y:

[u_a(x), u_b(y)]_± = i P_ab(∂_x) D_m(x − y).   (2)

Here D_m is the Pauli-Jordan commutator function, satisfying the Klein-Gordon equation, P_ab is a polynomial in the derivatives ensuring that the right-hand side of (2) satisfies the equations of motion in x and in y, □ is the d'Alembert operator, and m is the mass of the field quantum (henceforth the system of units ℏ = c = 1 is used). In the corpuscular approach to the relativistic quantum description of free particles, the state vectors of a particle must form an irreducible representation of the Poincaré group. The latter is fixed by specifying the values of the Casimir operators (operators commuting with all ten generators of the group P_μ, M_i, and N_i), of which the Poincaré group has two. The first is the squared-mass operator m² = P_μP^μ. For m² ≠ 0 the second Casimir operator is the square of the ordinary (three-dimensional) spin, and for zero mass, the helicity operator (the projection of the spin onto the direction of motion). The spectrum of m² is continuous: the square of the mass can take any non-negative value, m² ≥ 0; the spin spectrum is discrete, with integer or half-integer values 0, 1/2, 1, … In addition, one must specify the behavior of the state vector under the reflection of an odd number of coordinate axes. If no other characteristics are required, the particle is said to have no internal degrees of freedom and is called a truly neutral particle. Otherwise the particle possesses charges of one kind or another. To fix the state of a particle within a representation, in quantum mechanics one must specify the values of a complete set of commuting operators. The choice of such a set is ambiguous; for a free particle it is convenient to take the three components of its momentum p and the spin projection σ onto some direction. Thus the state of one free truly neutral particle is completely characterized by the numbers m, s, p_x, p_y, p_z, σ, the first two of which fix the representation and the remaining four the state within it. For charged particles further quantum numbers are added; we denote them by the letter τ. In the occupation-number representation, the state of a collection of identical particles is fixed by the occupation numbers n_{p,s,τ} of all one-particle states (indices characterizing the representation as a whole are not written out). In turn, the state vector |n_{p,s,τ}⟩ is written as the result of the action on the vacuum state |0⟩ (the state with no particles at all) of the creation operators a⁺(p, s, τ):

|n_{p,s,τ}⟩ = const · Π [a⁺(p, s, τ)]^{n_{p,s,τ}} |0⟩.   (3)

The creation operators a⁺ and their Hermitian-conjugate annihilation operators a⁻ satisfy the commutation relations

[a⁻(p, s, τ), a⁺(p′, s′, τ′)]_± = δ_{pp′} δ_{ss′} δ_{ττ′},   [a⁻, a⁻]_± = [a⁺, a⁺]_± = 0,   (4)

where the signs "+" and "−" correspond respectively to Fermi-Dirac and Bose-Einstein quantization, and the occupation numbers are the eigenvalues of the particle-number operators. Thus the state vector of a system containing one particle each with quantum numbers p₁, s₁, τ₁; p₂, s₂, τ₂; …, is written as

|p₁, s₁, τ₁; p₂, s₂, τ₂; …⟩ = a⁺(p₁, s₁, τ₁) a⁺(p₂, s₂, τ₂) ⋯ |0⟩.
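
The occupation-number construction can be checked numerically on a single mode. The following sketch is illustrative only (a truncated Fock space; the truncation size N is an arbitrary choice): it verifies the Bose-Einstein case of relations (4) and builds a two-quantum state by applying a⁺ to the vacuum, as in (3).

```python
import numpy as np

N = 8  # truncation of the single-mode Fock space |0>, ..., |N-1>

# Annihilation operator: a-|n> = sqrt(n) |n-1>
a_m = np.diag(np.sqrt(np.arange(1, N)), k=1)
a_p = a_m.conj().T          # creation operator a+ = (a-)^dagger

# Bose-Einstein case of relation (4): the commutator [a-, a+] equals 1
comm = a_m @ a_p - a_p @ a_m
print(np.allclose(comm[:-1, :-1], np.eye(N - 1)))   # True (exact below the cut)

# Occupation numbers are eigenvalues of the particle-number operator a+ a-
n_op = a_p @ a_m
vac = np.zeros(N); vac[0] = 1.0                     # vacuum state |0>
state = a_p @ (a_p @ vac)                           # (a+)^2 |0>, i.e. n = 2
state /= np.linalg.norm(state)
print(np.allclose(n_op @ state, 2 * state))         # True
```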

To take account of the local properties of the theory it is necessary to translate the operators a^± into the coordinate representation. As transformation functions it is convenient to use the classical solutions of the equations of motion of a suitable free field with tensor (or spinor) indices a and internal-symmetry index τ. Then the creation and annihilation operators in the coordinate representation are

u_a^{(±)}(x) = Σ_{p,s,τ} u_a^{(±)}(p, s, τ; x) a^{±}(p, s, τ).   (5)


These operators, however, are still unsuitable for constructing a local QFT: both their commutator and their anticommutator are proportional not to the Pauli-Jordan function D_m but to its positive- and negative-frequency parts D^{±}_m(x − y) [D_m = D^{+}_m + D^{−}_m], which do not vanish for spacelike pairs of points x and y. To obtain a local field one must construct a superposition of the creation and annihilation operators (5). For truly neutral particles this can be done directly, by defining the local Lorentz-covariant field as
u_a(x) = u_a^{(+)}(x) + u_a^{(−)}(x).   (6)
But for charged particles this cannot be done: the operators a⁺_τ and a⁻_τ in (6) would one increase and the other decrease the charge, and their linear combination would not have definite properties in this respect. Therefore, to form a local field one has to pair the creation operators a⁺_τ with the annihilation operators not of the same particles but of new particles (marked with a tilde), realizing the same representation of the Poincaré group, i.e., having exactly the same mass and spin but differing from the original particles in the sign of the charge (the signs of all the charges τ), and to write:

u_a(x) = u_a^{(+)}(x) + ũ_a^{(−)}(x).   (7)

From the Pauli theorem it now follows that for fields of integer spin, whose field functions realize a single-valued representation of the Lorentz group, under Bose-Einstein quantization the commutators [u(x), u(y)]_− or [u(x), u*(y)]_− are proportional to the function D_m(x − y) and vanish outside the light cone, while for fields of half-integer spin, realizing two-valued representations, the same is achieved for the anticommutators [u(x), u(y)]_+ (or [u(x), u*(y)]_+) under Fermi-Dirac quantization. The connection, expressed by formulas (6) or (7), between the Lorentz-covariant field functions u or u, u* satisfying linear equations and the operators of creation and annihilation of free particles in stationary quantum-mechanical states is an exact mathematical description of wave-particle duality. The new particles "created" by the tilded operators, without which the local fields (7) could not be constructed, are called antiparticles with respect to the original ones. The inevitability of the existence of an antiparticle for each charged particle is one of the chief conclusions of the quantum theory of free fields.
3. Interaction of fields Solutions (6) and (7) of the free-field equations are proportional to operators of creation and annihilation of particles in stationary states, i.e., they can describe only situations in which nothing happens to the particles. To consider also cases in which some particles affect the motion of others or turn into others, one must make the equations of motion nonlinear, i.e., include in the Lagrangian, besides the terms quadratic in the fields, terms of higher degree as well. From the point of view of the theory developed so far, such interaction Lagrangians L_int could be any functions of the fields and their first derivatives satisfying only a few simple conditions: 1) locality of the interaction, which requires that L_int(x) depend on the field components u_a(x) and their first derivatives only at a single space-time point x; 2) relativistic invariance, for which L_int must be a scalar with respect to Lorentz transformations; 3) invariance under transformations of the internal-symmetry groups, if the model under consideration has any. For theories with complex fields this includes, in particular, the requirements that the Lagrangian be Hermitian and invariant under the gauge transformations admissible in such theories. One may also require that the theory be invariant under certain discrete transformations, such as spatial inversion P, time reversal T, and charge conjugation C (the replacement of particles by antiparticles). It has been proved (the CPT theorem) that any interaction satisfying conditions 1)-3) must necessarily be invariant with respect to the simultaneous performance of these three discrete transformations. The variety of interaction Lagrangians satisfying conditions 1)-3) is as wide as, say, the variety of Lagrange functions in classical mechanics, and at a certain stage in the development of QFT it seemed that the theory gave no answer to the question of why nature realizes some of them and not others. However, after the idea of the renormalization of UV divergences arose (see Section 5 below) and was brilliantly implemented in quantum electrodynamics (QED), a preferred class of interactions, the renormalizable ones, was singled out. Condition 4), renormalizability, turns out to be very restrictive: adding it to conditions 1)-3) leaves only interactions with L_int in the form of polynomials of low degree in the fields under consideration, while fields of any high spins are excluded from consideration altogether. Thus interaction in a renormalizable QFT admits, in striking contrast to classical and quantum mechanics, no arbitrary functions: as soon as a specific set of fields is chosen, the arbitrariness in L_int is limited to a fixed number of interaction constants (coupling constants). The complete system of equations of QFT with interaction (in the Heisenberg representation) comprises the equations of motion obtained from the full Lagrangian (a coupled system of partial differential equations with nonlinear terms of interaction and self-action) and the canonical commutation relations (1). An exact solution of this problem can be found only in a small number of cases of little physical content (for example, for certain models in two-dimensional space-time). On the other hand, the canonical commutation relations, as already mentioned, violate the explicit relativistic symmetry, which becomes dangerous if one is content with an approximate solution instead of an exact one.
Therefore the practical value of quantization in the form (1) is small. The most widely used method is based on the transition to the interaction representation, in which the fields u_a(x) satisfy the linear equations of motion of free fields, and all the influence of the interaction and self-action is transferred to the temporal evolution of the state amplitude Φ, which is now not constant but changes in accordance with an equation of the Schrödinger type:

i ∂Φ(t)/∂t = H_int(t) Φ(t),   (8)

where the interaction Hamiltonian H_int(t) in this representation depends on time through the fields u_a(x), which obey the free equations and the relativistically covariant commutation relations (2); thus the explicit use of the canonical commutators (1) for the interacting fields turns out to be unnecessary. For comparison with experiment, the theory must solve the problem of particle scattering, in the formulation of which it is assumed that asymptotically, as t → −∞ (+∞), the system was (will be) in a stationary state Φ₋∞ (Φ₊∞), and that the Φ∓∞ are such that the particles in them do not interact because of their large mutual separations (see also Adiabatic hypothesis), so that all the mutual influence of the particles occurs only at finite times near t = 0 and transforms Φ₋∞ into Φ₊∞ = S Φ₋∞. The operator S is called the scattering matrix (or S-matrix); through the squares of its matrix elements

S_fi = ⟨Φ_f | S | Φ_i⟩   (9)

the probabilities of transitions from a given initial state Φ_i to some final state Φ_f, i.e., the effective cross sections of various processes, are expressed. Thus the S-matrix allows one to find the probabilities of physical processes without going into the details of the temporal evolution described by the amplitude Φ(t). Nonetheless, the S-matrix is usually constructed on the basis of equation (8), which admits a formal solution in compact form:
S = T exp( i ∫ L_int(x) dx )   (10)

using the operator T of chronological ordering, which arranges all field operators in decreasing order of the time t = x⁰ (see Chronological product). Expression (10), however, is rather a symbolic record of the procedure of successive integration of equation (8) from −∞ to +∞ over infinitesimal time intervals (t, t + Δt) than a usable solution. This can be seen at least from the fact that for a smooth calculation of the matrix elements (9) it is necessary to represent the scattering matrix in the form not of a chronological but of a normal product, in which all the creation operators stand to the left of the annihilation operators. The task of transforming the one product into the other constitutes the real difficulty and cannot be solved in general form.
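
The content of the chronological exponential can be illustrated on a finite-dimensional toy model. In the sketch below (an illustrative two-level "interaction Hamiltonian"; all names and values are assumptions of the example), the T-exponential is built, as just described, from successive evolutions over small time intervals and is compared with the naive unordered exponential, which differs from it precisely because H_int(t) at different times does not commute with itself.

```python
import numpy as np
from scipy.linalg import expm

# Toy time-dependent "interaction Hamiltonian" on a two-level system
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def H_int(t):
    return np.cos(t) * sx + 0.5 * np.sin(2 * t) * sz

def t_exp(t0, t1, steps=5000):
    """T-exponential as an ordered product of short-time evolutions,
    with later times standing to the left (chronological ordering)."""
    dt = (t1 - t0) / steps
    U = np.eye(2, dtype=complex)
    for k in range(steps):
        t = t0 + (k + 0.5) * dt
        U = expm(-1j * dt * H_int(t)) @ U
    return U

t0, t1 = 0.0, 3.0
U_T = t_exp(t0, t1)

# The unordered exponential of the time integral of H_int is NOT the same:
ts = np.linspace(t0, t1, 5001)
H_bar = sum(H_int(t) for t in ts[:-1]) * (ts[1] - ts[0])
U_naive = expm(-1j * H_bar)

print(np.linalg.norm(U_T - U_naive))                    # noticeably nonzero
print(np.linalg.norm(U_T @ U_T.conj().T - np.eye(2)))   # ~0: U_T is unitary
```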
4. Perturbation theory For this reason, for a constructive solution of the problem one has to resort to the assumption that the interaction is weak, i.e., that the interaction Lagrangian L_int is small. Then one can expand the chronological exponential in expression (10) in a perturbation-theory series, and the matrix elements (9) will be expressed in each order of perturbation theory through matrix elements not of chronological exponentials but of simple chronological products of the corresponding number of interaction Lagrangians:

S_n = (iⁿ/n!) ∫ dx₁ ⋯ dxₙ T{ L_int(x₁) ⋯ L_int(xₙ) }   (11)

(n is the order of perturbation theory), i.e., it becomes necessary to transform to normal form not exponentials but simple polynomials of a specific type. This task is carried out in practice with the help of the technique of Feynman diagrams and the Feynman rules. In the Feynman technique each field u_a(x) is characterized by its causal Green's function (propagator, or propagation function) D^c_{aa′}(x − y), depicted in the diagrams by a line, and each interaction by a coupling constant and a matrix factor from the corresponding term of L_int, shown in the diagram by a vertex. The popularity of the Feynman diagram technique is due, besides its ease of use, to its clarity. The diagrams make it possible to visualize, as it were, the processes of propagation (lines) and interconversion (vertices) of particles: real in the initial and final states and virtual in intermediate ones (on internal lines). Particularly simple expressions are obtained for the matrix elements of any process in the lowest order of perturbation theory, corresponding to the so-called tree diagrams, which have no closed loops: after the transition to the momentum representation no integrations remain in them at all. For the basic QED processes such expressions for the matrix elements were obtained at the dawn of QFT, in the late 1920s, and turned out to be in reasonable agreement with experiment (at the level of 10⁻²-10⁻³, i.e., of the order of the fine-structure constant α). However, attempts to calculate the radiative corrections (i.e., corrections from higher approximations) to these expressions, for example to the Klein-Nishina-Tamm formula (see Klein-Nishina formula) for Compton scattering, ran into specific difficulties. Such corrections correspond to diagrams with closed loops of virtual-particle lines, whose momenta are not fixed by the conservation laws, and the total correction equals the sum of the contributions from all possible momenta. It turned out that in most cases the integrals over the momenta of the virtual particles arising in the summation of these contributions diverge in the UV region, i.e., the corrections themselves turn out to be not only not small but infinite. By the uncertainty relation, large momenta correspond to small distances. One may therefore think that the physical origins of the divergences lie in the idea of the locality of the interaction. In this connection one can speak of an analogy with the infinite energy of the electromagnetic field of a point charge in classical electrodynamics.
5. Divergences and renormalizations Formally and mathematically, the appearance of divergences is due to the fact that the propagators D^c(x) are singular (more precisely, generalized) functions with singularities in the vicinity of the light cone x² ~ 0. The momentum-space Fourier transforms of the products of such functions that arise in the matrix elements, corresponding to closed loops in the diagrams, are therefore ill-defined from the mathematical point of view: they may not exist and may, formally, be expressed through divergent momentum integrals. For example, the Feynman integral

I(p) = ∫ d⁴k / [ (m² − k²)(m² − (p − k)²) ]   (12)
(where p is the external 4-momentum and k the integration momentum), corresponding to the simplest one-loop diagram with two internal scalar lines (see the figure), does not exist.

It is proportional to the Fourier transform of the square of the scalar-field propagator D^c(x) and diverges logarithmically at the upper limit (i.e., in the UV region of virtual momenta |k| → ∞), so that, for example, if the integral is cut off at the upper limit at |k| = Λ, then

I(p) = iπ² ln(Λ²/m²) + I_fin(p),   (13)

where I_fin(p) is a finite expression.
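
The logarithmic growth in (13) is easy to see numerically. The following sketch is illustrative: it evaluates the Euclidean version of integral (12) at p = 0, where the angular integration in d⁴k yields a factor 2π², for increasing cutoffs Λ.

```python
import numpy as np
from scipy.integrate import quad

m = 1.0  # mass of the field quantum (units hbar = c = 1)

def I_cut(cutoff):
    """Euclidean version of (12) at p = 0, cut off at |k| = cutoff:
    2*pi^2 * integral of k^3 / (k^2 + m^2)^2 from 0 to the cutoff."""
    integrand = lambda k: 2 * np.pi**2 * k**3 / (k**2 + m**2) ** 2
    val, _ = quad(integrand, 0.0, cutoff, limit=200)
    return val

for lam in [1e1, 1e2, 1e3, 1e4]:
    ratio = I_cut(lam) / (np.pi**2 * np.log(lam**2 / m**2))
    print(f"Lambda = {lam:8.0e}   I = {I_cut(lam):9.3f}   ratio = {ratio:.4f}")
# The ratio tends to 1: the integral grows like pi^2 * ln(Lambda^2 / m^2).
```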
The problem of UV divergences was solved (at least from the point of view of obtaining finite expressions for most physically interesting quantities) in the second half of the 1940s on the basis of the idea of renormalizations. The essence of the latter is that the infinite effects of quantum fluctuations corresponding to closed loops of diagrams can be separated into factors having the character of corrections to the initial characteristics of the system. As a result, the masses and coupling constants change under the influence of the interaction, i.e., they are renormalized; because of the UV divergences, the renormalizing additions turn out to be infinitely large. Therefore the renormalization relations

m₀ → m = m₀ + Δm = m₀ Z_m(…),

g₀ → g = g₀ + Δg = g₀ Z_g(…)   (14)

(where Z_m, Z_g are renormalization factors), linking the initial, so-called bare masses m₀ and bare charges (i.e., coupling constants) g₀ with the physical m, g, turn out to be singular. In order not to deal with meaningless infinite expressions, one or another auxiliary regularization of the divergences is introduced (similar to the cutoff at |k| = Λ used in (13)). In the arguments (indicated by dots on the right-hand sides of (14)) of the radiative corrections Δm, Δg and of the renormalization factors Z_i there appear, besides m₀ and g₀, singular dependences on the parameters of the auxiliary regularization. The divergences are eliminated by identifying the renormalized masses and charges m and g with their physical values. In practice, to eliminate the divergences one also often uses the method of introducing counterterms into the original Lagrangian, expressing m₀ and g₀ in the Lagrangian through the physical m and g by the formal relations inverse to (14). Expanding (14) in series in the physical interaction parameter:

m₀ = m + g M₁ + g² M₂ + …,   g₀ = g + g² G₁ + g³ G₂ + …,

one selects the singular coefficients M_l, G_l so as to compensate exactly the divergences arising in the Feynman integrals. The class of QFT models for which such a program can be carried out consistently in all orders of perturbation theory, and in which therefore all UV divergences without exception can be "removed" into the renormalization factors of the masses and coupling constants, is called the class of renormalizable theories. In theories of this class all matrix elements and Green's functions are, as a result, expressed in a nonsingular way through the physical masses, charges, and kinematic variables. In renormalizable models one may therefore, if desired, abstract completely from the bare parameters and the UV divergences considered separately, and fully characterize the results of theoretical calculations by specifying a finite number of physical values of masses and charges. The mathematical basis of this assertion is the Bogolyubov-Parasyuk theorem on renormalizability. From it follows a fairly simple recipe for obtaining finite single-valued expressions for the matrix elements, formalized in the form of the so-called Bogolyubov R-operation. At the same time, in nonrenormalizable models, an example of which is the now obsolete formulation in the form of a four-fermion local Fermi Lagrangian, it is not possible to "assemble" all the divergences into "aggregates" renormalizing the masses and charges. Renormalizable QFT models are characterized, as a rule, by dimensionless coupling constants, logarithmically divergent contributions to the renormalization of the coupling constants and fermion masses, and quadratically divergent radiative corrections to the masses of scalar particles (if present). For such models the renormalization procedure yields the renormalized perturbation theory, which serves as the basis for practical calculations. In renormalizable QFT models an important role is played by the renormalized Green's functions (dressed propagators) and the vertex parts, which include the interaction effects. They can be represented by infinite sums of terms corresponding to increasingly complicated Feynman diagrams with a fixed number and type of external lines. For such quantities one can give formal definitions either through vacuum expectation values of chronological products of field operators in the interaction representation and the S-matrix (which is equivalent to vacuum expectation values of T-products of the complete, i.e., Heisenberg, operators), or through functional derivatives of the generating functional Z(J), expressed through the so-called extended scattering matrix S(J), which depends functionally on auxiliary classical sources J_a(x) of the fields u_a(x). The formalism of generating functionals in QFT is analogous to the corresponding formalism of statistical physics. It makes it possible to obtain, for the complete Green's functions and the vertex functions, equations in functional derivatives, the Schwinger equations, from which in turn one can derive an infinite chain of integro-differential equations, the Dyson equations. The latter are analogous to the chain of equations for the correlation functions of statistical physics.
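
Choosing the singular coefficients order by order is ordinary inversion of a power series. A minimal symbolic sketch (a toy single-coupling model, assumed for illustration, in which the regularized "divergence" is a single logarithm L):

```python
import sympy as sp

g, g0, a, L, c1, c2 = sp.symbols('g g0 a L c1 c2')

# Toy relation between the physical and the bare charge, with the
# (regularized) divergent logarithm L multiplying the loop terms:
g_phys = g0 + a * L * g0**2 + a**2 * L**2 * g0**3

# Invert the series: seek g0 = g + c1*g^2 + c2*g^3 such that g_phys == g
ansatz = g + c1 * g**2 + c2 * g**3
diff = sp.expand(g_phys.subs(g0, ansatz)) - g
sol = sp.solve([diff.coeff(g, 2), diff.coeff(g, 3)], [c1, c2])
print(sol)   # {c1: -a*L, c2: a**2*L**2}

# The counterterm coefficients follow the geometric-progression pattern
# g0 = g/(1 + a*L*g) + O(g^4) that also underlies the invariant charge below.
```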
6. UV asymptotics and the renormalization group UV divergences in QFT are closely related to the high-energy asymptotics of the renormalized expressions. For example, the logarithmic divergence (12) of the simplest Feynman integral I(p) corresponds to the logarithmic asymptotics

I(p) ≈ iπ² ln(−p²/m²),   |p²| ≫ m²,

of the finite regularized integral (13), as well as of the corresponding renormalized expression. Since in renormalizable models with dimensionless coupling constants the divergences are mainly of logarithmic character, the UV asymptotics of l-loop integrals, as a rule (the exception is the case of double-logarithmic asymptotics), have the typical structure (gL)^l, where L = ln(−p²/m²), p is a "large" momentum, and m is some parameter of mass dimension arising in the course of the renormalization. Therefore, for sufficiently large |p²| the growth of the logarithm compensates the smallness of the coupling constant g, and the problem arises of determining the general term of a series of the form

f(g, L) = Σ_{l,m} a_{lm} g^l L^m   (m ≤ l)   (15)

and of summing such a series (the a_{lm} are numerical coefficients). The solution of these problems is facilitated by the renormalization-group method, which is based on the group character of finite transformations analogous to the singular renormalization relations (14) and of the transformations of the Green's functions accompanying them. In this way it is possible to sum effectively certain infinite sets of contributions of Feynman diagrams and, in particular, to represent the double expansions (15) as single ones:

f(g, L) = Σ_l g^l f_l(gL),

where the functions f_l have the characteristic form of geometric progressions or of combinations of a progression with its logarithm and exponential. Very essential here is that the condition of applicability of formulas of the type (15), which has the form g ≪ 1, gL ≪ 1, is replaced by a much weaker one: the smallness of the so-called invariant charge, which in the simplest (one-loop) approximation has the form of the sum of a geometric progression in the argument b₁gL, ḡ = g/(1 − b₁gL) (b₁ is a numerical coefficient). For example, in QED the invariant charge, proportional to the transverse part of the photon propagator d, in the one-loop approximation turns out to be

ᾱ(k²) = α / (1 − (α/3π) L),   (16)

where for k²/m² > 0 one has L = ln(k²/m²) + iπ (k is the 4-momentum of the virtual photon). This expression, which is the sum of the leading logarithms of the form α(αL)ⁿ, has a so-called ghost pole at k² = −m² e^{3π/α}, which contradicts the spectral representation for the photon propagator. The presence of this pole is closely related to the problem of the so-called zero charge, i.e., the vanishing of the renormalized charge at a finite value of the bare charge. The difficulty associated with the appearance of the ghost pole was sometimes interpreted even as a proof of the internal inconsistency of QED, and the transfer of this result to the traditional renormalizable models of the strong interaction of hadrons as an indication of the inconsistency of all of local QFT. However, such cardinal conclusions, drawn on the basis of the leading-logarithm formulas, turned out to be hasty. Already the inclusion of the next-to-leading contributions ~α²(αL)^m, which lead to the two-loop approximation, shows that the position of the pole shifts noticeably. A more general analysis within the renormalization-group method leads to the conclusion that formula (16) is applicable only in the region where the invariant charge remains small, i.e., that it is impossible either to prove or to refute the existence of the "pole contradiction" on the basis of one or another resummation of the series (15). Thus the paradox of the ghost-pole phenomenon (or of the renormalization of the charge to zero) turns out itself to be ghostly: to decide whether this difficulty really occurs in the theory would be possible only if unambiguous results could be obtained in the strong-coupling region. For now only the conclusion remains that, as applied to spinor QED, perturbation theory is, despite the unconditional smallness of the expansion parameter α, not a logically closed theory. For QED, however, this problem could be considered purely academic, since, according to (16), even at gigantic energies ~10¹⁵-10¹⁶ GeV, considered in modern models of the unification of interactions, the condition of smallness of the invariant charge is not violated. The situation looked much more serious in quantum mesodynamics, the theory of the interaction of pseudoscalar meson fields with fermionic nucleon fields, which until the 1960s was the only candidate for the role of a renormalizable model of the strong interaction. In it the effective coupling constant was large at ordinary energies, and the treatment by perturbation theory, clearly illegitimate, led to the same zero-charge difficulties. As a result of all the investigations described, a somewhat pessimistic point of view emerged on the future prospects of renormalizable QFT. From the purely theoretical side it seemed that the qualitative variety of such theories was negligible: for any renormalizable model, all interaction effects, for small coupling constants and moderate energies, reduced to an unobservable change of the characteristics of the free particles and to the occurrence of quantum transitions between states of such particles, to the lowest-approximation probabilities of which one could now compute (small) higher-order corrections. For large coupling constants or asymptotically large energies the existing theory, again regardless of the specific model, was inapplicable. QED remained the only (really brilliant) application to the real world satisfying these limitations.
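
The growth of (16) toward its ghost pole, and the opposite behavior of the asymptotically free formula (17) of the next section, differ only in the sign in front of the logarithm. A minimal numerical sketch (illustrative: conventional one-loop coefficients, and the reference values of the couplings are assumed inputs rather than data from this article):

```python
import numpy as np

alpha0 = 1 / 137.036    # QED coupling at the reference (electron-mass) scale
alphas0 = 0.30          # an assumed strong coupling at its own reference scale

def alpha_qed(L):
    """One-loop QED invariant charge of the type (16): grows with L."""
    return alpha0 / (1 - (alpha0 / (3 * np.pi)) * L)

def alpha_qcd(L, b0=7 / (4 * np.pi)):
    """One-loop charge of the type (17): b0 > 0, so the coupling falls with L.
    b0 = (33 - 2*n_f)/(12*pi); n_f = 6 flavours gives 7/(4*pi)."""
    return alphas0 / (1 + b0 * alphas0 * L)

L_pole = 3 * np.pi / alpha0          # ghost-pole position in L for QED
for L in [0.0, 10.0, 100.0, 1000.0, 0.999 * L_pole]:
    print(f"L = {L:8.1f}   QED: {alpha_qed(L):11.6f}   QCD: {alpha_qcd(L):.6f}")
# QED blows up as L -> 3*pi/alpha0 (ghost pole); the QCD-type charge -> 0.
```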
This situation contributed to the development of non-Hamiltonian methods (such as axiomatic quantum field theory, the algebraic approach in QFT, and constructive quantum field theory). Great hopes were placed on the dispersion-relation method and on the study of the analytic properties of the S-matrix. Many researchers began to look for a way out of the difficulties by revising the main propositions of local renormalizable QFT through the development of noncanonical directions: essentially nonlinear (i.e., nonpolynomial), nonlocal, indefinite-metric (see Nonpolynomial quantum field theories, Nonlocal quantum field theory, Indefinite metric), etc. The source of new views on the general situation in QFT was the discovery of new theoretical facts connected with non-Abelian gauge fields. 7. Gauge fields Gauge fields (including the non-Abelian Yang-Mills fields) are related to invariance with respect to some group G of local gauge transformations. The simplest example of a gauge field is the electromagnetic field A_μ in QED, associated with the Abelian group U(1). In the general case of unbroken symmetry the Yang-Mills fields, like the photon, have zero rest mass. They transform according to the adjoint representation of the group G, carry the corresponding indices, B^{ab}_μ(x), and obey nonlinear equations of motion (which become linear only for an Abelian group). Their interaction with the matter fields is gauge-invariant if it is obtained by extending the derivatives (see Covariant derivative) in the free Lagrangian of the field u with the same dimensionless constant g that enters the Lagrangian of the field B. Like the electromagnetic field, the Yang-Mills fields are constrained systems. This, together with the apparent absence in nature of massless vector particles other than the photon, limited interest in such fields, and for more than ten years they were regarded rather as an elegant model having nothing to do with the real world. The situation changed in the second half of the 1960s, when it proved possible to quantize them by the functional-integration method (see Functional integral method) and to establish that both the pure massless Yang-Mills field and the field interacting with fermions are renormalizable. Following this, a method was proposed for the "soft" introduction of masses for these fields through the effect of spontaneous symmetry breaking. The Higgs mechanism based on it makes it possible to impart mass to the quanta of the Yang-Mills fields without violating the renormalizability of the model. On this basis, in the late 1960s, a unified renormalizable theory of the weak and electromagnetic interactions was constructed (see Electroweak interaction), in which the carriers of the weak interaction are heavy (with masses ~80-90 GeV) quanta of the vector gauge fields of the electroweak symmetry group (the intermediate vector bosons W± and Z⁰, observed experimentally in 1983). Finally, in the early 1970s a remarkable property of non-Abelian QFTs was discovered: asymptotic freedom. It turned out that, in contrast to all renormalizable QFTs studied up to then, for the Yang-Mills field, both pure and interacting with a limited number of fermions, the leading logarithmic contributions to the invariant charge have the sign opposite to that of such contributions in QED:

ḡ²(Q²) = g² / (1 + b g² ln(Q²/μ²)),   b > 0.   (17)

Therefore, in the limit |k²| → ∞ the invariant charge tends to zero, and there are no difficulties in the passage to the UV limit. This phenomenon of the self-switching-off of the interaction at small distances (asymptotic freedom) provided a natural explanation, within the gauge theory of the strong interaction, quantum chromodynamics (QCD), of the parton structure of hadrons (see Partons), which had by that time manifested itself in experiments on deep inelastic scattering of electrons on nucleons (see Deep inelastic processes). The symmetry basis of QCD is the group SU(3)_c, acting in the space of so-called color variables. Nonzero color quantum numbers are attributed to quarks and gluons. The specific feature of color states is their unobservability at asymptotically large spatial separations. At the same time, the baryons and mesons clearly manifested in experiment are singlets of the color group, i.e., their state vectors do not change under transformations in color space. When the sign of b is reversed [cf. (17) with (16)], the ghost-pole difficulty passes from high energies to low ones. It is not yet known what QCD gives for ordinary energies (of the order of the hadron masses); there is a hypothesis that with increasing distance (i.e., with decreasing energy) the interaction between colored particles grows so strongly that it is precisely this that does not allow quarks and gluons to separate to distances ≳ 10⁻¹³ cm (the confinement hypothesis; see Color confinement). Much attention is devoted to the study of this problem. Thus, the study of quantum-field models containing Yang-Mills fields revealed that renormalizable theories can possess an unexpected richness of content. In particular, the naive belief was destroyed that the spectrum of an interacting system is qualitatively similar to the spectrum of the free one, differing from it only in a shift of levels and, possibly, in the appearance of a small number of bound states. It turned out that the spectrum of a system with interaction (hadrons) may have nothing in common with the spectrum of the free particles (quarks and gluons), and therefore may not even give any indication of which varieties of fields should be included in the elementary microscopic Lagrangian. The establishment of these essential qualitative features, and the carrying out of the vast majority of quantitative calculations in QCD, rest on the combination of perturbation-theory calculations with the requirement of renormalization-group invariance. In other words, the renormalization-group method has become, along with renormalized perturbation theory, one of the main computational tools of modern QFT. Another QFT method, which has received significant development since the 1970s, especially in the theory of non-Abelian gauge fields, is, as already noted, the method using the functional integral, a generalization to QFT of the quantum-mechanical path-integral method. In QFT such integrals can be regarded as formulas for averaging the corresponding classical expressions (e.g., the classical Green's function of a particle moving in a given external field) over quantum field fluctuations. Initially, the idea of carrying the functional-integral method over to QFT was associated with the hope of obtaining compact closed expressions for the basic quantum-field quantities, suitable for constructive calculations. However, it turned out that, owing to difficulties of a mathematical
character, a rigorous definition can be given only to integrals of Gaussian type, which alone lend themselves to exact calculation. Therefore the functional-integral representation was long regarded as a compact formal notation for the quantum-field perturbation theory. Later (leaving aside the mathematical problem of justification) this representation began to be used in various general problems. Thus, it played an important role in the work on the quantization of Yang-Mills fields and the proof of their renormalizability. Interesting results were obtained with the help of the procedure, developed somewhat earlier for problems of quantum statistics, of computing the functional integral by the steepest-descent method, analogous to the saddle-point method in the theory of functions of a complex variable. For a number of fairly simple models it was found by this method that the quantum-field quantities, considered as functions of the coupling constant g, have near the point g = 0 a singularity of the characteristic type exp(−1/g), and that (in full accordance with this) the coefficients f_n of the power expansions Σ f_n gⁿ of perturbation theory grow factorially at large n: f_n ~ n!. This constructively confirmed the hypothesis, put forward in the early 1950s, of the non-analyticity of the theory in the charge. An important role in this method is played by localized analytic solutions of the nonlinear classical equations (solitons and, in the Euclidean version, instantons) that furnish a minimum of the action functional. In the second half of the 1970s, within the framework of the functional-integration method, there arose a direction of studying non-Abelian gauge fields with the help of so-called contour functionals, in which closed contours Γ in space-time are considered as arguments instead of the 4D points x. In this way one can reduce the dimension of the set of independent variables by one and in a number of cases substantially simplify the formulation of the quantum-field problem (see Contour approach). Successful investigations have also been carried out with the help of computer calculations of functional integrals approximately represented as iterated integrals of high multiplicity. For such a representation a discrete lattice is introduced in the original space of configuration or momentum variables. Such "lattice calculations" for realistic models require computers of especially high power, as a result of which they are only beginning to become available. Here, in particular, an encouraging Monte Carlo calculation of the masses and anomalous magnetic moments of hadrons on the basis of quantum-chromodynamic representations has been carried out (see Lattice method).
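
A toy version of such lattice calculations fits in a few lines. The sketch below is illustrative only (a free scalar field on a one-dimensional periodic lattice with Euclidean action, updated by the Metropolis algorithm); it compares the Monte Carlo estimate of ⟨φ²⟩ with the exact lattice answer.

```python
import numpy as np

rng = np.random.default_rng(0)
N, m2 = 32, 0.5            # periodic lattice sites, squared mass
n_sweeps, n_therm, step = 5000, 500, 1.5

def dS(phi, i, new):
    """Change of the Euclidean action
    S = sum_n [ (phi_{n+1} - phi_n)^2 / 2 + m2 * phi_n^2 / 2 ]
    when site i is set to the value `new`."""
    left, right = phi[(i - 1) % N], phi[(i + 1) % N]
    s = lambda x: 0.5 * ((x - left)**2 + (right - x)**2 + m2 * x * x)
    return s(new) - s(phi[i])

phi = np.zeros(N)
samples = []
for sweep in range(n_sweeps):
    for i in range(N):                      # Metropolis update of each site
        new = phi[i] + step * rng.uniform(-1, 1)
        if rng.uniform() < np.exp(-dS(phi, i, new)):
            phi[i] = new
    if sweep >= n_therm:                    # discard thermalization sweeps
        samples.append(np.mean(phi**2))

# Exact lattice answer: <phi^2> = (1/N) sum_k 1 / (4 sin^2(pi k / N) + m2)
k = np.arange(N)
exact = np.mean(1.0 / (4 * np.sin(np.pi * k / N)**2 + m2))
err = np.std(samples) / np.sqrt(len(samples))  # naive error, autocorrelations ignored
print(f"Monte Carlo: {np.mean(samples):.4f} +- {err:.4f}   exact: {exact:.4f}")
```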
8. The big picture The development of new ideas about the world of particles and their interactions reveals ever more clearly two fundamental trends. The first is the gradual transition to ever more indirect concepts and ever less visual images: local gauge symmetry, the renormalizability imperative, the notion of broken symmetries as well as of spontaneous symmetry breaking, gluons instead of actually observed hadrons, the unobservable quantum number of color, and so on. The second is that, along with the growing complexity of the arsenal of methods and concepts employed, there is an unmistakable manifestation of the unity of the principles underlying phenomena that seem very far from one another, and, as a result, a significant simplification of the overall picture. The three fundamental interactions studied by QFT methods have received a parallel formulation based on the principle of local gauge invariance. The related property of renormalizability makes possible a quantitative calculation of the effects of the electromagnetic, weak, and strong interactions by the method of perturbation theory. (Since the gravitational interaction can also be formulated on the basis of this principle, it is probably universal.) From the practical side, perturbation-theory calculations have long established themselves in QED (for example, the degree of agreement between theory and experiment for the anomalous magnetic moment of the electron Δμ is Δμ/μ₀ ~ 10⁻¹⁰, where μ₀ is the Bohr magneton). In the theory of the electroweak interaction such calculations also turned out to have a remarkable predictive force (e.g., the masses of the W± and Z⁰ bosons were correctly predicted). Finally, in QCD, in the region of sufficiently high energies and 4-momentum transfers Q (|Q|² ≳ 100 GeV²), it is possible, on the basis of renormalizable perturbation theory strengthened by the renormalization-group method, to describe quantitatively a wide range of phenomena in hadron physics. Because the expansion parameter is not sufficiently small, the accuracy of the calculations here is not very high. On the whole one can say that, contrary to the pessimism of the late 1950s, the method of renormalized perturbation theory has proved fruitful for at least three of the four fundamental interactions. At the same time it should be noted that the most significant progress, achieved mainly in the 1960s-80s, relates precisely to understanding the mechanism of the interaction of fields (and particles). Successes in observing the properties of particles and resonant states yielded abundant material, which led to the discovery of new quantum numbers (strangeness, charm, etc.) and to the construction of the so-called broken symmetries corresponding to them and the associated systematics of particles. This in turn gave impetus to the search for the substructure of the numerous hadrons and, ultimately, to the creation of QCD. As a result, such "elementary particles" of the 1950s as nucleons and pions ceased to be elementary, and it became possible to determine their properties (mass values, anomalous magnetic moments, etc.) through the properties of quarks and the parameters of the quark-gluon interaction. An illustration of this is, for example, the degree of violation of isotopic symmetry, manifested in the mass difference ΔM between the charged and neutral mesons and baryons within one isotopic
multiplet (for example, p and n). Instead of the original idea, naive from the modern point of view, that this difference (in view of the numerical relation ΔM/M ~ α) has an electromagnetic origin, the conviction came that it is due to the difference between the masses of the u- and d-quarks. However, even given a successful quantitative implementation of this idea, the question is not completely solved; it is only pushed deeper, from the level of hadrons to the level of quarks. The formulation of the old riddle of the muon is transformed similarly: "Why is the muon needed, and why, being similar to the electron, is it two hundred times heavier?" This question, transferred to the quark-lepton level, has acquired greater generality (it now refers not to a pair but to three generations of fermions) but has not changed in essence. 9. Prospects and problems Great hopes were placed on the program of the so-called grand unification of interactions: the union of the strong QCD interaction with the electroweak interaction at energies of the order of 10¹⁵ GeV and above. The starting point here is the (theoretical) observation that extrapolation to the region of superhigh energies of the asymptotic-freedom formula (17) for the chromodynamic coupling constant and of a formula of the type (16) for the invariant charge of QED leads to these quantities becoming equal to one another at energies of the order of |Q| = M_X ~ 10^{15±1} GeV. The corresponding values (as well as the value of the second charge of the electroweak theory) turn out to be close to one another. The fundamental physical hypothesis is that this coincidence is not accidental: in the region of energies greater than M_X there acts some higher symmetry, described by a group G, which at lower energies splits into the observed symmetries owing to mass terms, the symmetry-breaking masses being of the order of M_X. Concerning the structure of the unifying group G and the nature of the symmetry-breaking terms, various assumptions can be made [the simplest answer is G = SU(5)], but from the qualitative point of view the most important feature of the unification is that the fundamental (column) representation of the group G unites quarks and leptons from the fundamental representations of the groups SU(3)_c and SU(2), as a result of which quarks and leptons become "equal in rights" at energies above M_X. The mechanism of the local gauge interaction between them contains vector fields in the adjoint (matrix) representation of the group G, whose quanta, along with the gluons and the heavy intermediate bosons of the electroweak interaction, include new vector particles linking leptons and quarks. The possibility of transforming quarks into leptons leads to nonconservation of baryon number. In particular, the decay of the proton turns out to be allowed, for example according to the scheme p → e⁺ + π⁰. It should be noted that the grand-unification program has encountered a number of difficulties. One of them is purely theoretical in character (the so-called hierarchy problem: the impossibility of maintaining, in higher orders of perturbation theory, the incommensurate energy scales M_X ~ 10¹⁵ GeV and M_W ~ 10² GeV). Another difficulty is connected with the discrepancy between the experimental data on proton decay and the theoretical predictions.
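
The near-crossing of the inverse couplings can be reproduced at the back-of-the-envelope level. The sketch below is illustrative: one-loop running with standard-model-like coefficients, where the input values of 1/α_i at M_Z are rough assumptions of the example rather than data from this article.

```python
import numpy as np

# One-loop running: 1/alpha_i(Q) = 1/alpha_i(M_Z) - (b_i / 2pi) * ln(Q / M_Z)
MZ = 91.2            # GeV
inputs = {           # rough 1/alpha_i at M_Z and one-loop coefficients b_i
    'U(1)':  (59.0, 41 / 10),    # SU(5)-normalized hypercharge
    'SU(2)': (29.6, -19 / 6),
    'SU(3)': (8.5,  -7.0),
}

def inv_alpha(name, Q):
    inv0, b = inputs[name]
    return inv0 - (b / (2 * np.pi)) * np.log(Q / MZ)

for Q in [1e2, 1e6, 1e10, 1e13, 1e15, 1e16]:
    vals = "  ".join(f"{inv_alpha(n, Q):6.1f}" for n in inputs)
    print(f"Q = {Q:8.1e} GeV   1/alpha = {vals}")
# The three inverse couplings draw close together at Q ~ 10^13 - 10^16 GeV.
```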
A very promising direction in the development of modern QFT is associated with supersymmetry, i.e., with symmetry under transformations that "entangle" bosonic fields φ(x) (integer spin) with fermionic fields ψ(x) (half-integer spin). These transformations form a group that is an extension of the Poincaré group. The corresponding algebra of group generators contains, along with the usual generators of the Poincaré group, spinor generators, as well as anticommutators of these generators. Supersymmetry can be viewed as a nontrivial union of the Poincaré group with an internal symmetry, a union made possible by the inclusion of anticommuting generators in the algebra. The representations of the supersymmetry group, the superfields Φ, are defined on superspaces, which include, besides the usual coordinates x, special algebraic objects (generators of a Grassmann algebra with involution): exactly anticommuting elements θ that are spinors with respect to the Poincaré group. Owing to the exact anticommutativity, all powers of their components beginning with the second vanish (the corresponding Grassmann algebra is said to be nilpotent), and therefore expansions of superfields in series in θ turn into polynomials. For example, in the simplest case of a chiral (or analytic) superfield, which in a definite basis depends only on θ and on the shifted coordinate x̃ = x + iθσθ̄ (σ are the Pauli matrices), the expansion will be:

Φ(x̃, θ) = A(x̃) + θ^a ψ_a(x̃) + (θθ) F(x̃).

The coefficients A(x), ψ_a(x), F(x) are already ordinary quantum fields: scalar, spinor, etc. They are called component, or constituent, fields. From the point of view of the component fields, a superfield is simply a set, composed according to definite rules, of a finite number of different Bose and Fermi fields with the usual quantization rules. In the construction of supersymmetric models it is required that the interactions also be invariant under supersymmetry transformations, i.e., that they be superinvariant products of superfields as wholes. From the usual point of view this means the introduction of a whole series of interactions of the component fields whose coupling constants are not arbitrary but rigidly connected with one another. This opens up the hope of an exact compensation of all, or at least some, of the UV divergences originating from different terms of the interaction. We emphasize that an attempt to realize such a compensation simply for a set of fields and interactions not constrained by group-theoretic requirements would be futile, since a compensation once established would be destroyed by the renormalizations. Of particular interest are supersymmetric models containing non-Abelian gauge vector fields among their components. Such models, possessing both gauge symmetry and supersymmetry, are called supergauge models. In supergauge models a remarkable reduction of the UV divergences is observed. Models have been found in which the interaction Lagrangian, expressed in terms of the component fields, is a sum of expressions each of which is individually renormalizable and generates a perturbation theory with logarithmic divergences, while the divergences corresponding to sums of Feynman diagrams with the contributions of different members of a virtual superfield compensate one another. This property of complete reduction of the divergences can be set in parallel with the well-known fact of the lowering of the degree of UV divergence of the electron self-mass in QED in the transition from the original noncovariant calculations of the late 1920s to a virtually covariant perturbation theory taking positrons in intermediate states into account. The analogy is strengthened by the possibility of using supersymmetric Feynman rules, with which such divergences do not arise at all. The complete cancellation of UV divergences in arbitrary orders of perturbation theory, established for a number of supergauge models, raised hopes of the theoretical possibility of a superunification of the fundamental interactions, i.e., of a union of all four interactions, including the gravitational one, constructed with account taken of supersymmetry, for which not only would the nonrenormalizable effects of "ordinary" quantum gravity disappear, but the fully unified interaction would also be free of UV divergences. The physical arena of superunification is scales of the order of the Planck scale (energies ~10¹⁹ GeV, distances of the order of the Planck length R_Pl ~ 10⁻³³ cm). To realize this idea, supergauge models are considered based on superfields arranged in such a way that the maximum spin of their constituent ordinary fields equals two. The corresponding field is identified with the gravitational one. Such models are called supergravity models (see Supergravity). Attempts to construct finite supergravities use ideas about Minkowski spaces of more than four dimensions, as well as about strings and superstrings; in other words, at distances smaller than the Planck length the "usual" local QFT turns into a quantum theory of one-dimensional extended objects embedded in spaces of a higher number of dimensions.
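
The nilpotency of the Grassmann generators noted above, which truncates every superfield expansion to a polynomial, is easy to imitate directly. A minimal sketch (an illustrative construction: Grassmann numbers realized in the exterior algebra over two generators) shows that the generators square to zero and anticommute, and that any function of θ collapses to the four components A, ψ₁, ψ₂, F.

```python
from itertools import product

GENS = ('t1', 't2')                      # Grassmann generators theta_1, theta_2
IDX = {g: i for i, g in enumerate(GENS)}

class Grassmann:
    """Element of the exterior algebra over GENS.
    terms: {tuple of generators in increasing order: numeric coefficient}."""
    def __init__(self, terms=None):
        self.terms = {k: v for k, v in (terms or {}).items() if v != 0}

    def __add__(self, other):
        out = dict(self.terms)
        for k, v in other.terms.items():
            out[k] = out.get(k, 0) + v
        return Grassmann(out)

    def __mul__(self, other):
        out = {}
        for (k1, v1), (k2, v2) in product(self.terms.items(), other.terms.items()):
            if set(k1) & set(k2):
                continue                  # theta_i * theta_i = 0 (nilpotency)
            seq = k1 + k2                 # sign of the permutation sorting seq
            inv = sum(1 for i in range(len(seq)) for j in range(i + 1, len(seq))
                      if IDX[seq[i]] > IDX[seq[j]])
            key = tuple(sorted(seq, key=IDX.get))
            out[key] = out.get(key, 0) + (-1) ** inv * v1 * v2
        return Grassmann(out)

    def __repr__(self):
        return ' + '.join(f"{v}*{'*'.join(k) or '1'}"
                          for k, v in self.terms.items()) or '0'

t1 = Grassmann({('t1',): 1})
t2 = Grassmann({('t2',): 1})
print(t1 * t1)             # 0: generators square to zero
print(t1 * t2 + t2 * t1)   # 0: generators anticommute
# Any power series in theta truncates, as for the superfield expansion:
A, psi1, psi2, F = 1.0, 0.5, -0.3, 2.0
Phi = Grassmann({(): A, ('t1',): psi1, ('t2',): psi2, ('t1', 't2'): F})
print(Phi * Phi * Phi)     # still only the same four components survive
```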
If such a superunification, based on a supergravity model for which the absence of UV divergences has been proved, takes place, then a unified theory of all four fundamental interactions, free of infinities, will have been constructed. It will then turn out that UV divergences do not arise at all, and the entire apparatus for eliminating divergences by the renormalization method will prove unnecessary. As for the nature of the particles themselves, it is possible that the theory is approaching a new qualitative milestone, associated with the emergence of ideas about a level of elementarity higher than the quark-lepton one. We are speaking of the grouping of quarks and leptons into generations of fermions and of the first attempts to pose the question of the different mass scales of the different generations on the basis of predictions of the existence of particles more elementary than quarks and leptons. B. V. Medvedev, D. V. Shirkov.