
Semantics, in the broadest sense of the word, is the analysis of the relationship between linguistic expressions and the world, real or imaginary, as well as this relationship itself (cf. an expression like "the semantics of a word") and the totality of such relationships (thus one can speak of the semantics of a certain language). This relationship consists in the fact that linguistic expressions (words, phrases, sentences, texts) denote what there is in the world: objects, qualities (or properties), actions, ways of performing actions, relations, situations, and sequences of situations. The term "semantics" is derived from a Greek root associated with the idea of "designation" (cf. semantikos, "signifying"). The relationship between natural-language expressions and the real or imaginary world is studied by linguistic semantics, a branch of linguistics. "Semantics" is also the name of a branch of formal logic that describes the relationship between the expressions of artificial formal languages and their interpretation in some model of the world. This article deals with linguistic semantics.

Semantics as a branch of linguistics answers the question of how a person who knows the words and grammatical rules of a natural language is able to use them to convey the most varied information about the world (including his own inner world), even when facing such a task for the first time, and to understand what information about the world is contained in an utterance addressed to him, even when hearing it for the first time.

The semantic component has long been recognized as a necessary part of a complete description of a language, i.e. of a grammar. Different theories of language contribute different general principles of semantic description. For generative grammars, for example, the principles for constructing a semantic component were laid down by the American linguists J. Katz and J. Fodor and further developed by R. Jackendoff, while for grammars (models) of the "Meaning ⇔ Text" type the corresponding component was developed by representatives of the Moscow Semantic School: Yu. D. Apresyan, A. K. Zholkovsky, I. A. Melchuk, and others. The semantic component necessarily includes a dictionary (lexicon), which states for each word what it means, i.e. matches each word with its meaning in the given language, together with rules for combining (the interaction of) word meanings, by which the meaning of more complex constructions, above all sentences, is formed from them.

The meaning of a word is described in the dictionary by means of a dictionary definition, or interpretation, which is an expression either in the same natural language or in an artificial semantic language specially developed for this purpose, in which the meaning of the interpreted word is presented more fully (more explicitly) and, ideally, rigorously. Thus, in the dictionary of the semantic component of a description of Russian, the meaning of the Russian word for "bachelor" can be represented, as in ordinary explanatory dictionaries, by an ordinary Russian phrase, "a man who has reached marriageable age and is not and has never been married", or by an entry in a special semantic language, for example (λx) [HUMAN(x) & MALE(x) & ADULT(x) & ¬MARRIED(x)]. There are many such semantic languages, and they can be structured in quite different ways.
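The feature-based interpretation above can be sketched in code. This is a minimal illustration, not part of any standard semantic language: the feature names and the dictionary encoding of entities are invented for the example.

```python
# Check whether an entity satisfies the conjunction of semantic features
# HUMAN(x) & MALE(x) & ADULT(x) & NOT MARRIED(x). The feature names and
# the dictionary encoding are illustrative assumptions.

def is_bachelor(entity):
    return (entity.get("human", False)
            and entity.get("male", False)
            and entity.get("adult", False)
            and not entity.get("married", False))

john = {"human": True, "male": True, "adult": True, "married": False}
peter = {"human": True, "male": True, "adult": True, "married": True}

print(is_bachelor(john))   # True
print(is_bachelor(peter))  # False
```

The point of the sketch is only that an interpretation in a semantic metalanguage is a checkable conjunction of elementary predicates, not an unanalyzable gloss.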

As the examples above show, when the meanings of words and phrases are interpreted by means of natural language, the resulting expressions, as well as their individual components when mentioned separately, are usually written in single quotes; dictionaries do not do this, because the very structure of a dictionary entry makes it clear that what stands to the right of the headword in an explanatory dictionary is precisely its interpretation. Natural-language expressions interpreting the meaning of sentences are usually written in double quotes. Writing natural-language words in capital letters and using hyphens in unusual places signals that, in this notation, they are elements of an artificial language that need not coincide with the natural one; thus MARRIED is a single element, not several words; the variable x and the conjunction sign & are likewise elements of an artificial language. Artificial languages can be used to interpret the meanings of both words and sentences. Whether natural or artificial language is used for interpretation, in relation to the language whose expressions are interpreted it has the status of a metalanguage (from the Greek meta, "after"), i.e. a language in which one speaks about a language; a natural language can thus serve as a metalanguage with respect to itself. Elements of the metalanguage can also be (and, for example in illustrated dictionaries, often are) graphic images of various kinds: diagrams, drawings, and so on.

The semantic component of a complete description of a language is a model of that part of linguistic knowledge which concerns the relationship between words and the world. This model must account for such empirically established phenomena as equivalence (synonymy), ambiguity (polysemy), and semantic anomaly (including contradiction and tautology) of linguistic expressions. Thus, it is easy to verify that for all native speakers the sentence

"He wore a wide-brimmed hat" denotes the same state of affairs as the sentence "He wore a hat with a wide brim".

This fact is considered adequately reflected in the semantic component of the language description if, taking the interpretations of the meanings of the corresponding words from the dictionary and applying explicitly stated rules for combining meanings, we obtain identical semantic records, called the "semantic representations" or "semantic interpretations" of these sentences. In the same way, all native speakers would agree that the sentence "Visiting relatives can be tiring" denotes two different possibilities: the possibility of being wearied by going to visit relatives, and the possibility of being wearied by receiving relatives who have come to visit. Hence in the semantic component this sentence must be matched with two semantic representations that differ from each other; otherwise the description will not adequately reflect semantic knowledge of the language.
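The requirement that an ambiguous sentence receive two distinct semantic representations can be sketched as a mapping from sentences to lists of readings. The toy metalanguage strings below are invented for illustration, not a real semantic notation.

```python
# An ambiguous sentence is paired with several distinct semantic
# representations, written here in an invented toy metalanguage.

SEMANTIC_REPRESENTATIONS = {
    "Visiting relatives can be tiring": [
        "TIRING(e) & VISIT(e, speaker, relatives)",     # going to visit them
        "TIRING(x) & RELATIVES(x) & VISIT(x, speaker)", # relatives who visit you
    ],
}

def readings(sentence):
    """Return all semantic representations matched with a sentence."""
    return SEMANTIC_REPRESENTATIONS.get(sentence, [])

print(len(readings("Visiting relatives can be tiring")))  # 2
```

A semantic component that returned only one record for this sentence would, in the article's terms, fail to model the speakers' knowledge.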

As an independent linguistic discipline, semantics emerged relatively recently, at the end of the 19th century; the term "semantics" itself, as the name of a branch of science, was first introduced in 1883 by the French linguist M. Bréal, who was interested in the historical development of linguistic meanings. Until the end of the 1950s the term "semasiology" was also widely used alongside it; it now survives only as a not very common name for one of the subfields of semantics. However, questions within the purview of semantics were raised and, in one way or another, resolved already in the oldest linguistic traditions known to us. After all, one of the main reasons that forces us to pay attention to language is failure to understand what an oral or written utterance (text) addressed to us, or some part of it, means. Therefore the interpretation of individual signs or whole texts, one of the most important activities in the field of semantics, has long occupied an important place in the study of language. Thus, in ancient China dictionaries were created containing interpretations of characters. In Europe, ancient and medieval philologists compiled glosses, i.e. interpretations of obscure words in written monuments. Truly rapid development of linguistic semantics began in the 1960s; at present it is one of the most important branches of the science of language.

In the European scientific tradition the question of the relationship between words and "things", the objects to which they refer, was first posed by the ancient Greek philosophers, and various aspects of this relationship are still being clarified today. Let us consider the relation of the word to the "thing" more closely. Words allow us to mention things both in their presence and in their absence: to mention not only what is "here" but also what is "there", not only the present but also the past and the future. Of course, a word is merely a noise that has come to be used for talking about something; in itself this noise has no meaning, but acquires it through its use in the language. When we learn the meanings of words, we learn not some fact of nature, like the law of gravity, but a kind of convention about which noises customarily correspond to which things.

The words of a language, when used in speech, acquire a relation, or reference, to the objects of the world about which a statement is made. In other words, they have the ability to "refer" to objects, introducing those objects (in an ideal form, of course) into the consciousness of the addressee. (It would be more accurate to say that it is speakers who, using words, can "refer" to this or that fragment of the world.) The entity in the world to which a word refers is called its referent. So if, describing an event, I say "Yesterday I planted a tree under my window", then the word tree refers to a single individual entity: the one and only tree that I planted yesterday under my window. We may well say that in this statement the word tree means this very tree I planted. Is this real individual entity, then, the meaning of the word tree?

Representatives of the relatively young current in semantics commonly called "strong semantics" (it includes "formal semantics" and other varieties of model-theoretic semantics, which follow formal logic in their treatment of the relationship between language and the world) would answer this question in the affirmative. In any case, from the point of view of "strong semantics" the goal of a semantic description of a language is to ensure that each linguistic expression receives an interpretation in some model of the world, i.e. to establish whether any element (or configuration of elements) of the world model corresponds to the expression and, if so, which one (or ones). Problems of reference (the relation to the world) are therefore the focus of "strong semantics".

By contrast, the more traditional "weak semantics", in studying the relationship between language and the world, dispenses with direct reference to the actual state of affairs in the world. It takes as the subject of its research, the meaning of a linguistic expression, not the element (fragment) of the world to which the expression refers, but the way in which it refers to it: those rules of use, knowing which a native speaker is able, in a concrete situation, either to refer to the world with the help of the expression or to understand what it refers to. In what follows we consider the problems of semantics from this position.

Someone setting out to invent a procedure for applying words to the world might at first suppose that every real entity must have its own word. But if that were so, the number of words required would be as infinite as the number of things and relations in nature. If every tree in the world needed a separate word, several million words would be required for trees alone, plus as many again for all insects, for all blades of grass, and so on. If a language were required to observe the principle "one word, one thing", it would be impossible to use.

In fact there are some words (relatively few) that do refer to a single thing; they are called proper names, e.g. Hans Christian Andersen or Beijing. But most words apply not to a single person or thing but to a group or class of things. The generic name tree is used for each of the many billions of things we call trees. (There are also words for subclasses of trees: maple, birch, elm, and so on, but these are the names of smaller classes, not of individual trees.) Running is the name of a class of actions distinguished from other actions such as crawling or walking. Blue is the name of a class of colors shading into green at one end and into dark blue at the other. Above is the name of a class of relations, not a proper name for the relation between my ceiling lamp and my desk, because it applies equally to the relation between your ceiling lamp and your desk, and to innumerable other relations. Languages have thus achieved the necessary economy through the use of class names. The class, or set, of entities to which a given linguistic expression (in particular, a word) can be applied is called the denotation, or extension, of that expression (the term "denotation", however, is often also used as a synonym of the term "referent" introduced above). On one of the existing approaches to defining meaning in semantics, the meaning of a word is precisely its denotation: the set of entities that can be denoted with its help. But another understanding of meaning is more widespread, on which it is identified with the conditions of the word's applicability.
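The notion of an extension lends itself naturally to a set-based sketch. The domain and the entity names below are invented for illustration; the only point is that a class name picks out a set, while a proper name picks out a one-element set.

```python
# Extensions modeled as sets over an invented domain of entities.

DOMAIN = {"oak_1", "birch_7", "rex", "lassie", "beijing"}

EXTENSION = {
    "tree": {"oak_1", "birch_7"},
    "dog": {"rex", "lassie"},
    "Beijing": {"beijing"},  # a proper name: a one-element extension
}

def applies(word, entity):
    """True iff the word can be used of this entity."""
    return entity in EXTENSION.get(word, set())

print(applies("tree", "oak_1"))    # True
print(applies("dog", "oak_1"))     # False
print(len(EXTENSION["Beijing"]))   # 1
```

On the denotational view of meaning described above, the set itself is the meaning; on the more widespread view, the meaning is rather the rule that determines membership in the set.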

What allows us to use a relatively small number of words for so many things is similarity. Things that are sufficiently similar to one another we call by the same name. Trees differ from one another in size, shape, and distribution of foliage, but they share certain features that allow us to call them all trees. When we wish to draw attention to differences within this gigantic general class, we look for more detailed similarities within smaller subgroups and so distinguish particular species of tree. Finally, if we intend to mention a particular tree repeatedly, we can give it a proper name (say, the Elm on Povarskaya), just as we name a child or a pet.

Besides the economy of linguistic means it achieves, the existence of generic names has another advantage: it emphasizes similarities between things that are in many respects different. A Pomeranian and a Russian Borzoi do not look much alike, yet both belong to the class of dogs. A Hottentot and an American manufacturer differ in many physical and mental respects, yet both belong to the class of human beings. The existence of common names, however, also carries a potential drawback: the wholesale lumping together of dissimilar things can lead us to attend only to the similarities between things and not the differences, and so to think not of the distinctive features that characterize this or that individual thing, but of the label attached to it (i.e. of the generic term applicable to all things of the same class). "Another pensioner", the saleswoman says to herself, thinking exclusively in labels and stereotypes.

These similarities between things certainly exist in nature prior to, and independently of, our use of language. But which of the innumerable similarities among things will serve as the basis of classification depends on people and their interests. Biologists usually take skeletal structure as the basis for assigning birds and mammals to species and subspecies: a bird with one bone structure is assigned to class X, a bird with another to class Y. Birds could instead be classified not by skeletal structure but by color: all yellow birds would then receive one generic name and all red birds another, regardless of other characteristics. Biologists have not classified animals this way, chiefly because offspring regularly have the same skeletal structure as their parents but not the same color, and biologists want to apply the same name to offspring as to parents. But this is a decision made by people, not by nature; natural things do not come to us with labels telling us which headings of which classifications they fall under. Different groups of people with different interests classify things differently: a given animal may fall under one classification heading for biologists, another for fur producers, and a third for tanners.

Bringing natural objects under classificatory headings is often straightforward. The animals called dogs, for example, usually have long muzzles, bark, and wag their tails when pleased or excited. Things made by people are also often easily subsumed under particular headings: this building belongs to the class of (residential) houses, that one to the class of garages, that other to the class of sheds, and so on. But here a problem arises: if a person lives in a garage or a barn, is that building not also his home? If a garage was once used for housing cars but in recent years has been used for storing firewood, is it not now a shed? Do we assign a building to one class or another on the basis of its external appearance, the purpose for which it was originally created, or the use to which it is currently put? Evidently the assignment of a particular object to a class depends on the criterion we use, and we choose the criterion according to which groupings interest us most.

Semantics is a word that came into Russian from Greek, where it meant "significant". In philology it was first applied in France by M. Bréal, who was concerned not only with the development of language but also with its history. Many linguists can explain what semantics is: the term usually denotes the science devoted to the meaning of words, of combinations of letters, and of sentences.

Can this be put more clearly?

The most general meaning of the term (the one usually intended) can be specified as "lexical semantics": the study of the meaning carried by individual words. But linguists who study alphabets preserved from antiquity also know what semantic load individual letters can carry. Other specialists deal with texts, phrases, and complete sentences; this is yet another field of application of the semantic approach.

In analyzing what semantics is, one must also mention its connections with other disciplines. It is closely linked, in particular, with:

  • logic;
  • linguistic philosophy;
  • communication theory;
  • anthropology (language, symbols);
  • semiology.

Considering the science in more detail, one must first state the object it investigates: the semantic field, a complex of terms that share some common factor.

Object of study

Ask a philologist what semantics is, and the specialist will tell you: the term designates a science that deals not only with the meaning of words but also with the philosophical aspects of language. In addition, the discipline extends to programming languages, formal logic, and semiotics. With the tools developed in semantics an effective analysis of a text can be carried out: the correlations of phrases, words, and symbols, and their relations to meanings, can be isolated.

The meaning just described, however, is only a general idea of what semantics is. In fact the concept is now much broader. It is applied to certain specialized philosophies, and even within one of the movements that calls on people to change their attitude toward the world and move away from "consumer culture". This problem has become genuinely topical in recent years, and one of its proposed solutions has been named "general semantics". Needless to say, it has many adherents.

Understanding the essence

It so happens that the semantics of language is a science for which the very problem of being understood is acute. Put simply, a layman can readily say what mathematics or physics is about, but not everyone will quickly find their bearings in the field of semantic research. Remarkably, it is not so much linguists as psychologists who have set themselves the task of formulating an understanding of the essence of semantics. At the same time, the interpretation of symbols and signs is a question belonging strictly to linguistics and to no other science. Meaning is sought with due regard for the setting in which the objects were used: the specifics of the community, the context, the circumstances.

Linguistic semantics pays special attention to facial expressions, body movements, and sounds as means of transmitting information. All of these form a meaningful context; in written language the role of such structural factors is played by paragraphs and punctuation marks. The general term for this area of information is semantic context. Analytic work in semantics is closely connected with a number of adjacent disciplines dealing with vocabulary, the etymology of symbols and words, and the rules of spelling and pronunciation. The science is also connected with pragmatics.

Features of science

The semantics of language deals with strictly defined questions and is characterized by a specific area of knowledge. The properties of the discipline often allow it to be characterized as synthetic: the field is closely interwoven with the philosophy of language and connected with philology and semiotics. At the same time it contrasts sharply with syntactic rules and combinatorics, which pay no attention to the semantic load of the symbols and signs used.

A feature of semantics is its significant connections with representational theories of meaning, including those concerned with relations, correspondences, and the truth of meaning. This is no longer merely a science of language but a philosophical discipline focused on reality and its reflection through the resources of language.

Linguistic semantics

This is one of the subfields within the general tree of semantics as a research discipline. The object of attention of this branch of semantics is vocabulary. It deals with the meaning carried at the levels of vocabulary, sentences, and phrases; equally, it analyzes larger objects: texts and extended narratives.

In studying linguistics and semantics one must clearly grasp the close relationship between the two subjects. For linguistic semantics, cross-references and applied designations are important. The peculiarity of this direction is the study of the relations inherent in the units of language. Like sentence semantics, it pays special attention to phrases, though in a somewhat different way: researchers here focus on homonyms, antonyms, synonyms, paronyms, and metonyms. Their task is to make sense of rather large elements of a text by assembling them from small ones, and to unpack the semantic load as fully as possible.

Montague grammar

The author of this strand of semantics was Richard Montague, who first presented his theories in the 1960s. The idea was to organize definitions in such a way as to use the terminology of the lambda calculus. The materials he presented demonstrated clearly that the meaning of a text can be decomposed into parts and elements by means of rules of combination, and attention was drawn to how relatively few such rules are needed.

It was then that the term "semantic atom" was first used. The understanding of it, together with work on primitives, formed the basis of the semantics of questions in the 1970s; this is how the language-of-thought hypothesis began to develop. Today many acknowledge that Montague's grammar was an exceptionally well-proportioned, logical construction. Unfortunately, it differed from the semantics of actual speech in the pronounced variability that context introduces. Language, Montague believed, is not merely a system of labels attached to objects and phenomena but a set of tools: what matters about each tool is not its relation to particular objects but the specifics of how it functions.
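The key idea, meanings as functions that combine by application, can be illustrated with a toy Python rendering. This is not Montague's actual fragment: the three-element domain and the predicates are invented, and only the lambda-calculus style of composition is the point.

```python
# Toy illustration of Montague-style composition: word meanings are
# functions, and sentence meaning is computed by applying them to one
# another. Domain and predicates are invented for the example.

DOMAIN = {"rex", "lassie", "tweety"}
DOG = lambda x: x in {"rex", "lassie"}
BARKS = lambda x: x in {"rex"}

# Determiner meanings as higher-order functions:
every = lambda p: lambda q: all(q(x) for x in DOMAIN if p(x))
some = lambda p: lambda q: any(q(x) for x in DOMAIN if p(x))

print(every(DOG)(BARKS))  # False: lassie is a dog but does not bark
print(some(DOG)(BARKS))   # True: rex is a dog and barks
```

Two combination rules (function application for the determiner and its arguments) suffice to interpret both sentences, which mirrors the article's remark about the scarcity of the rules.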

What about examples?

Semantics in Montague's reading is well illustrated as follows. Philologists are familiar with the notion of semantic indeterminacy: a situation in which, lacking parts of the context, it is impossible to determine the exact meaning of an object (a word or phrase). Moreover, there are no words whose identification would be fully possible and correct in the absence of an environment.

Formal semantics

This idea was formulated as a refinement of Montague's postulates. It belongs to the highly formalized theoretical approaches and works with natural languages; Russian semantics, too, can be analyzed by this method. Its peculiarity lies in assigning values to the various units: truth values, functions, individuals. For each unit the truth conditions are then established, along with its logical relations to other sentences. All this yields enough information to analyze the text as a whole.

Truth-conditional semantics

The author of this theory was Donald Davidson. It is one of the formalized approaches; its main idea is to determine the links between sentences. The approach works with natural languages: the semantics of a word, sentence, or text obliges one to seek out and describe the conditions under which the object of study becomes true.

For example, the expression "snow is white" is true only in a situation in which snow is white. The philologist's task, in effect, is to determine under what conditions the meaning of a phrase becomes true. In word semantics, a set of values is fixed in advance on the basis of a particular object, and a set of rules is specified that allows them to be combined. The practical application of this method is the construction of abstract models; at the same time, the essence of the approach lies in determining the correspondence between expressions and real things and events, not in the abstract models themselves.
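The "snow is white" example can be sketched as code: a sentence's meaning is modeled as a function from situations to truth values. The dictionary encoding of a situation is an invented convenience, not part of Davidson's theory.

```python
# Sketch of truth conditions: a sentence is true or false relative to a
# situation. The situation encoding is invented for illustration.

def is_true(sentence, situation):
    # "snow is white" is true iff, in the situation, snow is white.
    if sentence == "snow is white":
        return situation.get("snow_color") == "white"
    raise ValueError("no truth conditions defined for this sentence")

winter = {"snow_color": "white"}
city_slush = {"snow_color": "gray"}

print(is_true("snow is white", winter))      # True
print(is_true("snow is white", city_slush))  # False
```

Knowing the meaning of the sentence, on this view, just is knowing which situations make `is_true` return `True`.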

Artificial semantics

This term usually denotes phrases and words on whose basis useful content is formed. The task here is to compose a semantic core that will attract the reader's attention. The term is most relevant today in application to modern technologies, in particular the Internet: to raise the traffic of a web page, its textual content must be formed so that it contains keys capable of interesting the user. Artificial semantics is now widely used for advertising purposes.

Computer science proposes to interpret semantics as the branch that deals with the meaningfulness of the constructs of a language. It is, to an extent, the counterpart of syntax, whose purview is the form in which constructs are expressed: semantics is the set of rules for interpreting the syntax. The values are thereby specified indirectly; what is constrained are the possible ways of understanding the declared words and symbols. It is customary to speak of semantics as the relations and properties that give a formal idea of an object. A logical approach is applied, on the basis of which models and theories are built from the interpretation of the information received.
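The syntax/semantics division in programming can be made concrete with a tiny interpreter, a sketch with an invented nested-tuple syntax rather than any real language: syntax fixes which constructs are well formed, and the interpretation rules say what each construct means.

```python
# A tiny interpreter: syntax is the form ("+", a, b) / ("*", a, b);
# semantics is the set of rules below for interpreting that form.

def evaluate(expr):
    if isinstance(expr, (int, float)):  # a literal denotes itself
        return expr
    op, left, right = expr              # syntax: a well-formed triple
    if op == "+":                       # semantic rule for "+"
        return evaluate(left) + evaluate(right)
    if op == "*":                       # semantic rule for "*"
        return evaluate(left) * evaluate(right)
    raise ValueError(f"no interpretation rule for {op!r}")

# ("+", 2, ("*", 3, 4)) is syntactically well formed; its meaning is 14.
print(evaluate(("+", 2, ("*", 3, 4))))  # 14
```

A triple with an unknown operator is syntactically acceptable to Python but has no semantics here, which is exactly the distinction the paragraph draws.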

Semantics as a method of project promotion

By applying the basic rules of semantics, a specialist can develop a core that will then serve as the basis of an SEO program. The semantic core is the list of queries that the audience may enter into a search engine in order to find the services and goods they need. To form such a core correctly, you need to understand what the client needs and what goals he is pursuing.

Determining the needs of the target audience most often involves interviewing or a brief survey. With the right approach to this task, it is possible to formulate with a high degree of accuracy what the user needs.

Semantic core: features

To correctly form this basic object for project promotion, you must first understand the nature of user requests. These fall into four major categories:

  • informational;
  • transactional;
  • navigational;
  • general queries.

Informational search queries

People enter such queries into a search engine when they have a problem that needs solving. The system returns a list of sites that more or less match the query, and the user begins clicking through the pages at the top of the results, judging them for relevance and stopping once the necessary information has been found.

Most often, informational queries begin with a question word, although users frequently resort to phrasings that are relatively non-obvious for a machine: they ask for help or advice, for reviews or rules (instructions). If the owner of a resource knows which queries most often lead users to it (or could lead them), the semantic core for each page should be formed with this information in mind. For a non-commercial project, informational queries bring in almost all of the traffic; to monetize the site, one can resort to contextual advertising or similar means.

Navigation and transactions

Navigational queries are those that give a clear description of a specific web page; it is through them that future visits will be made.

Transactional queries, in the view of many SEO experts, are the most interesting category of all. They reveal the purposes for which the client is looking for a site: some need material to read, others want to download files, still others make purchases. Knowing the features of transactional queries, you can build your own business on the Internet; indeed, not long ago it was through them that nearly everyone offering services, websites, and online stores developed.

A complication

Not everything is so simple, however. The queries that an SEO specialist identifies when composing a semantic core are used by all competitors as well. On the one hand, using them cannot guarantee the success of a promotion program: there are too many rivals. On the other hand, their absence makes a site-development program nearly impossible. By employing competitive queries you can successfully attract an audience to the page being promoted; if promotion is to be based on just such queries, you must ensure that a user, once on the page, can actually complete the corresponding transaction.

Not everyone is sure whether this type of query is worth using when the page being promoted is informational rather than commercial. Experts assure us that it is an entirely sound decision; in that case one must provide the user with the possibility of performing actions on the page. The simplest options are contextual advertising matching the content, or an affiliate program.

General requests

These are formulations from which it is difficult to tell what exactly the user is looking for. For example, "car engine" or "blush brush": from the query itself it is not at all clear why the user wants the information. One person wants to know how the item is constructed and what it is made of, another is looking to buy, a third is surveying the range of offers on the market. Perhaps the user wants instructions for making the item or doing the job himself, while someone else wants to order the service, say, wallpapering a room. General queries must be taken into account when forming the semantic core, but no special emphasis should be placed on them unless the project is devoted, for example, to every possible kind of blush brush, or to wallpaper and everything connected with it, from production to rules for choosing colors.

Frequency: competition at every turn!

Frequency is one of the key characteristics when selecting the content of a semantic core. In general, all queries are divided into three large groups: low-frequency queries are those entered into a search engine fewer than two hundred times a month, high-frequency queries are those requested more than a thousand times, and mid-frequency queries are everything between these boundaries.
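Under the thresholds above (which, as the next paragraph notes, vary by subject area), a candidate query list can be sorted into frequency bands with a simple helper. This is an illustrative sketch: the function name, the toy data, and the default cutoffs are assumptions, not a standard tool.

```python
def frequency_band(monthly_searches: int,
                   low_cutoff: int = 200,
                   high_cutoff: int = 1000) -> str:
    """Classify a query by monthly search volume.

    Follows the rule of thumb in the text: fewer than 200
    searches/month = low frequency, more than 1000 = high,
    everything in between = mid.
    """
    if monthly_searches < low_cutoff:
        return "low"
    if monthly_searches > high_cutoff:
        return "high"
    return "mid"

# Group a candidate query list (toy numbers) by band.
queries = {
    "car engine": 5400,
    "blush brush": 640,
    "wallpapering a room yourself": 90,
}
core = {}
for query, volume in queries.items():
    core.setdefault(frequency_band(volume), []).append(query)

print(core)
# {'high': ['car engine'], 'mid': ['blush brush'], 'low': ['wallpapering a room yourself']}
```

In practice the cutoffs would be tuned per niche from real frequency data (for example, figures taken from Yandex.Wordstat) rather than hard-coded.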

These values are only general guidelines; each subject area has its own, and the numbers vary significantly. To form a semantic core correctly, you need not only the search-engine figures for the queries you intend to include, but also a clear picture of the hierarchical structure of the site being developed and of its internal optimization. Experts consider Yandex.Wordstat one of the most useful modern tools for building a semantic core: it reveals query frequencies, on the basis of which you can compile an extended list and discard unnecessary, empty queries. When using Yandex.Wordstat, it is recommended to make at least three passes over the query list in order to arrive at a structure.



In the second half of the 20th century, problems connected with the semantic side of language attracted the growing attention of linguists. By the 1970s, dissatisfaction had accumulated with the long-standing orientation of research in descriptive linguistics (especially its distributional branch) and in generative linguistics toward describing language while ignoring meaning. It became common to acknowledge the inadequacy of the traditional approach to linguistic meaning, which identified it either with universal and unchanging concepts (when following the principles of the old logic) or with changeable mental representations (when appealing to the principles of psychology). The limitations of the semantic ideas of H. Paul and M. Bréal, who singled out historical changes in word meanings as the object of analysis, were likewise recognized. Many linguists rejected the behaviorist interpretation of meaning (L. Bloomfield) as some physical object or action located in an extralinguistic series. The view took hold that linguistic semantics is not reduced to semasiology (lexical semantics) alone and that the meaning of the sentence (and of the text) must also be its object.
At first, linguistic semantics developed rapidly as structural lexicology (and structural lexical semantics), owing to the interest of structuralists (and of those influenced by their ideas and methods of analysis) in the systemic relationships between lexical units and lexical meanings. This interest took shape in two independently established approaches: the theory of lexical (semantic, lexico-semantic) fields, and the method of componential analysis of the meanings of groups of interrelated words, which goes back to the oppositional analysis used in phonology (and later in morphology). Syntactic semantics arose next and quickly took a leading position within linguistic semantics. Its formation was driven by the following stimuli: a) above all, the promotion by generative transformational linguistics of the sentence, interpreted in a dynamic (procedural) aspect, to a priority position in the language system; b) strong influence (partly mediated by generative linguistics, but largely direct) from the new (formal, relational) logic, especially such branches as predicate calculus, semantic logic, and modal logic; c) advances in computer science, machine translation, automatic text processing, and artificial intelligence; d) the impact of research in text linguistics, functional syntax, ordinary-language philosophy, speech act theory, activity theory, ethnolinguistics, the ethnography of speaking, conversation analysis, discourse analysis, sociolinguistics, psycholinguistics, and so on (for surveys of the history of different areas of semantic thought, see John Lyons, 1977, and Lev Gennadievich Vasiliev, 1983; for a review of modern trends in syntactic semantics, see Valentin Vasilievich Bogdanov, 1996).
Within Chomskian generative transformational grammar, interpretive semantics developed (N. Chomsky, J. Katz, Paul Postal, Jerry A. Fodor, Ray S. Jackendoff). These works describe the operation of the semantic component, which assigns meanings to individual elements of the deep structure and derives the meaning of the sentence as a whole on the basis of special projection rules; the description of the meanings of elementary symbols in terms of semantic features (atoms of meaning); the representation of the sentence as a two-vertex structure (in accordance with phrase-structure grammar); and movement from a formal structure to a semantic one (in accordance with the principles of constructing logical languages: first the syntactic part, then the semantic part). This direction of operations does not correspond to the actual sequence of stages in a speaker's production of an utterance, which was taken into account in a number of newer syntactic-semantic theories.
The following models were opposed to the Chomskian approach:
* Uriel Weinreich's original syntactic-semantic model;
* generative semantics (George Lakoff, James McCawley, John Robert Ross), which declared the deep structure to be semantic, interpreting it essentially as a propositional one-vertex structure and assigning it the role of starting point in the generation of a sentence, without strictly delimiting semantic and syntactic rules;
* case grammar (Charles Fillmore), which based the description of the generation process not on a two-vertex immediate-constituent model but on a dependency model with a single vertex, the predicate verb (as in L. Tesnière), with each node additionally assigned a semantic role (one of a limited inventory of universal deep cases);
* Wallace L. Chafe's semantically oriented theory of sentence generation.
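The one-vertex representation that case grammar builds on can be sketched as a small data structure: a predicate verb as the sole head, with dependents labeled by deep-case roles. The role labels (Agent, Patient, Instrument) follow Fillmore's inventory, while the class name, field names, and example sentence are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class CaseFrame:
    """One-vertex (dependency) representation of a sentence:
    the predicate verb is the single head, and each dependent
    carries a deep-case role in the sense of Fillmore's case grammar."""
    predicate: str
    roles: dict = field(default_factory=dict)  # role label -> argument

    def describe(self) -> str:
        filled = ", ".join(f"{role}={arg}" for role, arg in self.roles.items())
        return f"{self.predicate}({filled})"

# "John opened the door with a key" as a case frame:
frame = CaseFrame("open", {"Agent": "John",
                           "Patient": "the door",
                           "Instrument": "a key"})
print(frame.describe())  # open(Agent=John, Patient=the door, Instrument=a key)
```

The contrast with the two-vertex (subject + predicate) constituent model is that nothing here groups "John" and "opened the door" into separate top-level phrases: all arguments hang directly off the verb.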
The 1970s and 1980s were marked by the construction of numerous other conceptions of syntactic semantics, based on both one-vertex and two-vertex models (in Russia: I.A. Melchuk, T.B. Alisova, S.D. Katsnelson, Yu.D. Apresyan, V.G. Gak, N.D. Arutyunova, E.V. Paducheva, I.F. Vardul, G.G. Pocheptsov, I.P. Susov, V.V. Bogdanov, V.B. Kasevich, V.S. Khrakovsky, N.Yu. Shvedova, and others). Representatives of the Kalinin/Tver semantic-pragmatic school, combining static and dynamic approaches to semantic analysis or moving from the static to the dynamic, obtained interesting results in describing sentence meaning (L.V. Solodushnikova, V.I. Sergeeva (Ivanova), A.Z. Fefilova, S.A. Sukhikh, L.I. Kislyakova, V.S. Grigorieva, N.P. Anisimova, G.P. Palchun, G.L. Drugova, V.I. Troyanov, V.A. Kalmykov, K.L. Rozova).
The description of the semantic structure of a sentence can be oriented: a) toward the structure of typical ontological situations; b) toward the subject-predicate (predicational) structure (N.D. Arutyunova, N.Yu. Shvedova) and the not always clearly delimited theme-rheme structure; c) toward the propositional (relational) structure (J. McCawley, G. Lakoff, Ch. Fillmore, W. Chafe, D. Nielsen, W. Cook, F. Blake, S. Starosta, J. Anderson, R. Schank, R. Van Valin and W. Foley, P. Adamets, R. Zimek, Yu.D. Apresyan, E.V. Paducheva, V.V. Bogdanov, T.B. Alisova, V.B. Kasevich, V.G. Gak); d) toward the syntactic structure of the sentence (N.Yu. Shvedova, A.M. Mukhin). The propositional approach is the most developed: the specification of semantic actants (deep cases), the distinction between proposition and mode, the distinction between subject and propositional actants, the hierarchization of actant roles, the description of propositional and non-propositional ways of verbalizing a proposition, and so on. I.P. Susov (1973) builds a three-stage model: a relational structure oriented toward the ontological situation; a relational structure superimposed on it, reflecting the structure of the proposition; and modification operations that bind the sentence-utterance to the speech situation.
The possibilities of syntactic semantics are expanded by the addition of a pragmatic aspect: the communicative, or illocutionary, goal of the speaker; pragmatic aspects of presupposition; the model of the addressee constructed by the speaker; the use of the principle of conversational cooperation; and so on.

I.P. Susov. History of Linguistics. Tver, 1999.