Systems of vectors. Linear dependence and independence: properties, testing a system of vectors for linear dependence, with examples and solutions

The most important concept in the theory of linear spaces is the linear dependence of vectors. Before defining this concept, let's look at a few examples.

Examples. 1. Given the following system of three vectors from the space Tk:

It is easy to see that these vectors are connected by the relation

2. Let us now take another system of vectors from

It is difficult to see a relation similar to equality (1) for this system of vectors. However, it is easy to check that

The coefficients 4, -7.5 in relation (2) could be found as follows. Denote them by unknowns and solve the vector equation:

Performing the indicated operations of multiplication and addition, and passing to equality of the corresponding vector components in (2), we obtain a homogeneous system of linear equations in these unknowns:

One solution to this system is:

3. Consider the system of vectors:

The equality

leads to a system of equations that has a unique, zero, solution. (Check!) Thus, from equality (3) it follows that all the coefficients are zero. In other words, equality (3) is satisfied only for the trivial set of coefficients.

The vector systems in examples 1 and 2 are linearly dependent; the system in example 3 is linearly independent.
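Such checks are easy to mechanize. The sketch below (in Python with NumPy; the concrete vectors of examples 1-3 are not reproduced above, so the arrays here are hypothetical stand-ins) uses the fact that a system is linearly dependent exactly when the rank of the matrix whose columns are the vectors is smaller than the number of vectors.

```python
import numpy as np

def is_linearly_dependent(vectors):
    """A system is linearly dependent iff the rank of the matrix whose
    columns are the vectors is smaller than the number of vectors."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) < len(vectors)

# Hypothetical stand-ins: a dependent pair and an independent pair.
print(is_linearly_dependent([np.array([1, 2]), np.array([2, 4])]))  # True
print(is_linearly_dependent([np.array([1, 0]), np.array([0, 1])]))  # False
```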

Definition 3. A system of vectors a1, a2, …, an in a linear space over a field is said to be linearly dependent if there exist numbers α1, α2, …, αn of the field, not all equal to zero, such that

α1a1 + α2a2 + … + αnan = 0.

If this equality holds only for α1 = α2 = … = αn = 0, then the system of vectors is called linearly independent.

Note that linear dependence and independence are properties of a system of vectors. Nevertheless, the same adjectives are widely applied in the literature directly to the vectors themselves, and, with some looseness of language, one says "a system of linearly independent vectors" and even "the vectors are linearly independent."

If the system consists of a single vector a, then by property 6 (§ 2) the equality αa = 0 with α ≠ 0 implies a = 0. It follows that a system consisting of one nonzero vector is linearly independent. On the contrary, any system of vectors containing the zero vector 0 is linearly dependent: for example, the combination with coefficient 1 at the vector 0 and zero coefficients at the remaining vectors is nontrivial and equals the zero vector.

If a system of two vectors a, b is linearly dependent, then an equality αa + βb = 0 holds with α ≠ 0 (or β ≠ 0). Then

i.e., the vectors are proportional. The converse is also true, since proportionality a = λb yields the nontrivial relation a − λb = 0. Hence, a system of two vectors is linearly dependent if and only if the vectors are proportional.

Proportional vectors of ordinary space lie on the same straight line; in connection with this, proportional vectors in the general case are also called collinear.
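A quick numerical version of this criterion (a sketch; the helper name and tolerance are choices of this illustration): two vectors are proportional exactly when the matrix with these two columns has rank at most 1.

```python
import numpy as np

def are_collinear(a, b, tol=1e-12):
    """Two vectors are proportional (collinear) iff the matrix [a b]
    has rank at most 1."""
    return np.linalg.matrix_rank(np.column_stack([a, b]), tol=tol) <= 1

print(are_collinear(np.array([2.0, -4.0, 6.0]),
                    np.array([-1.0, 2.0, -3.0])))  # True: b = -a/2
```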

We note some properties of the linear dependence of vectors.

Property 1. A system of vectors containing a linearly dependent subsystem is linearly dependent.

Indeed, let the given system contain a linearly dependent subsystem.

Then there exist numbers, not all equal to zero, such that

Adding the remaining vectors of the given system to the left-hand side of this equality with zero coefficients, we obtain the required relation.

It follows from property 1 that any subsystem of a linearly independent system of vectors is linearly independent.

Property 2. If the system of vectors

is linearly independent, and the system of vectors

is linearly dependent, then the vector is linearly expressed in terms of the vectors of system (4).

Since the system of vectors (5) is linearly dependent, there exist numbers, not all equal to zero, such that

If the coefficient of the appended vector were zero, then nonzero coefficients would remain among the coefficients of the vectors of system (4), which would mean a linear dependence of system (4). Hence this coefficient is nonzero, and

Property 3. An ordered system of nonzero vectors

is linearly dependent if and only if some vector of the system is a linear combination of the preceding vectors.

Let the system be linearly dependent. Since the vectors are nonzero, the subsystem consisting of the first vector alone is linearly independent. Denote by k the smallest natural number for which the subsystem of the first k vectors is linearly dependent. (Such a k exists: in the extreme case, when all proper initial subsystems are linearly independent, k is the number of vectors of the whole system.) Then there exist numbers, not all equal to zero, such that the equality

If the coefficient of the k-th vector were zero, then nonzero coefficients would remain among the first k − 1 coefficients, and the equality would hold

which would mean a linear dependence of the first k − 1 vectors and would contradict the choice of the number k. Hence the coefficient of the k-th vector is nonzero, and therefore

Conversely, equality (7) yields a nontrivial vanishing combination of the first vectors of the system, and then property 1 implies a linear dependence of the whole system.

Property 3 easily implies that a system of vectors is linearly dependent if and only if at least one of its vectors is linearly expressed in terms of the others. In this sense, they say that the concept of linear dependence is equivalent to the concept of linear expressibility.
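Linear expressibility also has a direct computational counterpart. A minimal sketch in Python with NumPy (the vectors a1, a2 and the helper name express are hypothetical, not from the text): x is linearly expressed through a system exactly when appending x does not raise the rank, and the coefficients can then be recovered by a least-squares solve.

```python
import numpy as np

def express(x, vectors):
    """Return coefficients c with x = sum(c[i] * vectors[i]),
    or None if x does not lie in the span of the given vectors."""
    A = np.column_stack(vectors)
    if np.linalg.matrix_rank(np.column_stack([A, x])) > np.linalg.matrix_rank(A):
        return None  # appending x raised the rank: x is outside the span
    c, *_ = np.linalg.lstsq(A, x, rcond=None)
    return c

a1, a2 = np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])
print(express(np.array([2.0, 3.0, 5.0]), [a1, a2]))  # [2. 3.]
```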

Property 4. If the vector x is linearly expressed in terms of the vectors of the system

and some vector of system (8) is linearly expressed in terms of the remaining vectors of system (8), then the vector x is also linearly expressed in terms of these remaining vectors of system (8).

Indeed,

Now we can prove one of the most important theorems about the linear dependence of vectors.

Theorem 1. If each vector of a linearly independent system

is a linear combination of vectors

then the first system contains at most as many vectors as the second. In other words, in a linearly independent system of vectors, each of which is a linear combination of some given vectors, the number of vectors cannot be more than the number of the given ones.

Proof. Step 1. We construct the system

By assumption, each vector of system (9), in particular the first one, is linearly expressed in terms of the vectors (10); therefore system (11) is linearly dependent. By property 3, some vector of system (11) is linearly expressed in terms of the preceding vectors, and therefore also in terms of the vectors of the system

obtained from (11) by deleting that vector. From here, by property 4, we have: each vector of system (9) is linearly expressed in terms of the vectors of system (12).

Step 2. Applying the same reasoning as in the first step to the systems of vectors

and (12) and taking into account that the system of vectors is linearly independent, we obtain a system of vectors

through which all vectors of system (9) are linearly expressed.

If we assume that the first system has more vectors than the second, then, continuing this process, we will exhaust all the vectors of system (10) after finitely many steps and obtain the system

such that each vector of system (9) is linearly expressed in terms of the vectors of system (14), which now consists of vectors of system (9) alone. Then system (9) turns out to be linearly dependent, which contradicts the hypothesis. It remains to accept that the number of vectors of system (9) does not exceed the number of vectors of system (10).

Let us now consider what the linear dependence of vectors in different spaces means.

1. The space of geometric vectors. If a system of two vectors is linearly dependent, then one of them is proportional to the other, i.e., the vectors are collinear. The converse is also true. A system of three space vectors is linearly dependent if and only if the vectors lie in one plane. (Prove!) A system of four space vectors is always linearly dependent. Indeed, if some proper subsystem of our system is linearly dependent, then the whole system is linearly dependent. If no proper subsystem is linearly dependent, then, by what precedes, no three vectors of our system lie in one plane. Then it follows from geometric considerations that there are real numbers such that the parallelepiped with edges proportional to three of the vectors has the fourth vector as a diagonal, i.e., a nontrivial vanishing combination of the four vectors exists.
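The last claim is easy to confirm numerically; a sketch with four hypothetical vectors of three-dimensional space, where a nonzero solution of the homogeneous system is read off from the singular value decomposition:

```python
import numpy as np

# Four (hypothetical) vectors in 3-dimensional space: the homogeneous
# system A c = 0 with a 3x4 matrix A always has a nonzero solution.
A = np.column_stack([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 2, 3]]).astype(float)

# The last right-singular vector lies in the null space of A.
_, _, Vt = np.linalg.svd(A)
c = Vt[-1]
print(np.allclose(A @ c, 0), c)  # True: a nontrivial vanishing combination
```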

Let us proceed to the description of the properties of linear spaces. First of all, these include relationships between their elements.

A linear combination of elements x1, …, xn over the field of real numbers R is an element of the form α1x1 + … + αnxn, where αi ∈ R.

Definition. A set of elements x1, …, xn is called linearly independent if from the equality

α1x1 + α2x2 + … + αnxn = 0

it necessarily follows that α1 = … = αn = 0. It is clear that any part of a linearly independent set is itself linearly independent. If the equality can hold with at least one αi ≠ 0, then the set is called linearly dependent.

Example III.6. Let a set of vectors be given, one of which, say the last, is the zero vector; then such a system of vectors is linearly dependent. Indeed, let the set of the remaining vectors be linearly independent; then a vanishing linear combination of them must have all coefficients equal to zero.

Adding to this combination the zero vector multiplied by an arbitrary nonzero coefficient, we still have the equality

Therefore, this set of vectors, like any other set containing a zero element, is always linearly dependent ▼.

Comment. If a set of vectors is empty, then it is linearly independent. Indeed, if there are no indices, then it is impossible to choose corresponding nonzero numbers for them so that a sum of the form (III.2) equals 0. Such an interpretation of linear independence can be taken as a proof, especially since this result agrees well with the theory.

In connection with the above, the definition of linear independence can be formulated as follows: a set of elements is linearly independent if there is no nontrivial linear combination of its elements equal to zero. In particular, this set can also be empty.

Example III.7. Any two sliding vectors are linearly dependent. Recall that sliding vectors are vectors lying on one straight line. Taking a unit vector, one can obtain any other vector from it by multiplying by a suitable real number. Therefore, any two vectors of this one-dimensional space are already linearly dependent.

Example III.8. Consider the space of polynomials in the variable t. Let us write down

Choosing the coefficients appropriately, we obtain, identically in t,

that is, the set is linearly dependent. Note that any finite set of distinct powers of t is linearly independent. For the proof, consider, for example, the set 1, t, t²; then from the equality

under the assumption of its linear dependence it would follow that there are numbers α1, α2, α3, not all equal to zero, for which (III.3) holds identically in t. But this contradicts the fundamental theorem of algebra: a polynomial of degree n has no more than n real roots. In our case the polynomial would have to vanish for infinitely many values of t, although it can have at most two roots. We have obtained a contradiction.
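In coordinates (the coefficient columns of the polynomials), the same check reduces to a rank computation; a sketch with the set 1, t, t² used above:

```python
import numpy as np

# Represent 1, t, t^2 by their coefficient columns in the basis (1, t, t^2).
P = np.array([[1, 0, 0],    # 1
              [0, 1, 0],    # t
              [0, 0, 1]]).T  # t^2

# Full column rank means the polynomials are linearly independent.
print(np.linalg.matrix_rank(P) == P.shape[1])  # True
```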

§ 2. Linear combinations. Bases

Let x, x1, …, xn be elements of L. We say that x is a linear combination of the elements x1, …, xn if x = α1x1 + … + αnxn for some scalars α1, …, αn.

Theorem III.1 (main). A set of nonzero elements is linearly dependent if and only if some element is a linear combination of the preceding elements.

Proof. Necessity. Suppose that the elements x1, …, xn are linearly dependent, and let k be the first natural number for which the elements x1, …, xk are linearly dependent; then

with coefficients not all equal to zero, and necessarily the coefficient of xk is nonzero (otherwise the elements x1, …, xk−1 would already be linearly dependent, contradicting the choice of k). Hence we have the linear combination

Sufficiency is obvious, because every set containing a linearly dependent subset is itself linearly dependent ▼.

Definition. A basis (coordinate system) of a linear space L is a set A of linearly independent elements such that each element of L is a linear combination of elements of A.

We will consider finite-dimensional linear spaces.

Example III.9. Consider a three-dimensional vector space. Take the unit vectors e1, e2, e3. They form a basis of this space.

Let us show that the vectors are linearly independent. Indeed, we have

From here, according to the rules of multiplication of a vector by a number and addition of vectors (Example III.2), we obtain

Therefore, all three coefficients are equal to zero ▼.

Let an arbitrary vector of the space be given; then, based on the linear space axioms, we obtain

Similar reasoning is valid for a space with any basis. It follows from the main theorem that in an arbitrary finite-dimensional linear space L any element can be represented as a linear combination of its basis elements, i.e.

Moreover, such a decomposition is unique. Indeed, let us have

then after subtraction we get

Hence, due to the linear independence of the basis elements,

That is, the two decompositions coincide ▼.
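Numerically, finding the coordinates of an element in a basis amounts to solving one linear system, and the uniqueness of the decomposition corresponds to invertibility of the basis matrix. A sketch under the assumption of a hypothetical basis of R³:

```python
import numpy as np

# Columns of B form a (hypothetical) basis of R^3; det(B) != 0 is
# exactly the uniqueness of the decomposition.
b1 = np.array([1.0, 0.0, 1.0])
b2 = np.array([1.0, 1.0, 0.0])
b3 = np.array([0.0, 1.0, 1.0])
B = np.column_stack([b1, b2, b3])

x = np.array([2.0, 3.0, 5.0])
coords = np.linalg.solve(B, x)           # the unique coordinates of x
print(coords, np.allclose(B @ coords, x))
```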

Theorem III.2 (on completion to a basis). Let L be a finite-dimensional linear space and let some set of its elements be linearly independent. If these elements do not form a basis, then one can find further elements such that the enlarged set forms a basis of L. That is, each linearly independent set of elements of a linear space can be completed to a basis.

Proof. Since the space is finite-dimensional, it has a basis consisting of, say, n elements. Consider the set A of elements obtained by appending this basis to the given linearly independent elements.

Let us apply the main theorem. Consider the set A with its elements in this order. It is obviously linearly dependent, since each of its elements is a linear combination of the basis elements. Scanning the elements sequentially, we find the first element that is a linear combination of the preceding vectors of this set; since the given elements are linearly independent, it is one of the appended basis elements. Removing this element from the set A, we obtain a set with the same linear span. We continue this procedure until the set contains n linearly independent elements, among which are all the given elements and n − m of the basis elements. The resulting set is the required basis ▼.
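The proof above is effectively an algorithm. A minimal sketch of it (the starting vector x1 is a hypothetical example): walk through the old basis and keep a candidate element only when it increases the rank, i.e., only when it is not a linear combination of the elements already collected.

```python
import numpy as np

def complete_to_basis(independent, basis):
    """Greedily extend a linearly independent list to a basis,
    mirroring the proof of the completion theorem."""
    current = list(independent)
    for e in basis:
        trial = np.column_stack(current + [e])
        if np.linalg.matrix_rank(trial) == len(current) + 1:
            current.append(e)  # e is outside the current span: keep it
    return current

x1 = np.array([1.0, 1.0, 0.0])               # hypothetical independent set
units = [np.eye(3)[:, i] for i in range(3)]  # standard basis of R^3
print(np.column_stack(complete_to_basis([x1], units)))
```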

Example III.10. Prove that the given four vectors form a linearly dependent set, while any three of them are linearly independent.

Let us show that there exist numbers, not all zero, for which

Indeed, for a suitable choice of the coefficients, we have

Linear dependence is proved. Let us now show that a triple of the vectors, for example the first three, forms a basis. We form the equality

Performing the operations on the vectors, we obtain

Equating the corresponding coordinates on the left and right sides of the last equality, we obtain a system of equations; solving it, we find that all the coefficients are zero.

Similar reasoning is valid for the remaining triples of vectors.

Theorem III.3 (on the dimension of a space). All bases of a finite-dimensional linear space L consist of the same number of elements.

Proof. Let two sets A and B be given, consisting of n and m elements respectively. We assign to each of them one of the two properties that determine a basis: 1) any element of L is expressible through the elements of the set A; 2) the elements of the set B form a linearly independent set, though not necessarily one spanning all of L. We will assume that the elements of A and B are ordered.

Consider the set A and apply to its elements m times the method from the main theorem. Since the elements of B are linearly independent, we obtain, as before, a linearly dependent set

Indeed, if m > n, then we would obtain a linearly independent set through which the remaining elements of the set B would have to be linearly expressed, which is impossible; hence m ≤ n. But m < n also cannot hold, since by construction the set (III.4) has the spanning property of the set A. Since the space L is finite-dimensional, only m = n remains, that is, two different bases of the space L consist of the same number of elements ▼.

Corollary. In any n-dimensional linear space (n ≥ 1) one can find infinitely many bases.

The proof follows from the rule of multiplication of elements of a linear (vector) space by a number: multiplying one basis element by any nonzero scalar again yields a basis.

Definition. The dimension of a linear space L is the number of elements that make up its basis.

It follows from the definition that the trivial linear space, whose basis is the empty set of elements, has dimension 0; this, it should be noted, agrees with the terminology of linear dependence and allows us to state that an n-dimensional space has dimension n.

Thus, summing up what has been said: every set of n + 1 elements of an n-dimensional linear space is linearly dependent; a set of n elements of an n-dimensional linear space is a basis if and only if it is linearly independent (equivalently, if each element of the space is a linear combination of its elements); in any nontrivial linear space the number of bases is infinite.

Example III.11 (Kronecker–Capelli theorem).

Suppose we have a system of linear algebraic equations

where A is the matrix of coefficients of the system and Ā is the extended matrix of the system,

where a1, …, an are the columns of the matrix A and b is the column of right-hand sides; in vector form the system reads x1a1 + x2a2 + … + xnan = b, (III.6)

this notation is equivalent to the system of equations (III.5).

Theorem III.4 (Kronecker–Capelli). The system of linear algebraic equations (III.5) is consistent if and only if the rank of the matrix A is equal to the rank of the matrix Ā, that is, rank A = rank Ā.

Proof. Necessity. Let system (III.5) be consistent; then it has a solution x1, x2, …, xn. In view of (III.6), b is then a linear combination of the vectors a1, a2, …, an. Therefore, every column of the extended matrix can be expressed through the set of vectors a1, a2, …, an. This means that rank A = rank Ā.

Sufficiency. Let rank A = rank Ā. Choose any basis among the columns a1, a2, …, an; then b is linearly expressed through this basis (which may consist of all these vectors or of a part of them), and thus through all the vectors a1, …, an. This means that the system of equations is consistent ▼.
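The theorem translates directly into a two-rank check. A sketch (the 2×2 system is a hypothetical illustration):

```python
import numpy as np

def is_consistent(A, b):
    """Kronecker-Capelli: A x = b is consistent iff rank(A) equals
    the rank of the extended matrix [A | b]."""
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.column_stack([A, b]))

A = np.array([[1.0, 2.0], [2.0, 4.0]])
print(is_consistent(A, np.array([3.0, 6.0])))  # True: b is in the column span
print(is_consistent(A, np.array([3.0, 5.0])))  # False: the ranks differ
```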

Consider an n-dimensional linear space L. Each vector can be represented as a linear combination of basis vectors. Rewriting the linear combination in coordinate form, we establish a one-to-one correspondence between the elements and their coordinates

This means that a one-to-one correspondence is established between an n-dimensional linear space over the field of real numbers and the n-dimensional arithmetic space of coordinate rows.

Definition. Two linear spaces over the same scalar field are called isomorphic if it is possible to establish a one-to-one correspondence f between their elements such that

that is, an isomorphism is understood as a one-to-one correspondence that preserves all linear relations. It is clear that isomorphic spaces have the same dimension.

It follows from the example and the definition of isomorphism that, from the point of view of studying problems of linearity, isomorphic spaces are the same; therefore, instead of an abstract n-dimensional linear space L over a field, one can formally study the arithmetic space of coordinate rows over that field.

Task 1. Determine whether the system of vectors is linearly independent. The system of vectors is defined by the matrix of the system, whose columns consist of the coordinates of the vectors.

Solution. Set a linear combination of the vectors equal to zero. Writing this equality in coordinates, we obtain the following system of equations:

Such a system of equations is called triangular. It has only one solution, the zero one. Therefore, the vectors are linearly independent.

Task 2. Determine whether the system of vectors is linearly independent.

Solution. The first vectors are linearly independent (see Task 1). Let us prove that the remaining vector is a linear combination of them. The expansion coefficients are determined from the system of equations

Being triangular, this system has a unique solution.

Therefore, the system of vectors is linearly dependent.

Comment. Matrices like the one in Task 1 are called triangular, and like the one in Task 2, step-triangular. The question of the linear dependence of a system of vectors is easily settled when the matrix composed of the coordinates of the vectors is step-triangular. If the matrix has no special form, it can be reduced to step-triangular form by elementary row transformations, which preserve the linear relationships between the columns; a code sketch of this reduction follows the list below.

Elementary row transformations (ERT) of a matrix are the following operations on the matrix:

1) permutation of rows;

2) multiplication of a row by a non-zero number;

3) addition to a row of another row multiplied by an arbitrary number.
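A sketch of the reduction to step-triangular form using exactly these three operations (plain Gaussian elimination with partial pivoting; the input matrix is a hypothetical example):

```python
import numpy as np

def row_echelon(A, tol=1e-12):
    """Reduce A to step-triangular form using only the three
    elementary row transformations listed above."""
    A = A.astype(float).copy()
    r = 0
    for c in range(A.shape[1]):
        p = r + np.argmax(np.abs(A[r:, c]))  # pivot row for column c
        if abs(A[p, c]) < tol:
            continue                          # no pivot in this column
        A[[r, p]] = A[[p, r]]                 # 1) permute rows
        A[r] /= A[r, c]                       # 2) scale the pivot row
        for i in range(r + 1, A.shape[0]):
            A[i] -= A[i, c] * A[r]            # 3) add a multiple of a row
        r += 1
        if r == A.shape[0]:
            break
    return A

print(row_echelon(np.array([[2, 4, 1], [1, 2, 3], [3, 6, 4]])))
```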

Task 3. Find a maximal linearly independent subsystem and compute the rank of the system of vectors

Solution. Let us reduce the matrix of the system to step-triangular form using ERT. To explain the procedure, the row of the matrix being transformed is indicated by its number. The column after the arrow shows the actions performed on the rows of the matrix being transformed to obtain the rows of the new matrix.

Obviously, the first two columns of the resulting matrix are linearly independent, the third column is their linear combination, and the fourth does not depend on the first two. The corresponding vectors are called basic. They form a maximal linearly independent subsystem of the given system, and the rank of the system is three.
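The selection of basic columns can also be done programmatically. The greedy sketch below (the matrix is a hypothetical example) keeps a column exactly when it is not a linear combination of the columns already kept, mirroring the reasoning of Task 3:

```python
import numpy as np

def basic_columns(A, tol=1e-12):
    """Keep a column iff it is NOT a linear combination of the columns
    already kept; the kept columns are a maximal independent subsystem."""
    kept = []
    for c in range(A.shape[1]):
        if np.linalg.matrix_rank(A[:, kept + [c]], tol=tol) == len(kept) + 1:
            kept.append(c)
    return kept

A = np.array([[1, 2, 3, 1], [2, 4, 6, 3], [1, 2, 4, 2]])
cols = basic_columns(A)
print(cols, "rank =", len(cols))  # columns 0, 2, 3; rank 3
```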



Basis, coordinates

Task 4. Find a basis, and the coordinates of vectors in that basis, for the set of geometric vectors whose coordinates satisfy the given condition.

Solution. The set is a plane passing through the origin. An arbitrary basis of the plane consists of two non-collinear vectors. The coordinates of vectors in the selected basis are determined by solving the corresponding system of linear equations.

There is another way to solve this problem, in which the basis can be found directly from the coordinates.

The space coordinates are not coordinates on the plane, since they are connected by the given relation, that is, they are not independent. Two of them (called free variables) uniquely determine a vector of the plane and can therefore be chosen as coordinates in it. Then the basis consists of the vectors lying in the plane that correspond to the free-variable sets (1, 0) and (0, 1).
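A sketch of this free-variable method, assuming for concreteness the hypothetical condition x1 + x2 + x3 = 0 (the text's actual condition is not reproduced above):

```python
import numpy as np

# Hypothetical plane through the origin: x1 + x2 + x3 = 0.
# Take x2, x3 as the free variables; then x1 = -x2 - x3.
def plane_vector(x2, x3):
    return np.array([-x2 - x3, x2, x3])

# Basis = vectors for the free-variable sets (1, 0) and (0, 1).
b1, b2 = plane_vector(1, 0), plane_vector(0, 1)
v = plane_vector(4, -1)  # an arbitrary vector of the plane ...
print(np.allclose(4 * b1 + (-1) * b2, v))  # ... with coordinates (4, -1)
```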

Task 5. Find a basis, and the coordinates of vectors in that basis, for the set of all vectors of the space whose odd coordinates are equal to each other.

Solution. As in the previous task, we choose free coordinates on the set.

Since the odd coordinates are equal to each other, the free variables uniquely determine a vector of the set and therefore serve as its coordinates. The corresponding basis consists of the vectors obtained from the unit sets of free variables.

Task 6. Find a basis, and the coordinates of vectors in that basis, for the set of all matrices of the given form, where the entries are arbitrary numbers.

Solution. Each matrix of the set can be uniquely represented in the form:

This relation is the expansion of a vector of the set in terms of a basis, with the arbitrary entries as coordinates.

Task 7. Find the dimension and a basis of the linear span of a system of vectors

Solution. Using ERT, we transform the matrix composed of the coordinates of the system's vectors to step-triangular form.

Certain columns of the last matrix are linearly independent, and the remaining columns are linearly expressed through them. Therefore, the corresponding vectors form a basis of the linear span, and the dimension of the span equals the number of these vectors.

Comment. A basis of the linear span is not chosen uniquely; for example, another suitable subsystem of vectors also forms a basis.

Let a field of scalars be given, with base set F. Consider the n-dimensional arithmetic space over this field and an arbitrary system of vectors in it.

DEFINITION. A linear combination of a system of vectors a1, …, am is a sum of the form λ1a1 + … + λmam, where λ1, …, λm ∈ F. The scalars λi are called the coefficients of the linear combination. A linear combination is called nontrivial if at least one of its coefficients is nonzero. A linear combination is called trivial if all its coefficients are equal to zero.

DEFINITION. The set of all linear combinations of the vectors of a system is called the linear span of this system and is denoted by L(a1, …, am). The linear span of the empty system is considered to be the set consisting of the zero vector alone.

So, by definition,

It is easy to see that the linear span of this system of vectors is closed under the operations of adding vectors, subtracting vectors, and multiplying vectors by scalars.
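Membership in a linear span is again a rank question: x belongs to L(a1, …, am) exactly when adjoining x does not raise the rank. A short sketch with hypothetical vectors, also illustrating the closure just mentioned:

```python
import numpy as np

def in_span(x, system):
    """x lies in the linear span of `system` iff adjoining x
    does not increase the rank."""
    A = np.column_stack(system)
    return np.linalg.matrix_rank(np.column_stack([A, x])) == np.linalg.matrix_rank(A)

a1, a2 = np.array([1, 0, 2]), np.array([0, 1, 1])  # hypothetical system
print(in_span(a1 + 3 * a2, [a1, a2]))              # True: spans are closed
print(in_span(np.array([0, 0, 1]), [a1, a2]))      # False
```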

DEFINITION. A system of vectors is called linearly independent if, for any scalars, the equality λ1a1 + … + λmam = 0 implies the equalities λ1 = … = λm = 0. The empty system of vectors is considered to be linearly independent.

In other words, a finite system of vectors is linearly independent if and only if no nontrivial linear combination of the vectors of the system equals the zero vector.

DEFINITION. A system of vectors is said to be linearly dependent if there are scalars not all equal to zero such that

In other words, a finite system of vectors is called linearly dependent if there exists a non-trivial linear combination of the system's vectors equal to the zero vector.

The system of vectors

e1 = (1, 0, …, 0), e2 = (0, 1, …, 0), …, en = (0, 0, …, 1)

is called the system of unit vectors of the vector space. This system of vectors is linearly independent. Indeed, for any scalars the equality λ1e1 + λ2e2 + … + λnen = 0 implies the equality (λ1, λ2, …, λn) = (0, 0, …, 0), and hence the equalities λ1 = 0, λ2 = 0, …, λn = 0.

Consider the properties of linear dependence and independence of a system of vectors.

PROPERTY 1.1. The system of vectors containing the zero vector is linearly dependent.

Proof. If one of the vectors of the system is zero, then the linear combination of the vectors of the system in which all coefficients are zero, except the coefficient of the zero vector (taken, say, equal to 1), is nontrivial and equal to the zero vector. Therefore, such a system of vectors is linearly dependent.

PROPERTY 1.2. A system of vectors is linearly dependent if at least one of its subsystems is linearly dependent.

Proof. Let some subsystem of the system be linearly dependent; then a linear combination of its vectors with at least one nonzero coefficient equals the zero vector. Extending this combination by the remaining vectors of the system taken with zero coefficients, we conclude that the whole system of vectors is linearly dependent.

COROLLARY. Any subsystem of a linearly independent system is linearly independent.

PROPERTY 1.3. A system of vectors

in which the first vector is nonzero, is linearly dependent if and only if at least one of its vectors is a linear combination of the preceding vectors.

Proof. Let system (1) be linearly dependent. Then there exist scalars, not all equal to zero, such that

Denote by k the largest of the indices for which the coefficient is nonzero. Then equality (2) can be written as

Note that k > 1, for otherwise the first vector of the system would be zero, contrary to the assumption. From (3) follows the equality

Let us now assume that some vector of the system is a linear combination of the vectors preceding it. Then the corresponding initial subsystem of system (1) is linearly dependent, and therefore, by property 1.2, the original system (1) is also linearly dependent.

PROPERTY 1.4. If the system of vectors is linearly independent, and the system of vectors

is linearly dependent, then the vector v is linearly expressed in terms of the vectors

and in a unique way.

Proof. By assumption, system (2) is linearly dependent, i.e., there are scalars not all equal to zero, such that

Moreover, the coefficient of v is nonzero, since otherwise we would obtain a nontrivial vanishing combination of the vectors of system (1), which contradicts the linear independence of system (1). From (3) follows the equality

By virtue of the linear independence of system (1), this implies that the coefficients are determined uniquely.

PROPERTY 1.5 (transitivity). If a vector is linearly expressed through the vectors of one system, and each vector of that system is linearly expressed through the vectors of a second system, then the vector is linearly expressed through the vectors of the second system.

Proof. The condition means that there are scalars such that

The condition means that there are scalars such that

By virtue of (1) and (2) we obtain

THEOREM 1.2. If the number of vectors of a system of the n-dimensional arithmetic space is greater than n,

then the system of vectors is linearly dependent. The proof is carried out by induction on n.

A system of vectors is called linearly dependent if there exist numbers, among which at least one is different from zero, such that the corresponding linear combination of the vectors equals the zero vector.

If this equality holds only when all the numbers are zero, then the system of vectors is called linearly independent.

Theorem. A system of vectors is linearly dependent if and only if at least one of its vectors is a linear combination of the others.

Example 1. The given polynomial is a linear combination of the given polynomials. These polynomials constitute a linearly independent system, since only their trivial linear combination is the zero polynomial.

Example 2. The given system of matrices is linearly independent, since a linear combination of them equals the zero matrix only when all its coefficients are zero.

Example 3. Determine whether the given system of vectors is linearly dependent.

Solution.

Let us form a linear combination of these vectors and set it equal to the zero vector.

Equating the corresponding coordinates of the equal vectors, we obtain the system

Finally we get

The system has only the trivial solution, so the linear combination of these vectors is zero only if all coefficients are zero. Therefore, this system of vectors is linearly independent.

Example 4. The given vectors are linearly independent. What can be said of the following systems of vectors?

Solution.

a). Compose a linear combination and equate it to zero

Using the properties of operations with vectors in a linear space, we rewrite the last equality in the form

Since the original vectors are linearly independent, the coefficients of each of them must be equal to zero, i.e.

The resulting system of equations has only the trivial solution.

Since equality (*) holds only when all the coefficients are zero, the system of vectors is linearly independent;

b). Compose the equality (**)

Applying similar reasoning, we get

Solving the system of equations by the Gauss method, we obtain

The last system has infinitely many solutions. Thus, there is a nonzero set of coefficients for which equality (**) holds. Therefore, the system of vectors is linearly dependent.

Example 5. The given system of vectors is linearly independent, while the system extended by one more vector is linearly dependent. Let us show that the added vector is linearly expressed through the remaining ones; compose the equality (***)

In equality (***) the coefficient of the added vector is nonzero. Indeed, if it were zero, the original system would be linearly dependent.

From relation (***) we obtain the required expression of the added vector through the remaining vectors.

Tasks for independent solution (in the classroom)

1. A system containing a zero vector is linearly dependent.

2. A system consisting of a single vector a is linearly dependent if and only if a = 0.

3. A system consisting of two vectors is linearly dependent if and only if the vectors are proportional (that is, one of them is obtained from the other by multiplying by a number).

4. If a vector is added to a linearly dependent system, then a linearly dependent system is obtained.

5. If a vector is removed from a linearly independent system, then the resulting system of vectors is linearly independent.

6. If a system S is linearly independent but becomes linearly dependent when a vector b is added, then the vector b is linearly expressed in terms of the vectors of the system S.

c). The system of matrices in the space of matrices of the second order.

10. Let a system of vectors a, b, c of a vector space be linearly independent. Prove the linear independence of the following systems of vectors:

a) a + b, b, c.

b) a + λb, b, c, where λ is an arbitrary number.

c) a + b, a + c, b + c.

11. Let a, b, c be three vectors in the plane from which a triangle can be formed. Will these vectors be linearly dependent?

12. Given two vectors a1 = (1, 2, 3, 4) and a2 = (0, 0, 0, 1), choose two more four-dimensional vectors a3 and a4 so that the system a1, a2, a3, a4 is linearly independent.
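For Task 12, one possible (by no means unique) choice can be verified mechanically; a sketch:

```python
import numpy as np

# One possible choice for Task 12 (not unique): add two unit vectors.
a1 = np.array([1, 2, 3, 4])
a2 = np.array([0, 0, 0, 1])
a3 = np.array([0, 1, 0, 0])
a4 = np.array([0, 0, 1, 0])

A = np.column_stack([a1, a2, a3, a4])
print(np.linalg.matrix_rank(A) == 4)  # True: the system is linearly independent
```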