The multiplication theorem for probabilities of independent events. The probability of a product of events

It often happens that the probability of some event can be found by knowing the probabilities of other events associated with this event.

The addition theorem for probabilities.

Theorem 2.6 (Addition theorem). The probability of the sum (union; the occurrence of at least one of the events, no matter which) of two arbitrary events is equal to the sum of the probabilities of these events minus the probability of their joint occurrence: P(A + B) = P(A) + P(B) - P(AB).

Corollary 1. The probability of the sum (union) of pairwise incompatible events is equal to the sum of their probabilities: P(A1 + A2 + ... + An) = P(A1) + P(A2) + ... + P(An).

Corollary 2. Let A1, A2, ..., An be a complete group of pairwise incompatible events. Then P(A1) + P(A2) + ... + P(An) = 1.

Corollary 3. The sum of the probabilities of opposite events is equal to one: P(A) + P(Ā) = 1.

Example 2.10. An urn contains 5 white, 6 black and 9 red balls. What is the probability that the first ball drawn at random is black or red?

Solution. There are 20 elementary outcomes in total, of which 6 favor the appearance of a black ball and 9 favor the appearance of a red one. Therefore the probability of the event A (a black ball appears) is P(A) = 6/20, and the probability of the event B (a red ball appears) is P(B) = 9/20. Since the events A and B are incompatible (only one ball is drawn), P(A + B) = P(A) + P(B) = 6/20 + 9/20 = 0.75. Answer: 0.75.
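The arithmetic of this example can be checked with a minimal Python sketch using exact fractions (the urn counts are taken from the problem statement):

```python
from fractions import Fraction

# Urn: 5 white, 6 black, 9 red balls (20 in total).
# A = "a black ball is drawn", B = "a red ball is drawn".
# A and B are incompatible, so P(A + B) = P(A) + P(B).
p_black = Fraction(6, 20)
p_red = Fraction(9, 20)
p_black_or_red = p_black + p_red
print(float(p_black_or_red))  # 0.75
```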

The conditional probability of an event B, written P_A(B), is the probability of B calculated under the assumption that the event A has already occurred. If A and B are independent events, then P_A(B) = P(B) and P_B(A) = P(A).

The probability multiplication theorem.

Theorem 2.7 (Multiplication theorem). The probability of the product (intersection; joint occurrence) of two arbitrary events is equal to the product of the probability of one of them by the conditional probability of the other, calculated under the condition that the first event has already occurred: P(AB) = P(A)·P_A(B) = P(B)·P_B(A).

Example 2.11. There are 11 popular-science books and 5 works of fiction on a shelf. What is the probability that two books taken at random, one after another, will both be fiction?

Solution. Consider two events: B1 (a fiction book is taken on the first draw) and B2 (a fiction book is taken on the second draw). By Theorem 2.7, the probability of this is P(B1B2) = P(B1)·P_B1(B2). The probability of the event B1 is P(B1) = 5/16. After the first draw, 15 books remain on the shelf, of which 4 are fiction, so the conditional probability is P_B1(B2) = 4/15. Hence the desired probability is P(B1B2) = (5/16)·(4/15) = 1/12. Answer: 1/12.
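The same computation with exact fractions (counts from the shelf problem):

```python
from fractions import Fraction

# 16 books, 5 of them fiction; two are drawn without replacement.
p_first = Fraction(5, 16)         # P(B1)
p_second_given = Fraction(4, 15)  # P_B1(B2): 4 fiction books of 15 remain
p_both = p_first * p_second_given
print(p_both)  # 1/12
```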


Corollary 1. The probability of the joint occurrence of several events is equal to the product of the probability of one of them by the conditional probabilities of all the others, where the probability of each subsequent event is calculated under the condition that all preceding events have already occurred: P(A1·A2·...·An) = P(A1)·P_A1(A2)·P_A1A2(A3)·...·P_A1A2...An-1(An).

Example 2.12. The word "MATEMATIKA" is composed of ten letter cards. A schoolboy picks four cards at random, one after another, and lays them out in a row. What is the probability that the word "TEMA" appears?

Solution. Introduce the events A1, A2, A3, A4, meaning that the first chosen letter is T, the second is E, the third is M and the fourth is A. We need to find the probability of the product of these events. By Corollary 1 of Theorem 2.7 we have:

P(A1·A2·A3·A4) = P(A1)·P_A1(A2)·P_A1A2(A3)·P_A1A2A3(A4) = (2/10)·(1/9)·(2/8)·(3/7) = 1/420. Answer: 1/420.
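Brute-force enumeration of all ordered four-card draws confirms the answer; this sketch lists the ten cards of "MATEMATIKA" and counts the draws that spell "TEMA":

```python
from fractions import Fraction
from itertools import permutations

cards = list("MATEMATIKA")  # 3 A, 2 M, 2 T, 1 E, 1 I, 1 K
draws = list(permutations(range(len(cards)), 4))  # all ordered 4-card draws
favorable = sum(1 for d in draws if "".join(cards[i] for i in d) == "TEMA")
print(Fraction(favorable, len(draws)))  # 1/420
```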

Corollary 2. If A1, A2, ..., An are independent events, then the probability of their product (joint occurrence) is equal to the product of the probabilities of these events: P(A1·A2·...·An) = P(A1)·P(A2)·...·P(An).

Example 2.13. Two shooters, independently of one another, fire one shot at the same target. The probability of hitting the target by the first shooter is 0.7, by the second - 0.8. What is the probability that the target will be hit?

Solution. Let the event A be that the target was hit by the first shooter, and the event B that it was hit by the second. By the condition, P(A) = 0.7 and P(B) = 0.8.

1st way. Consider the opposite events: Ā (the first shooter misses) and B̄ (the second shooter misses). By Corollary 3 of Theorem 2.6 we obtain P(Ā) = 1 - 0.7 = 0.3 and P(B̄) = 1 - 0.8 = 0.2. The product ĀB̄ means that both shooters miss. By the meaning of the problem, the events A and B are independent, and therefore the opposite events Ā and B̄ are also independent. By Corollary 2 of Theorem 2.7, the probability that both shooters miss is P(ĀB̄) = 0.3·0.2 = 0.06. We are interested in the probability of the opposite event, namely that the target is hit. By Corollary 3 of Theorem 2.6 the desired probability is 1 - 0.06 = 0.94.

2nd way. The desired event (the target is hit by at least one shooter) is the sum of the events A and B. By Theorem 2.6, P(A+B) = P(A) + P(B) - P(AB) = 0.7 + 0.8 - 0.7·0.8 = 1.5 - 0.56 = 0.94. Answer: 0.94.
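Both ways of Example 2.13 are easy to check numerically (a minimal Python sketch):

```python
p1, p2 = 0.7, 0.8  # hit probabilities of the two shooters

# 1st way: complement of "both miss"
way1 = 1 - (1 - p1) * (1 - p2)
# 2nd way: addition theorem for arbitrary events
way2 = p1 + p2 - p1 * p2
print(round(way1, 2), round(way2, 2))  # 0.94 0.94
```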

Example 2.14. There are 25 people in a student group. What is the probability that at least two of them have the same birthday?

Solution. The probability that the birthdays of two randomly chosen people coincide is 1/365 (we assume that a birthday is equally likely to fall on any day of the year). Then the probability that the birthdays of two people do not coincide, i.e. the probability of the opposite event, is 1 - 1/365 = 364/365. The probability that the birthday of a third person differs from the birthdays of the previous two is 363/365 (363 cases out of 365 favor this event). Arguing similarly, we find that for the 25th member of the group this probability is 341/365. Next, we find the probability that the birthdays of all 25 members of the group are distinct. Since all these events (the non-coincidence of each next member's birthday with the birthdays of the previous ones) are independent, by Corollary 2 of Theorem 2.7 we get:

P(A2·A3·...·A25) = (364/365)·(363/365)·...·(341/365) ≈ 0.43.

This is the probability that all 25 people have different birthdays. The opposite event is that at least two people share a birthday, so the desired probability is P ≈ 1 - 0.43 = 0.57. Answer: ≈ 0.57.
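The birthday computation is tedious by hand but trivial in code; this sketch multiplies the 24 factors directly:

```python
p_all_distinct = 1.0
for k in range(1, 25):  # 2nd through 25th person
    p_all_distinct *= (365 - k) / 365
print(round(p_all_distinct, 2))      # 0.43
print(round(1 - p_all_distinct, 2))  # 0.57
```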

The total probability formula.

Theorem 2.8. Let B1, B2, ..., Bn be a complete group of pairwise incompatible events. The probability of an event A, which can occur only together with one of the events B1, B2, ..., Bn, is equal to the sum of the products of the probabilities of each of these events by the corresponding conditional probability of the event A:

P(A) = P(B1)·P_B1(A) + P(B2)·P_B2(A) + ... + P(Bn)·P_Bn(A).

This formula is called the total probability formula. The events B1, B2, ..., Bn satisfying the conditions of Theorem 2.8 are called hypotheses.

Example 2.15. A tourist chooses one of three routes with equal probability: a horseback route, a water route or a mountain route. The probability that he successfully completes the trip is 0.75 for the horseback route, 0.8 for the water route and 0.55 for the mountain route. Find the probability that the tourist successfully completes the trip regardless of the choice of route.

Solution. Introduce the events: A ("the tourist successfully completes the trip, whatever the choice of route") and B1, B2, B3 (the horseback, water and mountain routes are chosen, respectively). Since the choice of route is equiprobable, the probabilities of choosing each route are P(B1) = P(B2) = P(B3) = 1/3. By the condition, P_B1(A) = 0.75, P_B2(A) = 0.8, P_B3(A) = 0.55. Then, by the total probability formula: P(A) = P(B1)·P_B1(A) + P(B2)·P_B2(A) + P(B3)·P_B3(A) = (1/3)·0.75 + (1/3)·0.8 + (1/3)·0.55 = 0.7.

Answer: 0.7.
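A short sketch of Example 2.15 with the total probability formula:

```python
# Hypotheses: horseback, water, mountain route, chosen with equal probability.
priors = [1/3, 1/3, 1/3]
cond = [0.75, 0.8, 0.55]  # P_Bi(A) from the problem statement
p_success = sum(p * c for p, c in zip(priors, cond))
print(round(p_success, 2))  # 0.7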
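A short sketch of Example 2.15 with the total probability formula:

```python
# Hypotheses: horseback, water, mountain route, chosen with equal probability.
priors = [1/3, 1/3, 1/3]
cond = [0.75, 0.8, 0.55]  # P_Bi(A) from the problem statement
p_success = sum(p * c for p, c in zip(priors, cond))
print(round(p_success, 2))  # 0.7
```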

Theorem 2.9. The conditional probability of any hypothesis Bi (i = 1, 2, ..., n) is calculated by the Bayes formula:

P_A(Bi) = P(Bi)·P_Bi(A) / P(A),

where P(A) is found by the total probability formula.

The Bayes formula allows one to re-estimate the probabilities of the hypotheses after the result of the trial in which the event A appeared becomes known.

Example 2.16. There are three batches of microchips, the first of which contains 100, the second 300 and the third 600 chips. The probability that a chip taken at random from the first batch is good is 0.9; for the second and third batches it is 0.85 and 0.8, respectively. What is the probability that: a) an arbitrarily taken microchip is good; b) a good microchip was taken from the second batch?

Solution. a) Here there are three hypotheses, whose probabilities are P(B1) = 0.1, P(B2) = 0.3, P(B3) = 0.6. Using the total probability formula, we find P(A) = P(B1)·P_B1(A) + P(B2)·P_B2(A) + P(B3)·P_B3(A) = 0.1·0.9 + 0.3·0.85 + 0.6·0.8 = 0.825.

b) Suppose the event A has occurred: a good microchip was taken. Let us find the probability P_A(B2) that this chip was taken from the second batch. By the Bayes formula, P_A(B2) = P(B2)·P_B2(A) / P(A) = 0.3·0.85 / 0.825 = 17/55 ≈ 0.31.

Answer: a) 0.825; b) 17/55.
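Example 2.16 in code, with exact fractions so that the answer 17/55 appears verbatim:

```python
from fractions import Fraction

priors = [Fraction(1, 10), Fraction(3, 10), Fraction(6, 10)]  # batch sizes 100/300/600 of 1000
p_good = [Fraction(9, 10), Fraction(17, 20), Fraction(4, 5)]  # 0.9, 0.85, 0.8
p_a = sum(p * g for p, g in zip(priors, p_good))              # total probability formula
posterior_2 = priors[1] * p_good[1] / p_a                     # Bayes formula
print(float(p_a))   # 0.825
print(posterior_2)  # 17/55
```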

Example 2.17. Of the 10 students who came to a mathematics exam, three prepared excellently, four well, two satisfactorily, and one did not prepare at all. The tickets contain 20 questions. The excellently prepared students can answer all 20 questions, the well prepared 16 questions, the satisfactorily prepared 10, and the unprepared 5 questions. Each student receives 3 of the 20 questions at random. The first student invited answered all 3 questions. What is the probability that he is an excellently prepared student?

Solution. Introduce the hypotheses B1, B2, B3, B4: the student prepared excellently, well, satisfactorily, or not at all; P(B1) = 0.3, P(B2) = 0.4, P(B3) = 0.2, P(B4) = 0.1. The conditional probabilities of answering all 3 questions are P_B1(A) = 1, P_B2(A) = C(16,3)/C(20,3) = 560/1140, P_B3(A) = C(10,3)/C(20,3) = 120/1140, P_B4(A) = C(5,3)/C(20,3) = 10/1140. By the total probability formula, P(A) = 0.3·1 + 0.4·(560/1140) + 0.2·(120/1140) + 0.1·(10/1140) ≈ 0.518. We need P_A(B1). By the Bayes formula, P_A(B1) = P(B1)·P_B1(A) / P(A) = 0.3/0.518 ≈ 0.58.

As you can see, the desired probability is relatively small, so the teacher will have to ask the student a few more additional questions. Answer: ≈ 0.58.
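A sketch of Example 2.17 using binomial coefficients:

```python
from fractions import Fraction
from math import comb

priors = [Fraction(3, 10), Fraction(4, 10), Fraction(2, 10), Fraction(1, 10)]
questions_known = [20, 16, 10, 5]
# P_Bi(A): probability that all 3 drawn questions are among the known ones
cond = [Fraction(comb(k, 3), comb(20, 3)) for k in questions_known]
p_a = sum(p * c for p, c in zip(priors, cond))  # total probability
posterior = priors[0] * cond[0] / p_a           # Bayes formula
print(round(float(posterior), 2))  # 0.58
```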

The multiplication theorem for the probabilities of two arbitrary events: the probability of the product of two arbitrary events is equal to the product of the probability of one of the events by the conditional probability of the other event, given that the first has already occurred:

P(AB) = P(A)·P(B|A) = P(B)·P(A|B). (10)

Proof (not rigorous): we prove the multiplication theorem for the scheme of chances (equiprobable outcomes). Let the experiment have n possible chances. Suppose event A is favored by m chances (shaded in Fig. 11), event B by k chances, and the joint event AB by l chances (lightly shaded in Fig. 11).

Figure 11

Obviously, the sum A + B is favored by m + k - l ≤ n chances. By the classical way of calculating probabilities, P(AB) = l/n, P(A) = m/n, P(B) = k/n. The conditional probability is P(B|A) = l/m, since it is known that one of the m chances of event A has occurred, and event B is favored by l of those chances. Substituting these expressions into theorem (10), we obtain the identity l/n = (m/n)·(l/m). The theorem is proved.

The multiplication theorem for three arbitrary events:

P(ABC) = |denote AB = D| = P(DC) = P(D)·P(C|D) = P(AB)·P(C|AB) = P(A)·P(B|A)·P(C|AB). (11)

By analogy, one can write the multiplication theorem for any larger number of events.

Corollary 1. If event A does not depend on B, then event B does not depend on A either.

Proof. Since event A does not depend on B, by the definition of independence of events P(A) = P(A|B) = P(A|B̄). It is required to prove that P(B) = P(B|A).

By the multiplication theorem, P(AB) = P(A)·P(B|A) = P(B)·P(A|B); therefore P(A)·P(B|A) = P(B)·P(A|B) = P(B)·P(A). Assuming that P(A) > 0, we divide both sides of the equality by P(A) and get P(B) = P(B|A).

Corollary 1 implies that two events are independent if the occurrence of one of them does not change the probability of the occurrence of the other. In practice, events (phenomena) that are interconnected by a causal relationship are dependent.

Corollary 2. The probability of the product of two independent events is equal to the product of the probabilities of these events. That is, if events A and B are independent, then

P(AB) = P(A)·P(B). (11)

The proof is obvious, since for independent events P(B|A)=P(B).

Identity (11), together with expressions (12) and (13), gives necessary and sufficient conditions for the independence of two random events A and B.

P(A) = P(A|B); P(A) = P(A|B̄); P(A|B) = P(A|B̄); (12)

P(B) = P(B|A); P(B) = P(B|Ā); P(B|A) = P(B|Ā). (13)

Example 20. The reliability of a system is increased by double redundancy (see Fig. 12). The probability of failure-free operation of the first subsystem (over some operating time) is 0.9, and of the second 0.8. Determine the probability of failure-free operation of the system as a whole over the given operating time, if the failures of the subsystems are independent.

Figure 12 - Double redundant system

E: Reliability study of a doubly redundant control system;

A1 = (failure-free operation (over some operating time) of the first subsystem); P(A1) = 0.9;

A2 = (failure-free operation of the second subsystem); P(A2) = 0.8;

A = (failure-free operation of the system as a whole); P(A) = ?

Solution. Let us express the event A in terms of the events A1 and A2, whose probabilities are known. Since failure-free operation of at least one of the subsystems is sufficient for failure-free operation of the system, clearly A = A1 + A2.

Applying the addition theorem, we get: P(A) = P(A1 + A2) = P(A1) + P(A2) - P(A1A2). The probability of the joint occurrence of the events A1 and A2 is determined by the multiplication theorem: P(A1A2) = P(A1)·P(A2|A1). Since (by the condition) the events A1 and A2 are independent, P(A1A2) = P(A1)·P(A2). Thus, the probability of failure-free operation of the system is P(A) = P(A1) + P(A2) - P(A1)·P(A2) = 0.9 + 0.8 - 0.9·0.8 = 0.98.

Answer: the probability of failure-free operation of the system during a given operating time is 0.98.

Comment. In Example 20, another way of defining the event A through the events A1 and A2 is possible: Ā = Ā1·Ā2, i.e. a system failure occurs only when both of its subsystems fail simultaneously. Applying the multiplication theorem for independent events, we obtain the probability of system failure P(Ā) = P(Ā1)·P(Ā2) = 0.1·0.2 = 0.02. Therefore, the probability of failure-free operation of the system over the given operating time is P(A) = 1 - 0.02 = 0.98.
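Both definitions of the event A in Example 20 give the same number; a quick numerical check:

```python
p1, p2 = 0.9, 0.8  # failure-free operation of the subsystems

via_addition = p1 + p2 - p1 * p2          # A = A1 + A2
via_complement = 1 - (1 - p1) * (1 - p2)  # complement of "both subsystems fail"
print(round(via_addition, 2), round(via_complement, 2))  # 0.98 0.98
```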

Example 21 (the paradox of independence)

E: Two coins are tossed.

A = (heads on the first coin), P(A) = 0.5;

B = (heads on the second coin), P(B) = 0.5;

C = (heads on exactly one of the coins), P(C) = 0.5.

Events A, B and C are pairwise independent, since the conditions for the independence of two events (11)-(13) are satisfied:

P(A)=P(A|B)=0.5; P(B)=P(B|C)=0.5; P(C)=P(C|A)=0.5.

However, P(A|BC) = 0 ≠ P(A); P(A|B̄C) = 1 ≠ P(A); P(B|AC) = 0 ≠ P(B); P(C|AB) = 0 ≠ P(C).

Comment. Pairwise independence of random events does not mean their independence in the aggregate.

Random events are independent in the aggregate if the probability of the occurrence of each of them is not changed by the occurrence of any combination of the other events. For random events A1, A2, ..., An that are independent in the aggregate, the following multiplication theorem is valid (a necessary and sufficient condition for the independence in the aggregate of n random events):

P(A1A2...An) = P(A1)·P(A2)·...·P(An). (14)

For Example 21, condition (14) is not satisfied: P(ABC) = 0 ≠ P(A)·P(B)·P(C) = 0.5·0.5·0.5 = 0.125. Therefore, the pairwise independent events A, B and C are dependent in the aggregate.
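Example 21 can be verified by enumerating the four equiprobable outcomes of tossing two coins; the sketch below checks pairwise independence and the failure of condition (14):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=2))  # HH, HT, TH, TT - equiprobable

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] == "H"                     # heads on the first coin
B = lambda o: o[1] == "H"                     # heads on the second coin
C = lambda o: (o[0] == "H") != (o[1] == "H")  # heads on exactly one coin

# Pairwise independence holds:
assert prob(lambda o: A(o) and B(o)) == prob(A) * prob(B)
assert prob(lambda o: A(o) and C(o)) == prob(A) * prob(C)
assert prob(lambda o: B(o) and C(o)) == prob(B) * prob(C)

# ...but independence in the aggregate fails:
print(prob(lambda o: A(o) and B(o) and C(o)))  # 0
print(prob(A) * prob(B) * prob(C))             # 1/8
```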

Example 22

There are 12 transistors in the box, three of which are faulty. To assemble a two-stage amplifier, two transistors are randomly removed. What is the probability that the assembled amplifier will be faulty?

E: selection of two transistors from the box with 9 good and 3 bad transistors;

A = (the assembled amplifier is faulty); P(A) = ?

Solution. Obviously, the assembled two-stage amplifier will be faulty if at least one of the two transistors selected for assembly is faulty. Therefore, we redefine event A as follows:

A=(at least one of the two selected transistors is faulty);

Let us define the following auxiliary random events:

A01 = (only the first of the two selected transistors is faulty);

A10 = (only the second of the two selected transistors is faulty);

A00 = (both selected transistors are faulty);

It is obvious that A = A01 + A10 + A00 (for the event A to occur, at least one of the events A01, A10 or A00 must occur), and the events A01, A10 and A00 are incompatible (they cannot occur together), so we find the probability of the event A by the addition theorem for incompatible events:

P(A) = P(A01 + A10 + A00) = P(A01) + P(A10) + P(A00).

To determine the probabilities of the events A01, A10 and A00, we introduce auxiliary events:

B1 = (the first selected transistor is faulty);

B2 = (the second selected transistor is faulty).

It is obvious that A01 = B1·B̄2, A10 = B̄1·B2, A00 = B1·B2; therefore, to determine the probabilities of the events A01, A10 and A00, we apply the multiplication theorem.

P(A01) = P(B1B̄2) = P(B1)·P(B̄2|B1),

where P(B1) is the probability that the first selected transistor is faulty, and P(B̄2|B1) is the probability that the second selected transistor is good, given that the first selected one is faulty. By the classical way of calculating probabilities, P(B1) = 3/12 and P(B̄2|B1) = 9/11 (after choosing the first faulty transistor, 11 transistors remain in the box, 9 of which are good).

Thus, P(A01) = P(B1B̄2) = P(B1)·P(B̄2|B1) = (3/12)·(9/11) = 27/132 ≈ 0.2045. Similarly:

P(A10) = P(B̄1B2) = P(B̄1)·P(B2|B̄1) = (9/12)·(3/11) = 27/132 ≈ 0.2045;

P(A00) = P(B1B2) = P(B1)·P(B2|B1) = (3/12)·(2/11) = 6/132 ≈ 0.0455.

Let us substitute the obtained values of the probabilities of A01, A10 and A00 into the expression for the probability of the event A:

P(A) = P(A01) + P(A10) + P(A00) = 27/132 + 27/132 + 6/132 = 60/132 = 5/11 ≈ 0.4545.

Answer: the probability that the assembled amplifier is faulty is 5/11 ≈ 0.4545.
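Example 22 with exact fractions; the three incompatible cases are summed just as in the text:

```python
from fractions import Fraction

bad, good = 3, 9
n = bad + good  # 12 transistors in the box

p_01 = Fraction(bad, n) * Fraction(good, n - 1)     # only the first is faulty
p_10 = Fraction(good, n) * Fraction(bad, n - 1)     # only the second is faulty
p_00 = Fraction(bad, n) * Fraction(bad - 1, n - 1)  # both are faulty
p_faulty = p_01 + p_10 + p_00
print(p_faulty)                   # 5/11
print(round(float(p_faulty), 4))  # 0.4545
```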

Let A and B be two events considered in the same trial. The occurrence of one of them may affect the possibility of the occurrence of the other: for example, the occurrence of the event A can influence the event B, or vice versa. To take such dependence of some events on others into account, the concept of conditional probability is introduced.

Definition. If the probability of an event B is found under the condition that the event A has occurred, then the resulting probability of the event B is called the conditional probability of the event B. The following notations are used for this conditional probability: P_A(B) or P(B/A).

Remark 2. In contrast to the conditional probability, the "unconditional" probability is also considered, when no conditions on the occurrence of the event B are imposed.

Example. An urn contains 5 balls, 3 of which are red and 2 are blue. Balls are drawn from it one at a time, with replacement and without replacement. Find the conditional probability of drawing a red ball the second time, given that the first ball drawn was: a) red; b) blue.

Let the event A be drawing a red ball the first time, and the event B drawing a red ball the second time. Obviously, P(A) = 3/5; then, in the case when the first drawn ball is returned to the urn, P(B) = 3/5. In the case when the drawn ball is not returned, the probability of drawing a red ball P(B) depends on which ball was drawn first: red (the event A) or blue (the event Ā). Then in the first case P_A(B) = 2/4, and in the second P_Ā(B) = 3/4.

The multiplication theorem for the probabilities of events, one of which occurs under the condition of the other

The probability of the product of two events is equal to the product of the probability of one of them by the conditional probability of the other, found under the assumption that the first event occurred:

P(A·B) = P(A)·P_A(B). (1.7)

Proof. Indeed, let n be the total number of equally possible and incompatible (elementary) outcomes of the trial. Let n1 be the number of outcomes that favor the event A, which occurs first, and m the number of outcomes in which the event B occurs, assuming that the event A has occurred. Thus, m is the number of outcomes that favor the event A·B. Then we get:

P(A·B) = m/n = (n1/n)·(m/n1) = P(A)·P_A(B).

That is, the probability of the product of several events is equal to the product of the probability of one of these events by the conditional probabilities of the others, where the conditional probability of each subsequent event is calculated on the assumption that all the previous events have occurred:

P(A1·A2·...·An) = P(A1)·P_A1(A2)·...·P_A1A2...An-1(An). (1.9)

Example. There are 4 masters of sports in a team of 10 athletes. By drawing lots, 3 athletes are selected from the team. What is the probability that all the selected athletes are masters of sports?

Solution. Let us reduce the problem to the "urn" model: assume that an urn contains 10 balls, 4 red and 6 white. Three balls are drawn at random from this urn (sample size s = 3). Let the event A consist in drawing 3 red balls (3 masters of sports). The problem can be solved in two ways: by the classical scheme and by formula (1.9).

The first method is based on the combinatorics formula:

P(A) = C(4,3) / C(10,3) = 4/120 = 1/30.

The second method (by formula (1.9)). Three balls are drawn from the urn one after another without replacement. Let A1 mean that the first ball drawn is red, A2 that the second is red, and A3 that the third is red. Let also the event A mean that all 3 drawn balls are red. Then A = A1·A2·A3, i.e.

P(A) = P(A1)·P_A1(A2)·P_A1A2(A3) = (4/10)·(3/9)·(2/8) = 1/30.
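Both methods of the athletes example agree; a sketch with exact fractions:

```python
from fractions import Fraction
from math import comb

# method 1: classical scheme (combinations)
p_comb = Fraction(comb(4, 3), comb(10, 3))
# method 2: sequential multiplication without replacement
p_seq = Fraction(4, 10) * Fraction(3, 9) * Fraction(2, 8)
print(p_comb, p_seq)  # 1/30 1/30
```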

Example. From a set of cards bearing the letters a, a, r, b, o, t, cards are drawn one at a time. What is the probability of obtaining the word "rabota" (work) when they are laid out sequentially in one line from left to right?

Let B be the event that the stated word is obtained. Then by formula (1.9) we get:

P(B) = (1/6)·(2/5)·(1/4)·(1/3)·(1/2)·(1/1) = 1/360.
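The same sequential multiplication, done programmatically over the multiset of cards:

```python
from fractions import Fraction

cards = list("aarbot")
word = "rabota"
p = Fraction(1)
remaining = cards[:]
for letter in word:
    # probability of drawing the needed letter from what is left
    p *= Fraction(remaining.count(letter), len(remaining))
    remaining.remove(letter)
print(p)  # 1/360
```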

The probability multiplication theorem takes on its simplest form when the product is formed by events independent of each other.

Definition. An event B is called independent of an event A if its probability does not change regardless of whether the event A has occurred or not. Two events are called independent (dependent) if the occurrence of one of them does not change (changes) the probability of the occurrence of the other. Thus, for independent events P(B/A) = P(B), and for dependent events P(B/A) ≠ P(B).

When assessing the probability of the occurrence of any random event, it is very important to understand in advance whether the probability of the event of interest depends on how other events develop. In the classical scheme, when all outcomes are equally probable, we can already estimate the probability of an individual event of interest on our own. We can do this even if the event is a complex collection of several elementary outcomes. But what if several random events occur simultaneously or sequentially? How does this affect the probability of the event of interest?

If I roll a die a few times and want a six, and I am unlucky every time, does that mean I should increase my bet, because, according to probability theory, I am about to get lucky? Alas, probability theory says nothing of the sort. Neither dice, nor cards, nor coins can remember what they showed us last time. It does not matter to them at all whether it is the first or the tenth time today that I test my fate. Every time I roll again, I know only one thing: this time, again, the probability of rolling a six is one sixth. Of course, this does not mean that the number I need will never come up. It only means that my result after the first toss and after any other toss are independent events.

Events A and B are called independent if the realization of one of them does not affect the probability of the other event in any way. For example, the probability of hitting a target with the first of two guns does not depend on whether the other gun hit the target, so the events "the first gun hit the target" and "the second gun hit the target" are independent. If two events A and B are independent and the probability of each of them is known, then the probability of the simultaneous occurrence of both event A and event B (denoted AB) can be calculated using the following theorem.

Probability multiplication theorem for independent events

P(AB) = P(A)·P(B): the probability of the simultaneous occurrence of two independent events is equal to the product of the probabilities of these events.

Example 1. The probabilities of hitting the target when firing the first and second guns are respectively p1 = 0.7 and p2 = 0.8. Find the probability of a hit by both guns simultaneously in one salvo.

As we have already seen, the events A (a hit by the first gun) and B (a hit by the second gun) are independent, i.e. P(AB) = P(A)·P(B) = p1·p2 = 0.56. What happens to our estimates if the initial events are not independent? Let us change the previous example a little.

Example 2. Two shooters in a competition shoot at targets, and if one of them shoots accurately, the opponent starts to get nervous and his results worsen. How can this everyday situation be turned into a mathematical problem, and how can we outline ways to solve it? It is intuitively clear that we must somehow separate the two scenarios and compose, in effect, two different problems. In the first case, if the opponent misses, the scenario is favorable for the nervous athlete and his accuracy will be higher. In the second case, if the opponent has decently realized his chance, the probability of hitting the target for the second athlete decreases. To separate the possible scenarios (they are often called hypotheses) of the development of events, we will often use the "probability tree" scheme. This diagram is similar in meaning to the decision tree, which you have probably already dealt with. Each branch is a separate scenario for the development of events, only now it has its own value of the so-called conditional probability (q1, q2, 1 - q1, 1 - q2).

This scheme is very convenient for analyzing sequential random events. It remains to clarify one more important question: where do the initial values of the probabilities come from in real situations? After all, probability theory does not work only with coins and dice. Usually these estimates are taken from statistics, and when statistics are not available, we conduct our own research. And we often have to start it not with collecting data, but with the question of what information we need at all.

Example 3. In a city of 100,000 inhabitants, suppose we need to estimate the size of the market for a new non-essential product, for example a conditioner for color-treated hair. Let us consider the "probability tree" scheme. In this case, we need to roughly estimate the value of the probability on each "branch". So, our estimates of the market capacity:

1) 50% of all residents of the city are women,

2) of all women, only 30% dye their hair often,

3) of these, only 10% use balms for colored hair,

4) of these, only 10% can muster up the courage to try a new product,

5) 70% of them usually buy everything not from us, but from our competitors.


According to the law of multiplication of probabilities, we determine the probability of the event of interest A = (a city resident buys this new balm from us) = 0.5·0.3·0.1·0.1·0.3 = 0.00045. Multiplying this probability value by the number of inhabitants of the city, we get only 45 potential buyers, and given that one vial of this product lasts for several months, the trade will not be very lively. Still, our estimates are useful. First, we can compare the forecasts of different business ideas: they will have different "forks" on the diagrams, and of course the probability values will also differ. Second, as we have already said, a random variable is not called random because it does not depend on anything at all; it is just that its exact value is not known in advance. We know that the average number of buyers can be increased (for example, by advertising the new product). So it makes sense to focus on those "forks" where the distribution of probabilities does not particularly suit us, on those factors that we are able to influence. Let us consider another quantitative example of consumer behavior research.
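The market-capacity funnel is just a chain of multiplications; a minimal sketch (the shares are the rough estimates listed above, with the last factor being the 30% who do not buy from competitors):

```python
population = 100_000
# shares: women; dye hair often; use balm for colored hair;
# will try a novelty; do NOT buy from competitors (100% - 70%)
funnel = [0.5, 0.3, 0.1, 0.1, 0.3]

p_buyer = 1.0
for share in funnel:
    p_buyer *= share
print(round(p_buyer, 5))            # 0.00045
print(round(population * p_buyer))  # 45
```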

Example 4. An average of 10,000 people visit the food market per day. The probability that a market visitor walks into the dairy pavilion is 1/2. It is known that in this pavilion, on average, 500 kg of various products are sold per day. Can it be argued that the average purchase in the pavilion weighs only 100 g?

Discussion.

Of course not. It is clear that not everyone who entered the pavilion ended up buying something there.


As shown in the diagram, to answer the question about the average purchase weight we must find the probability that a person who enters the pavilion buys something there. If we do not have such data at our disposal but need them, we will have to obtain them ourselves by observing the visitors of the pavilion for some time. Suppose our observations show that only a fifth of the visitors to the pavilion buy something. Once these estimates are obtained, the task becomes simple: of the 10,000 people who come to the market, 5,000 will go into the dairy pavilion, and there will be only 1,000 purchases. The average purchase weight is 500 grams. It is interesting to note that in order to build a complete picture of what is happening, the logic of conditional "branching" must be defined at each stage of our reasoning as clearly as if we were working with a "concrete" situation and not with probabilities.
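The pavilion arithmetic as a tiny sketch (the one-fifth purchase share is the observed estimate from the discussion):

```python
visitors_per_day = 10_000
p_enter_pavilion = 0.5  # from the problem statement
p_buy_if_entered = 0.2  # observed: a fifth of those who enter buy something
sold_kg_per_day = 500

purchases = visitors_per_day * p_enter_pavilion * p_buy_if_entered
avg_purchase_g = sold_kg_per_day * 1000 / purchases
print(int(purchases), avg_purchase_g)  # 1000 500.0
```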

Tasks for self-examination.

1. Consider an electrical circuit consisting of n elements connected in series, each of which operates independently of the others. The probability p of failure-free operation of each element is known. Determine the probability of proper operation of the entire section of the circuit (event A).


2. A student knows 20 of the 25 exam questions. Find the probability that the student knows all three questions given to him by the examiner.

3. Production consists of four successive stages, each of which uses equipment whose probabilities of failure within the next month are, respectively, p1, p2, p3 and p4. Find the probability that in a month there will be no stoppage of production due to equipment failure.

We already know that probability is a numerical measure of the possibility of the occurrence of a random event, i.e. an event that may or may not occur under a certain set of conditions. When the set of conditions changes, the probability of a random event may change. As an additional condition, we can consider the occurrence of another event. So, if to the set of conditions under which a random event A occurs we add one more, consisting in the occurrence of a random event B, then the probability of the occurrence of the event A will be called conditional.

The conditional probability of an event A is the probability that the event A will occur given that the event B has occurred. The conditional probability is denoted P_B(A).

Example 16. A box contains 7 white and 5 black balls, differing only in color. The experiment consists in randomly taking out one ball and then, without putting it back, taking out another. What is the probability that the second ball drawn is black if the first ball drawn was white?

Solution.

We have two random events: the event BUT- the first ball drawn is white, IN– the second drawn ball is black. A and B are incompatible events, let's use the classical definition of probability. The number of elementary outcomes when drawing the first ball is 12, and the number of favorable outcomes to get the white ball is 7. Therefore, the probability P(A) = 7/12.

If the first ball drawn is white, then the conditional probability of event B - the second ball drawn being black (given that the first was white) - equals P A (B) = 5/11, since 11 balls remain before the second draw, of which 5 are black.

Note that the probability of a black ball appearing on the second drawing would not depend on the color of the first ball drawn if, having taken the first ball, we put it back into the box.
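The conditional probability 5/11 can also be verified by brute-force enumeration of all ordered two-ball draws (a Python sketch):

```python
from fractions import Fraction
from itertools import permutations

# Box: 7 white and 5 black balls, two drawn in order without replacement.
balls = ['W'] * 7 + ['B'] * 5
pairs = list(permutations(range(12), 2))                 # all ordered draws
first_white = [(i, j) for i, j in pairs if balls[i] == 'W']
white_then_black = [(i, j) for i, j in first_white if balls[j] == 'B']

# Conditional probability P_A(B): among the draws where the first ball
# is white, the fraction in which the second is black.
print(Fraction(len(white_then_black), len(first_white)))  # 5/11
```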

Consider two random events A and B. Let the probabilities P(A) and P(B) be known. Let us determine the probability that both event A and event B occur, i.e. the probability of the product of these events.

Probability multiplication theorem. The probability of the product of two events is equal to the product of the probability of one of them by the conditional probability of the other, calculated under the condition that the first event occurred:

P(A×B) = P(A) × P A (B).

Since, for calculating the probability of a product, it does not matter which of the considered events A and B is taken first and which second, we can write:

P(A×B) = P(A) × P A (B) = P(B) × P B (A).

The theorem can be extended to the product of n events:

P(A 1 A 2 ... A n) = P(A 1) P(A 2 / A 1) ... P(A n / A 1 A 2 ... A n-1).

Example 17. For the conditions of the previous example, calculate the probability of drawing two balls: a) a white ball first, and a black ball second; b) two black balls.

Solution.

a) From the previous example, we know the probability of drawing a white ball first, P(A) = 7/12, and the conditional probability of drawing a black ball second given that the first was white, P A (B) = 5/11. To calculate the probability of both events occurring together, we use the probability multiplication theorem: P(A×B) = P(A) × P A (B) = 7/12 × 5/11 = 35/132 ≈ 0.265.

b) Similarly, we calculate the probability of drawing two black balls. The probability of drawing a black ball first is P(A) = 5/12. The probability of drawing a black ball the second time, given that the first black ball is not returned to the box (4 black balls remain out of 11 in total), is P A (B) = 4/11. The resulting probability: P(A×B) = P(A) × P A (B) = 5/12 × 4/11 = 5/33 ≈ 0.152.
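Both answers follow directly from the multiplication theorem; a short sketch with exact fractions:

```python
from fractions import Fraction

# a) white first, then black: P(A) * P_A(B) = 7/12 * 5/11
p_white_black = Fraction(7, 12) * Fraction(5, 11)
print(p_white_black, float(p_white_black))   # 35/132, about 0.265

# b) two black balls: P(A) * P_A(B) = 5/12 * 4/11
p_black_black = Fraction(5, 12) * Fraction(4, 11)
print(p_black_black, float(p_black_black))   # 5/33, about 0.152
```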

The probability multiplication theorem has a simpler form if the events A and B are independent.

An event B is said to be independent of event A if the probability of event B does not change whether event A occurs or not. If event B is independent of event A, then its conditional probability equals its ordinary probability: P A (B) = P(B).

It turns out that if event B is independent of event A, then event A is also independent of event B, i.e. P B (A) = P(A).

Let us prove it. Substitute the equality P A (B) = P(B), which defines the independence of B from A, into the probability multiplication theorem: P(A×B) = P(A) × P A (B) = P(A) × P(B). But, on the other hand, P(A×B) = P(B) × P B (A). Hence P(A) × P(B) = P(B) × P B (A), and therefore P B (A) = P(A).

Thus, the property of independence (or dependence) of events is always mutual and can be given the following definition: two events are called independent if the occurrence of one of them does not change the probability of occurrence of the other.

It should be noted that the independence of events rests on the independence of the physical nature of their origin: the sets of random factors leading to one or another outcome of each event are different. For example, one shooter's hitting a target does not affect in any way (unless, of course, one invents some exotic mechanism) the probability of the second shooter hitting the target. In practice, independent events occur very often, since a causal connection between phenomena is in many cases absent or insignificant.

Probability multiplication theorem for independent events. The probability of the product of two independent events is equal to the product of the probabilities of these events: P(A×B) = P(A) × P(B).

The following corollary follows from the probability multiplication theorem for independent events.

If events A and B are incompatible and P(A) ≠ 0, P(B) ≠ 0, then they are dependent.

Let us prove this by contradiction. Assume that the incompatible events A and B are independent. Then P(A×B) = P(A) × P(B), and since P(A) ≠ 0, P(B) ≠ 0, i.e. events A and B are not impossible, P(A×B) ≠ 0. But, on the other hand, the event A×B is impossible as a product of incompatible events (this was discussed above), so P(A×B) = 0. We have obtained a contradiction; thus, our initial assumption is incorrect, and events A and B are dependent.

Example 18. Let us now return to the earlier problem of two shooters firing at the same target. Recall that the probability of hitting the target is 0.8 for the first shooter and 0.7 for the second; we must find the probability that the target is hit.

Events A and B - hitting the target by the first and second shooters, respectively - are joint; therefore, to find the probability of the sum A + B - the target being hit by at least one shooter - we must use the formula P(A+B) = P(A) + P(B) - P(A×B). Events A and B are independent, therefore P(A×B) = P(A) × P(B).

So, P(A+B) = P(A) + P(B) - P(A) × P(B).

P(A+B)= 0.8 + 0.7 - 0.8 × 0.7 = 0.94.
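The same calculation in code (the addition theorem for joint events combined with the multiplication theorem for independent ones):

```python
p_a, p_b = 0.8, 0.7                 # hit probabilities of the two shooters
p_hit = p_a + p_b - p_a * p_b       # P(A+B) = P(A) + P(B) - P(A)P(B)
print(round(p_hit, 2))              # 0.94
```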

Example 19.

Two independent shots are fired at the same target. The probability of hitting with the first shot is 0.6, and with the second - 0.8. Find the probability of hitting the target with two shots.

1) Denote the hit on the first shot as event A 1 , and on the second as event A 2 .

Hitting the target involves at least one hit: either only on the first shot, or only on the second, or both on the first and on the second. Therefore, in the problem it is required to determine the probability of the sum of two joint events A 1 and A 2:

P(A 1 + A 2) = P(A 1) + P(A 2) - P(A 1 A 2).

2) Since the events are independent, P(A 1 A 2) = P(A 1) P(A 2).

3) We get: P(A 1 + A 2) = 0.6 + 0.8 - 0.6 × 0.8 = 0.92.
If the events are incompatible, then P(A×B) = 0 and P(A+B) = P(A) + P(B).

Example 20.

An urn contains 2 white, 3 red and 5 blue balls of the same size. What is the probability that a ball drawn at random from the urn will be colored (not white)?

1) Let event A be the drawing of a red ball from the urn, and event B the drawing of a blue ball. Then event (A + B) is the drawing of a colored ball from the urn.

2) P(A) = 3/10, P(B) = 5/10.

3) Events A and B are incompatible, since only one ball is drawn. Then: P(A + B) = P(A) + P(B) = 0.3 + 0.5 = 0.8.

Example 21.

An urn contains 7 white and 3 black balls. What is the probability of: 1) drawing a white ball from the urn (event A); 2) drawing a white ball from the urn after removing one white ball from it (event B); 3) extracting a white ball from the urn after removing one ball from it, which is black (event C)?

1) P(A) = 7/10 = 0.7 (see classical probability).

2) P B (A) = 6/9 = 0.(6).

3) P C (A) = 7/9 = 0.(7).

Example 22.

The mechanism is assembled from three identical parts and is considered inoperable if all three parts are out of order. There are 15 parts left in the assembly shop, of which 5 are non-standard (defective). What is the probability that the mechanism assembled from the remaining parts taken at random will be inoperable?

1) Denote the desired event by A, the choice of the first non-standard part by A 1 , of the second by A 2 , and of the third by A 3 .

2) Event A will occur if both event A 1 and event A 2 and event A 3 occur, i.e.

A = A 1 A 2 A 3 ,

since the logical "and" corresponds to the product (see section "Propositional Algebra. Logical Operations").

3) Events A 1 , A 2 , A 3 are dependent, therefore P(A 1 A 2 A 3) = P(A 1) P(A 2 / A 1) P(A 3 / A 1 A 2).

4) P(A 1) = 5/15, P(A 2 / A 1) = 4/14, P(A 3 / A 1 A 2) = 3/13. Then

P(A 1 A 2 A 3) = 5/15 × 4/14 × 3/13 = 2/91 ≈ 0.022.
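The chain of dependent probabilities can be checked with exact fractions:

```python
from fractions import Fraction

# 5 non-standard parts among 15; three dependent draws without replacement:
# P(A1) * P(A2/A1) * P(A3/A1A2) = 5/15 * 4/14 * 3/13
p_inoperable = Fraction(5, 15) * Fraction(4, 14) * Fraction(3, 13)
print(p_inoperable, float(p_inoperable))   # 2/91, about 0.022
```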

For independent events: P(A×B) = P(A) × P(B).

Based on the above, the criterion for the independence of two events A and B is:

P B (A) = P(A), P A (B) = P(B).

Example 23.

The probability of hitting the target by the first shooter (event A) is 0.9, and the probability of hitting the target by the second shooter (event B) is 0.8. What is the probability that the target will be hit by at least one shooter?

1) Let C be the event of interest to us; the opposite event C̄ is that both shooters missed.

2) Since one shooter does not interfere with the other when shooting, the events Ā and B̄ are independent.

We have: P(C̄) = P(Ā) × P(B̄) = (1 - 0.9) × (1 - 0.8) = 0.1 × 0.2 = 0.02.

3) P(C) = 1 - P(C̄) = 1 - 0.02 = 0.98.
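Passing to the opposite event often shortens such calculations; the arithmetic of Example 23 in code:

```python
p_a, p_b = 0.9, 0.8                       # hit probabilities of the shooters
p_both_miss = (1 - p_a) * (1 - p_b)       # independent misses: 0.1 * 0.2
p_c = 1 - p_both_miss                     # at least one hit
print(round(p_c, 2))                      # 0.98
```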

Total Probability Formula

Let event A be able to occur only as a result of the occurrence of one and only one event H i (i = 1, 2, ..., n) from some complete group of incompatible events H 1 , H 2 , ..., H n . The events of this group are usually called hypotheses.

Total Probability Formula. The probability of an event A is equal to the sum of the pairwise products of the probabilities of all hypotheses that form a complete group and the corresponding conditional probabilities of this event A:

P(A) = P(H 1) × P H 1 (A) + P(H 2) × P H 2 (A) + ... + P(H n) × P H n (A), where P(H 1) + P(H 2) + ... + P(H n) = 1.

Example 24.

There are 3 identical urns. The first urn contains 2 white and 1 black balls, the second urn contains 3 white and 1 black ball, and the third urn contains 2 white and 2 black balls. One ball is drawn from an urn chosen at random. What is the probability that it will be white?

1) All urns are considered identical; therefore, the probability of choosing the i-th urn is

P(H i) = 1/3, where i = 1, 2, 3.

2) Probability of drawing a white ball from the first urn: P H 1 (A) = 2/3.

Probability of drawing a white ball from the second urn: P H 2 (A) = 3/4.

Probability of drawing a white ball from the third urn: P H 3 (A) = 1/2.

3) The desired probability:

P(A) = 1/3 × 2/3 + 1/3 × 3/4 + 1/3 × 1/2 = 23/36 = 0.63(8).

Example 25.

The store receives for sale the products of three factories, whose relative shares are: I - 50%, II - 30%, III - 20%. The defect rates for the factories' products are, respectively: I - 2%, II - 3%, III - 5%. What is the probability that an item of this product, randomly purchased in the store, will be of good quality (event A)?

1) The following three hypotheses are possible here: H 1 , H 2 , H 3 - the purchased item was produced at factory I, II, or III, respectively; the system of these hypotheses is complete.

Probabilities: P(H 1) = 0.5; P(H 2) = 0.3; P(H 3) = 0.2.

2) The corresponding conditional probabilities of event A are: P H 1 (A) = 1 - 0.02 = 0.98; P H 2 (A) = 1 - 0.03 = 0.97; P H 3 (A) = 1 - 0.05 = 0.95.

3) According to the total probability formula, we have: P(A) = 0.5 × 0.98 + 0.3 × 0.97 + 0.2 × 0.95 = 0.971.
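The total probability formula is a weighted average of the conditional probabilities, with the hypothesis probabilities as weights; Example 25 in code:

```python
priors = [0.5, 0.3, 0.2]         # P(H_i): factory shares
good_rates = [0.98, 0.97, 0.95]  # P_{H_i}(A): good-quality rates per factory

# Total probability: sum of P(H_i) * P_{H_i}(A) over the complete group
p_good = sum(h * a for h, a in zip(priors, good_rates))
print(round(p_good, 3))          # 0.971
```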

Posterior probability formula (Bayes formula)

Consider the situation.

There is a complete group of incompatible hypotheses H 1 , H 2 , ..., H n , whose probabilities P(H i) (i = 1, 2, ..., n) are known before the experiment (a priori probabilities). An experiment (test) is carried out, as a result of which the occurrence of event A is registered, and it is known that under our hypotheses this event had certain conditional probabilities P H i (A) (i = 1, 2, ..., n). What will the probabilities of these hypotheses be after the experiment (a posteriori probabilities)?

The answer to a similar question is given by the posterior probability formula (Bayes formula):

P A (H i) = P(H i) × P H i (A) / [P(H 1) × P H 1 (A) + ... + P(H n) × P H n (A)], where i = 1, 2, ..., n.

Example 26.

The probability of hitting an aircraft in a single shot for the 1st missile system (event A) is 0.2, and for the 2nd (event B) - 0.1. Each of the complexes fires one shot, and one hit on the aircraft is registered (event C). What is the probability that the successful shot belongs to the first missile system?

Solution.

1) Before the experiment, four hypotheses are possible:

H 1 = A·B - the aircraft was hit by the 1st complex and hit by the 2nd complex (the product corresponds to the logical "and"),

H 2 = A·B̄ - the aircraft was hit by the 1st complex and not hit by the 2nd complex,

H 3 = Ā·B - the aircraft was not hit by the 1st complex and hit by the 2nd complex,

H 4 = Ā·B̄ - the aircraft was not hit by the 1st complex and not hit by the 2nd complex.

These hypotheses form a complete group of events.

2) Corresponding probabilities (with independent action of complexes):

P(H 1) = 0.2 × 0.1 = 0.02;

P(H 2) = 0.2 × (1 - 0.1) = 0.18;

P(H 3) = (1 - 0.2) × 0.1 = 0.08;

P(H 4) = (1 - 0.2) × (1 - 0.1) = 0.72.

3) Since the hypotheses form a complete group of events, the equality P(H 1) + P(H 2) + P(H 3) + P(H 4) = 1 must hold.

We check: P(H 1) + P(H 2) + P(H 3) + P(H 4) = 0.02 + 0.18 + 0.08 + 0.72 = 1; thus, the group of hypotheses under consideration is correct.

4) The conditional probabilities of the observed event C under these hypotheses are: P H 1 (C) = 0, since by the condition of the problem one hit was registered, while hypothesis H 1 assumes two hits;

P H 2 (C) = 1; P H 3 (C) = 1;

P H 4 (C) = 0, since by the condition of the problem one hit was registered, while hypothesis H 4 assumes no hits. Therefore, hypotheses H 1 and H 4 are discarded.

5) The probabilities of hypotheses H 2 and H 3 are calculated using the Bayes formula:

P C (H 2) = 0.18 / (0.18 + 0.08) ≈ 0.7, P C (H 3) = 0.08 / (0.18 + 0.08) ≈ 0.3.

Thus, with a probability of approximately 70% (0.7), it can be argued that a successful shot belongs to the first missile system.
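The Bayes computation of Example 26 can be written compactly: multiply each prior by its likelihood and normalize by the evidence P(C).

```python
from fractions import Fraction

# Priors P(H_i) and likelihoods P_{H_i}(C) from Example 26
priors = [Fraction(2, 100), Fraction(18, 100),
          Fraction(8, 100), Fraction(72, 100)]
likelihoods = [0, 1, 1, 0]                # P_{H_i}(C)

# Evidence P(C) and posterior probabilities P_C(H_i)
evidence = sum(p * l for p, l in zip(priors, likelihoods))   # 13/50
posteriors = [p * l / evidence for p, l in zip(priors, likelihoods)]
print([float(x) for x in posteriors])     # [0.0, ~0.692, ~0.308, 0.0]
```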

5.4. Random Variables. The Distribution Law of a Discrete Random Variable

Quite often, in practice, one considers tests as a result of which a certain number is obtained at random. For example, when throwing a die, a number of points from 1 to 6 comes up; when taking 6 cards from a deck, one can get from 0 to 4 aces. Over a certain period of time (say, a day or a month), a certain number of crimes are registered in a city, and a certain number of traffic accidents occur. A shot is fired from a gun; the range of the projectile also takes a random value.

In all these tests, we are faced with the so-called random variables.

A numerical quantity that takes one or another value as a result of a trial, in a random way, is called a random variable.

The concept of a random variable plays a very important role in probability theory. If the "classical" theory of probability studied mainly random events, then the modern theory of probability mainly deals with random variables.

Further, we will denote random variables by uppercase Latin letters X, Y, Z, etc., and their possible values by the corresponding lowercase letters x, y, z. For example, if a random variable X has three possible values, we will denote them x 1 , x 2 , x 3 .

So, examples of random variables can be:

1) the number of points rolled on the top face of a die;

2) the number of aces, when taking 6 cards from a deck;

3) the number of registered crimes per day or month;

4) the number of hits on the target with four pistol shots;

5) the distance that the projectile will fly when fired from the gun;

6) the height of a randomly taken person.

It can be seen that in the first example the random variable can take one of six possible values: 1, 2, 3, 4, 5, and 6. In the second and fourth examples, the number of possible values of the random variable is five: 0, 1, 2, 3, 4. In the third example, the value of the random variable can (theoretically) be any natural number or 0. In the fifth and sixth examples, the random variable can take any real value from a certain interval (a, b).

If a random variable can take a finite or countable set of values, it is called discrete (discretely distributed).

A continuous random variable is one that can take all values from some finite or infinite interval.

To specify a random variable, it is not enough to list its possible values. For example, in the second and third examples the random variables could take the same values: 0, 1, 2, 3, and 4. However, the probabilities with which these random variables take their values are completely different. Therefore, to specify a discrete random variable, in addition to the list of all its possible values, one must also indicate their probabilities.

The correspondence between the possible values of a discrete random variable and their probabilities is called the distribution law of the discrete random variable. It is most often written as a distribution series: a table whose first row lists the possible values x 1 , x 2 , ..., x n and whose second row lists their probabilities p 1 , p 2 , ..., p n , with p 1 + p 2 + ... + p n = 1.

The distribution polygon, as well as the distribution series, completely characterizes the random variable. It is one of the forms of the law of distribution.

Example 27. A coin is tossed once. Construct the distribution series and distribution polygon of the number of coats of arms that come up.

The random variable equal to the number of coats of arms that come up can take two values: 0 and 1. The value 1 corresponds to the event of the coat of arms coming up, the value 0 to tails coming up. The probabilities of getting the coat of arms and getting tails are the same and equal to 1/2, i.e. the probabilities with which the random variable takes the values 0 and 1 are equal. The distribution series has the form:

X | 0   | 1
p | 1/2 | 1/2
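A distribution series is naturally represented as a mapping from values to probabilities; a minimal sketch for the coin toss:

```python
from fractions import Fraction

# Distribution series of X = number of coats of arms in one coin toss
series = {0: Fraction(1, 2),    # tails: no coat of arms
          1: Fraction(1, 2)}    # heads: one coat of arms

# Any distribution series must have probabilities summing to 1
assert sum(series.values()) == 1
print(series)
```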