
Notes - Mathematics Olympiads - Probability

Category : 12th Class

 

                                                                               Probability

 

 

  • Conditional Probability: The probability of event A given that event B has already occurred is called the conditional probability of A given B. It is written as \[P(A|B)\] or \[P\left( \frac{A}{B} \right)\]. Mathematically it is given by the formula \[P(A|B)=\frac{P(A\bigcap B)}{P(B)}\], provided \[P(B)\ne 0\].

Event A is independent of B if the conditional probability of A given B is the same as the unconditional probability of A, i.e., they are independent if \[P(A|B)=P(A)\]

The 'gender gap' in politics is a well-known real-life example of conditional probability and independence.

Suppose candidate A receives 55% of the entire vote but only 45% of the female vote. Let \[P(R)=\] probability that a random person voted for A, and \[P\left( \frac{R}{W} \right)=\] probability that a random woman voted for A.

\[\therefore P(R)=0.55\] and \[P\left( \frac{R}{W} \right)=0.45\]

Then \[P\left( \frac{R}{W} \right)\] is the conditional probability of R given W.

  • If \[P(R)\ne P\left( \frac{R}{W} \right),\] there is a gender gap in politics. On the other hand, if \[P(R)=P\left( \frac{R}{W} \right),\] there is no gender gap, i.e. the probability that a person voted for A is independent of gender.
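The defining formula \[P(A|B)=P(A\cap B)/P(B)\] can be checked numerically. A minimal sketch; the 22.5% joint figure below is a hypothetical number chosen so the example reproduces the 45% female vote, not a value from the notes:

```python
# P(A|B) = P(A and B) / P(B), computed from assumed probabilities.

def conditional(p_a_and_b, p_b):
    """Conditional probability P(A|B) from the joint and marginal."""
    if p_b == 0:
        raise ValueError("P(B) must be non-zero")
    return p_a_and_b / p_b

# Hypothetical figures: 50% of voters are women, and 22.5% of all
# voters are women who voted for A, so P(voted A | woman) = 0.45.
p_woman = 0.50
p_voted_a_and_woman = 0.225
p_a_given_woman = conditional(p_voted_a_and_woman, p_woman)  # ≈ 0.45
```

Comparing this conditional value with the overall P(voted A) is exactly the gender-gap test described above.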

 

  • Independent Events: Let A and B be two events. If \[P(A\bigcap B)=P(A).\,P(B)\], then A and B are said to be independent events.

 

  • Mutually Exclusive Events: Two events are said to be mutually exclusive if the corresponding sets are disjoint, i.e. \[A\bigcap B=\phi \], so that \[P(A\bigcap B)=0\]

In such cases, \[P(A\bigcup B)=P(A)+P(B)\]

 

  • Disjoint events/sets: Two sets or events are said to be disjoint if they have no element in common.

 

  • Properties of Conditional probability
  1. If F is an event of a sample space S with \[P(F)\ne 0,\] then

\[P(S|F)=P(F|F)=1\]

  2. If A and B are any two events of a sample space S and F is an event of S such that \[P(F)\ne 0,\] then

            \[P[(A\bigcup B)|F]=P(A|F)+P(B|F)-P[(A\bigcap B)|F]\] If A and B are disjoint events, then

\[P[(A\bigcup B)|F]=P(A|F)+P(B|F)\]

  3. \[P(A'|B)=1-P(A|B)\]

 

  • Law of Total Probability: Suppose S is a sample space partitioned into subsets \[{{A}_{1}},{{A}_{2}},{{A}_{3}},...{{A}_{k}}.\] Then any event E is the union of the disjoint pieces \[E\cap {{A}_{1}},E\cap {{A}_{2}},...,E\cap {{A}_{k}},\] so

\[P(E)=P(E\cap {{A}_{1}})+P(E\cap {{A}_{2}})+...+P(E\cap {{A}_{k}})\]

Using the multiplication theorem of probability, we have

\[P(E\cap {{A}_{k}})=P({{A}_{k}}\cap E)=P({{A}_{k}}).P\left( \frac{E}{{{A}_{k}}} \right)\]

 

  • Theorem of Total Probability: Let \[\{{{A}_{1}},{{A}_{2}},...{{A}_{n}}\}\] be a partition of sample space S and each of the event has non-zero probability. Let E be any event associated with S, then

\[P(E)=P({{A}_{1}}).P\left( \frac{E}{{{A}_{1}}} \right)+P({{A}_{2}}).P\left( \frac{E}{{{A}_{2}}} \right)+...+P({{A}_{n}}).P\left( \frac{E}{{{A}_{n}}} \right)\]

\[=\sum\limits_{i=1}^{n}{P({{A}_{i}})P\left( \frac{E}{{{A}_{i}}} \right)}\]

This is known as theorem of total probability.
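The theorem can be sketched in code. The two-urn numbers below are illustrative assumptions, not from the notes:

```python
# Theorem of total probability: P(E) = sum_i P(A_i) * P(E|A_i),
# where A_1..A_n partition the sample space S.

def total_probability(p_parts, p_e_given_parts):
    # The partition probabilities must sum to 1.
    assert abs(sum(p_parts) - 1.0) < 1e-9, "A_1..A_n must partition S"
    return sum(pa * pe for pa, pe in zip(p_parts, p_e_given_parts))

# Illustrative: an urn is picked with probability 0.5 each;
# P(red | urn 1) = 0.3 and P(red | urn 2) = 0.7.
p_red = total_probability([0.5, 0.5], [0.3, 0.7])  # ≈ 0.5
```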

 

  • Bayes' Theorem: Suppose the events \[{{A}_{1}},{{A}_{2}},...{{A}_{n}}\] form the partitions of the sample space S, i.e., \[{{A}_{1}},{{A}_{2}},...{{A}_{n}}\] are pairwise disjoint and \[{{A}_{1}}\cup {{A}_{2}}\cup ...\cup {{A}_{n}}=S\] and E is any event. Then

\[P\left( \frac{{{A}_{i}}}{E} \right)=\frac{P({{A}_{i}}).P\left( \frac{E}{{{A}_{i}}} \right)}{P({{A}_{1}}).P\left( \frac{E}{{{A}_{1}}} \right)+P({{A}_{2}}).P\left( \frac{E}{{{A}_{2}}} \right)+...+P({{A}_{n}}).P\left( \frac{E}{{{A}_{n}}} \right)}\]
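A short sketch of the formula, reusing the illustrative two-urn setup (0.5/0.5 prior, red-ball likelihoods 0.3 and 0.7, which are assumed numbers):

```python
# Bayes' theorem: P(A_i | E) = P(A_i) P(E|A_i) / sum_j P(A_j) P(E|A_j)

def bayes(i, p_parts, p_e_given_parts):
    """Posterior probability of partition member A_i given E."""
    numer = p_parts[i] * p_e_given_parts[i]
    denom = sum(pa * pe for pa, pe in zip(p_parts, p_e_given_parts))
    return numer / denom

# Probability the red ball came from urn 1 vs urn 2.
post_urn1 = bayes(0, [0.5, 0.5], [0.3, 0.7])  # ≈ 0.3
post_urn2 = bayes(1, [0.5, 0.5], [0.3, 0.7])  # ≈ 0.7
```

Note that the denominator is exactly the total-probability sum from the previous result, so the posteriors over all partition members sum to 1.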

 

  • Random Variable: In probability, random variables play an important role. A random variable is a special kind of function: a real-valued function \[X:\]S\[\to \]R whose domain is the sample space of the random experiment.

 

It is usually denoted by capital letters X, Y, Z, etc.

 

  • Discrete Random Variable: A random variable which can take only a finite or countably infinite number of values is said to be a discrete random variable.

 

  • Continuous Random Variable: A random variable which can take any value between given limits is called a continuous random variable.

 

  • Sum and Product of Random Variables: Let X and Y be random variables on the same sample space S. Then X + Y, X + K, KX and XY, where K is any real number, are functions on S defined as

\[(X+Y)(S)=X(S)+Y(S)\].

\[(K\,X)(S)=K\,X(S).\]

\[(X+K)(S)=X(S)+K.\]

\[(X\,Y)(S)=X(S).Y(S).\]

 

  • Probability Distribution of a Random Variable: Let X be a finite random variable on a sample space S, i.e. X takes only a finite number of values,

\[{{R}_{x}}=\{{{x}_{1}},{{x}_{2}},{{x}_{3}},...{{x}_{n}}\},\] provided \[{{x}_{1}}<{{x}_{2}}<{{x}_{3}}<...<{{x}_{n}}.\] Then X induces a function f which assigns probabilities to the points in \[{{R}_{x}}:\]

\[\therefore \]      \[f({{x}_{k}})=P(X={{x}_{k}})=P\{s\in S:X(s)={{x}_{k}}\}\]

As a set of ordered pairs, it can be written as \[({{x}_{k}},f({{x}_{k}})).\]

\[x\]        \[{{x}_{1}}\]        \[{{x}_{2}}\]        \[{{x}_{3}}\]        \[{{x}_{4}}\]   ...

\[f(x)\]     \[f({{x}_{1}})\]     \[f({{x}_{2}})\]     \[f({{x}_{3}})\]     \[f({{x}_{4}})\]   ...

 

Here, this function f is said to be the probability distribution of the random variable X.

\[\therefore \]      \[f({{x}_{i}})\ge 0\] and \[\sum f({{x}_{i}})=1\]

 

Solved Example

  1. If two coins are tossed, then find the probability of getting 0, 1 and 2 heads.

Sol. Let X be the number of heads obtained.

Now,

\[P(X=0)=P(T,T)=\frac{1}{2}.\frac{1}{2}=\frac{1}{4}\]

\[P(X=1)=P(H,T)+P(T\,H)=\left[ \frac{1}{2}\times \frac{1}{2} \right]+\left[ \frac{1}{2}\times \frac{1}{2} \right]=\frac{1}{4}+\frac{1}{4}=\frac{1}{2}\]

\[P(X=2)=P(H\,H)=\frac{1}{2}.\frac{1}{2}=\frac{1}{4}\]

In tabular form it can be shown as follows:

 

\[x:\]        0        1        2

\[p(x):\]     \[\frac{1}{4}\]        \[\frac{1}{2}\]        \[\frac{1}{4}\]
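A quick sanity check on such a distribution is to enumerate the equally likely outcomes directly; a sketch:

```python
from itertools import product
from fractions import Fraction

# Enumerate all 4 equally likely outcomes of two fair coins and
# tabulate the distribution of X = number of heads.
outcomes = list(product("HT", repeat=2))
dist = {}
for o in outcomes:
    x = o.count("H")                       # number of heads in this outcome
    dist[x] = dist.get(x, Fraction(0)) + Fraction(1, len(outcomes))

# dist maps X = 0, 1, 2 to probabilities 1/4, 1/2, 1/4.
```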

 

  • Expectation of a Finite Random Variable: Let X be a finite random variable and suppose its distribution is:

 

\[x\]        \[{{x}_{1}}\]        \[{{x}_{2}}\]        \[{{x}_{3}}\]   ...   \[{{x}_{n}}\]

\[f(x)\]     \[f({{x}_{1}})\]     \[f({{x}_{2}})\]     \[f({{x}_{3}})\]   ...   \[f({{x}_{n}})\]

 

The mean or expected value (expectation) of X is denoted by \[E(X)\] and is defined by

\[E=E(X)={{x}_{1}}.f({{x}_{1}})+{{x}_{2}}.f({{x}_{2}})+...+{{x}_{n}}.f({{x}_{n}})=\sum {{x}_{i}}f({{x}_{i}})\]

\[E(X)=\sum {{x}_{i}}{{P}_{i}}={{x}_{1}}{{P}_{1}}+{{x}_{2}}{{P}_{2}}+...+{{x}_{n}}{{P}_{n}}\,\,[\because f({{x}_{i}})={{P}_{i}}]\]

 

Solved Example

  1. If X and Y are random variables with the distributions

           

\[{{X}_{i}}\]     2     3     6     10

\[{{P}_{i}}\]     0.2     0.2     0.5     0.1

            and

\[{{Y}_{i}}\]     0.8     0.2     0     3     7

\[{{P}_{i}}\]     0.2     0.3     0.1     0.3     0.1

 

 

 

Then find the expected value of X and Y.

Sol.      \[E(X)=\sum {{X}_{i}}{{P}_{i}}\]

            \[=2\times (0.2)+3\times 0.2+6\times 0.5+10\times 0.1\]

            \[=0.4+0.6+3.0+1.0=5\]

            \[E(Y)=\sum {{Y}_{i}}{{P}_{i}}\]

            \[=(0.8)\times (0.2)+(0.2)(0.3)+0\times 0.1+3\times 0.3+7\times 0.1\]

            \[=0.16+0.06+0+0.9+0.7=1.82\]
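Both expectations can be re-checked with a short routine; summing the Y terms gives 0.16 + 0.06 + 0 + 0.9 + 0.7 = 1.82:

```python
def expectation(values, probs):
    """E(X) = sum of x_i * p_i over the distribution."""
    return sum(x * p for x, p in zip(values, probs))

e_x = expectation([2, 3, 6, 10], [0.2, 0.2, 0.5, 0.1])              # ≈ 5
e_y = expectation([0.8, 0.2, 0, 3, 7], [0.2, 0.3, 0.1, 0.3, 0.1])   # ≈ 1.82
```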

 

  • Variance: For a random variable A whose possible values \[{{a}_{1}},{{a}_{2}},...,{{a}_{n}}\] occur with probabilities

\[p({{a}_{1}}),\,p({{a}_{2}}),...,p({{a}_{n}})\] respectively,

Let \[\mu =E(A)\] be the mean of A. Then the variance of A is given by

Var(A) or \[\sigma _{a}^{2}=E{{(A-\mu )}^{2}}\]

 

  • Relation between Variance and Expectation

Var\[(A)=E({{A}^{2}})-{{[E(A)]}^{2}}\],

where\[E({{A}^{2}})=\underset{i=1}{\overset{n}{\mathop{\sum }}}\,a_{i}^{2}p({{a}_{i}})\]
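Both forms of the variance can be computed and compared; as test data the sketch below reuses the distribution with values 2, 3, 6, 10 and probabilities 0.2, 0.2, 0.5, 0.1:

```python
def variance_def(values, probs):
    """Var(A) = E[(A - mu)^2], directly from the definition."""
    mu = sum(a * p for a, p in zip(values, probs))
    return sum((a - mu) ** 2 * p for a, p in zip(values, probs))

def variance_alt(values, probs):
    """Var(A) = E(A^2) - [E(A)]^2, the shortcut form."""
    mu = sum(a * p for a, p in zip(values, probs))
    e_a2 = sum(a * a * p for a, p in zip(values, probs))
    return e_a2 - mu ** 2

vals, ps = [2, 3, 6, 10], [0.2, 0.2, 0.5, 0.1]
# Here mu = 5 and E(A^2) = 30.6, so both forms give Var(A) = 5.6.
```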

 

  • Binomial Distribution: Consider a random experiment and an event E associated with it. Let p = probability of occurrence of event E in one trial and \[q=1-p=\] probability of non-occurrence of event E in one trial.

If X denotes the number of successes in n trials of the random experiment, then \[P(X=r)=\] probability of r successes \[{{=}^{n}}{{C}_{r}}{{p}^{r}}.{{q}^{n-r}}\]

Note:

  1. Probability of at most r successes in n trials \[=\sum\limits_{k=0}^{r}{^{n}{{C}_{k}}.{{p}^{k}}.{{q}^{n-k}}}\]
  2. Probability of at least r successes in n trials \[=\sum\limits_{k=r}^{n}{^{n}{{C}_{k}}.{{p}^{k}}.{{q}^{n-k}}}\]
  3. Probability of having the first success at the rth trial \[=p.{{q}^{r-1}}.\]

 

  • Important results for binomial distribution

For a binomial distribution B(n,p)

(i) Mean or expected no. of successes, \[\mu =np\]

(ii) Variance, \[{{\sigma }^{2}}=npq\]

(iii) Standard deviation, \[\sigma =\sqrt{npq}\]
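These formulas are easy to verify with Python's math.comb; a sketch using n = 6, p = 1/2 (the setup of the next solved example):

```python
from math import comb, sqrt

def binom_pmf(n, p, r):
    """P(X = r) = nCr * p^r * q^(n-r) for X ~ B(n, p)."""
    q = 1 - p
    return comb(n, r) * p**r * q**(n - r)

n, p = 6, 0.5
mean = n * p                 # np = 3
var = n * p * (1 - p)        # npq = 1.5
sd = sqrt(var)
# The pmf sums to 1 over r = 0..n.
total = sum(binom_pmf(n, p, r) for r in range(n + 1))
```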

 

Solved Example

  1. A die is thrown 6 times. If getting an odd number is a success, then what is the probability of (i) 5 successes (ii) at least 5 successes (iii) at most 5 successes?

Sol. In throwing a die, the odd numbers are 1, 3 and 5.

\[\therefore \] No. of favourable cases = 3

No. of exhaustive cases = 6

p = probability of getting an odd number \[=\frac{3}{6}=\frac{1}{2}\]

 

\[\Rightarrow \] q = probability of not getting an odd number \[=1-\frac{1}{2}=\frac{1}{2}\]

\[P(X=r)={}^{n}{{C}_{r}}.{{p}^{r}}.{{q}^{n-r}}={}^{6}{{C}_{r}}.{{p}^{r}}.{{q}^{6-r}}\]

\[={}^{6}{{C}_{r}}.{{\left( \frac{1}{2} \right)}^{r}}.{{\left( \frac{1}{2} \right)}^{6-r}}={}^{6}{{C}_{r}}.{{\left( \frac{1}{2} \right)}^{6}}=\frac{1}{64}.{}^{6}{{C}_{r}}\]

(i)  Probability of 5 successes\[=P(x=5)=\frac{1}{64}{}^{6}{{C}_{5}}=\frac{1}{64}\times 6=\frac{3}{32}\]

(ii) Probability of at least 5 successes

                                                            \[=P(X=5)+P(X=6)\]

\[=\frac{1}{64}{}^{6}{{C}_{5}}+\frac{1}{64}\times {}^{6}{{C}_{6}}=\frac{3}{32}+\frac{1}{64}=\frac{6+1}{64}=\frac{7}{64}\]

(iii) Probability of at most 5 successes

\[=P(X=0)+P(X=1)+P(X=2)+P(X=3)+P(X=4)+P(X=5)\]

\[=1-P(X=6)=1-\frac{1}{64}=\frac{63}{64}\]

 

  2. A die is rolled 20 times. If getting a number greater than 4 is considered a success, then find the mean, variance and standard deviation of the number of successes.

Sol. In rolling a die, the numbers greater than 4 are 5 and 6.

So, probability of success, \[p=\frac{2}{6}=\frac{1}{3}\]

and probability of failure, \[q=1-\frac{1}{3}=\frac{2}{3}\]

\[\therefore \] Mean\[=np=20\times \frac{1}{3}=\frac{20}{3}\]

Variance, \[{{\sigma }^{2}}=npq=20\times \frac{1}{3}\times \frac{2}{3}=\frac{40}{9}\]

Standard deviation \[=\sqrt{npq}=\sqrt{20\times \frac{1}{3}\times \frac{2}{3}}=\sqrt{\frac{40}{9}}=\frac{2}{3}\sqrt{10}\]

 

  3. A fair coin is tossed 100 times. Find the probability p that heads occurs fewer than 45 times.

Sol. This is a binomial experiment \[B(n,p)\] with \[n=100\]

\[\therefore p=0.5\] and \[q=1-0.5=0.5\]

Mean or expected no. of successes \[=np=100\times 0.5=50\]

Variance, \[{{\sigma }^{2}}=npq=100\times \frac{1}{2}\times \frac{1}{2}=25\]

\[\therefore \sigma =\sqrt{25}=5\]

\[\therefore \] We seek \[{{B}_{p}}(k<45)={{B}_{p}}(k\le 44),\] or approximately the normal probability \[{{N}_{p}}(x\le 44.5).\] Transforming \[a=44.5\] into standard units,

            \[\therefore {{z}_{1}}=\frac{a-\mu }{\sigma }=\frac{44.5-50.0}{5}=\frac{-5.5}{5}=-1.1\]

            \[\therefore \] Here \[{{z}_{1}}<0\]

            \[\therefore p={{B}_{p}}(k\le 44)\approx {{N}_{p}}(x\le 44.5)={{N}_{p}}(z\le -1.1)=0.5-\phi (1.1),\] where \[\phi (1.1)\] is the area under the standard normal curve from 0 to 1.1

            \[=0.5-0.3643=0.1357\]
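The normal approximation above can be checked against the exact binomial sum; a sketch using math.erf for the standard normal CDF:

```python
from math import comb, erf, sqrt

n, p = 100, 0.5
# Exact tail: P(X <= 44) for X ~ B(100, 0.5).
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(45))

# Normal approximation with continuity correction at 44.5.
mu, sigma = n * p, sqrt(n * p * (1 - p))
z = (44.5 - mu) / sigma                    # -1.1
approx = 0.5 * (1 + erf(z / sqrt(2)))      # standard normal CDF at z
```

The exact sum and the continuity-corrected approximation agree to about two decimal places, which is why the table lookup suffices at this level.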

 

  4. A fair coin is tossed 6 times. If getting a head is considered a success, then find the probability that

(a) exactly 2 heads occur

(b) at least 4 heads occur.

(c) at least one head occurs.

Sol. Here no. of trials, \[n=6\]

Probability of success, \[p=\frac{1}{2}\]

\[\therefore \] probability of failure, \[q=1-\frac{1}{2}=\frac{1}{2}\]

(a) Probability of getting exactly 2 heads

\[=P(2)={}^{6}{{C}_{2}}{{\left( \frac{1}{2} \right)}^{2}}.{{\left( \frac{1}{2} \right)}^{6-2}}={}^{6}{{C}_{2}}{{\left( \frac{1}{2} \right)}^{6}}\]

\[=15\times \frac{1}{64}=\frac{15}{64}\]

 

(b) Probability of getting at least 4 heads \[=P(4)+P(5)+P(6)\]

\[={}^{6}{{C}_{4}}{{\left( \frac{1}{2} \right)}^{4}}.{{\left( \frac{1}{2} \right)}^{2}}+{}^{6}{{C}_{5}}{{\left( \frac{1}{2} \right)}^{5}}.\left( \frac{1}{2} \right)+{}^{6}{{C}_{6}}{{\left( \frac{1}{2} \right)}^{6}}=15.{{\left( \frac{1}{2} \right)}^{6}}+6.{{\left( \frac{1}{2} \right)}^{6}}+1.{{\left( \frac{1}{2} \right)}^{6}}\]

\[={{\left( \frac{1}{2} \right)}^{6}}(15+6+1)=\frac{22}{64}=\frac{11}{32}\]

 

(c) The probability of getting no head (i.e. all failures) \[={{q}^{6}}={{\left( \frac{1}{2} \right)}^{6}}=\frac{1}{64}\]

\[\therefore \] Probability of 1 or more head

\[=1-{{q}^{6}}=1-\frac{1}{64}=\frac{63}{64}\]

 

  5. Two men A and B fire at a target. Suppose \[P(A)=\frac{1}{3}\] and \[P(B)=\frac{1}{5}\] denote their probabilities of hitting the target. (Consider A and B to be independent events.) Find the probability that

(a) A does not hit the target                      (b) Both hit the target

(c) At least one of them hits the target                    (d) Neither hits the target

 

Sol. (a) P (A does not hit the target) \[=P(\bar{A})=P({{A}^{c}})=1-P(A)=1-\frac{1}{3}=\frac{2}{3}\]

(b) Since A and B are independent events,

\[\therefore \] P(A and B)\[=P(A\cap B)=P(A).P(B)=\frac{1}{3}.\frac{1}{5}=\frac{1}{15}\]

(c) Probability that at least one of them hits the target

= P(A or B) \[=P(A\cup B)\]

\[=P(A)+P(B)-P(A\cap B)\](By addition rule)

\[=\frac{1}{3}+\frac{1}{5}-\frac{1}{15}=\frac{5+3-1}{15}=\frac{7}{15}\]

(d) Probability that neither hits the target

\[\therefore \] P(neither A nor B) \[=P\{{{(A\cup B)}^{c}}\}=1-P(A\cup B)=1-\frac{7}{15}=\frac{8}{15}\]

 

  6. Three fair coins – a penny, a nickel and a dime – are tossed. Find the probability p that they are all heads if (a) the penny is heads (b) at least one of the coins is heads (c) the dime is tails.

Sol. Sample Space, S = {HHH, HHT, HTT, HTH, THT, TTH, THH, TTT}

(a) E = penny is heads = {HHH, HHT, HTH, HTT}

Only HHH among these four equally likely outcomes is all heads, so \[P(\text{all heads}\,|\,E)=\frac{1}{4}\]

(b) P(all heads | at least one coin is heads): 7 of the 8 outcomes contain at least one head, and only HHH is all heads, so the probability

\[=\frac{1}{7}\]

(c) If the dime is tails, the coins cannot all be heads, so

\[P=0\]

 

  7. A fair coin is tossed until a head or five tails occurs. Find the expected number E of tosses of the coin.

Sol. The sample space S consists of six points.

H, TH, TTH, TTTH, TTTTH, TTTTT

Their respective probabilities (independent trials) are

\[\frac{1}{2},{{\left( \frac{1}{2} \right)}^{2}},{{\left( \frac{1}{2} \right)}^{3}},{{\left( \frac{1}{2} \right)}^{4}},{{\left( \frac{1}{2} \right)}^{5}}\]

and

                                                 \[{{\left( \frac{1}{2} \right)}^{5}}\]

                                                       \[\therefore X(H)=1\] [where X(S) = no. of tosses in each outcome]

\[X(TH)=2\]

\[X(TTH)=3\]

\[X(TTTH)=4\]

\[X(TTTTH)=5\]

\[X(TTTTT)=5\]

\[\therefore P(1)=P(H)=\frac{1}{2}\]                    \[P(2)=P(\text{TH})={{\left( \frac{1}{2} \right)}^{2}}\]

\[P(3)=P(\text{TTH})={{\left( \frac{1}{2} \right)}^{3}}\]                        \[P(4)=P(\text{TTTH})={{\left( \frac{1}{2} \right)}^{4}}\]

\[P(5)=P(\text{TTTTH})={{\left( \frac{1}{2} \right)}^{5}}\]        \[P(5)=P(\text{TTTTT})={{\left( \frac{1}{2} \right)}^{5}}\]

\[E(X)=1.\left( \frac{1}{2} \right)+2.{{\left( \frac{1}{2} \right)}^{2}}+3{{\left( \frac{1}{2} \right)}^{3}}+4{{\left( \frac{1}{2} \right)}^{4}}+5{{\left( \frac{1}{2} \right)}^{5}}+5{{\left( \frac{1}{2} \right)}^{5}}\]

\[=\left( \frac{1}{2} \right)+2\left( \frac{1}{4} \right)+3\left( \frac{1}{8} \right)+4\left( \frac{1}{16} \right)+5\left( \frac{1}{32} \right)+5\left( \frac{1}{32} \right)\]

\[=\frac{31}{16}\approx 1.9\]
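The computation can be verified exactly with fractions:

```python
from fractions import Fraction

half = Fraction(1, 2)
# P(X = k) for the number of tosses: first head at toss k (k = 1..4),
# plus both length-5 outcomes TTTTH and TTTTT, which each stop at 5 tosses.
dist = {1: half, 2: half**2, 3: half**3, 4: half**4, 5: half**5 + half**5}

expected = sum(k * p for k, p in dist.items())   # 31/16
```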

 

 

 

