Introduction

Daniel Bernoulli was a renowned Swiss mathematician and physicist in the Bernoulli family, famous for his pioneering work in fluid mechanics, probability, and statistics.
- Wikipedia

Random Experiment (RE)

A Random Experiment is any experiment whose outcome cannot be predicted with certainty.

Examples of Valid RE:
  • Pick a Red ball from a bag of Red and Black balls.
  • Pick a Black ball from a bag of Red and Black balls.
Examples of Invalid RE:
  • Pick a White ball from a bag of Red and Black balls. (The outcome is impossible.)
  • Pick a Black ball from a bag of Black balls. (The outcome is certain.)

Understanding Random Experiments

Let us define two Random Experiments, tossing a coin 10 times and rolling a die 5 times, as below.

RE1: Coin Toss 10 times

We toss a coin 10 times. Each toss is either Heads or Tails. We win on Heads and lose on Tails.

RE2: Roll Dice 5 times

We roll a die 5 times. Each roll is either Odd or Even. We win on Even and lose on Odd.

Outcome

An Outcome is a possible result of an experiment. The possible outcomes are distinct and mutually exclusive.
For RE1 and RE2, there are two outcomes - favourable and unfavourable.

Trials

If the favourable outcome is represented by one (1) and the unfavourable outcome by zero (0), we can represent these repeated events as a sequence of 1's and 0's (e.g., 1010).
Repeated events represented as a sequence of 1's and 0's are called Trials.
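As a small illustration, here is a minimal Python sketch (the function name simulate_coin_trial is our own) that simulates RE1 and records the trial as a sequence of 1's and 0's:

```python
import random

def simulate_coin_trial(n_tosses=10, p_heads=0.5):
    """Simulate RE1: record each toss as 1 (Heads, favourable) or 0 (Tails, unfavourable)."""
    return [1 if random.random() < p_heads else 0 for _ in range(n_tosses)]

trial = simulate_coin_trial()
print("Trial sequence:", trial)        # e.g. [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
print("Number of Heads:", sum(trial))
```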

Random Variable

A Random Variable is a set of possible values from a random experiment. (More precisely, a random variable assigns a number to each outcome of the experiment; the set lists the values it can take.)

Example:
A coin toss results in Heads or Tails. Let's give Heads = 1 and Tails = 0, and we can define our random variable "X" as:

X = {0, 1}

Note:
We could choose Heads=20 and Tails=30 or other values if we want! It is our choice.
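If it helps, this choice of values is just a lookup from outcomes to numbers; a tiny Python sketch (the names X and Y are our own) makes the point:

```python
# A random variable assigns a number to each outcome; the numbers are our choice.
X = {"Heads": 1, "Tails": 0}      # the encoding used in this article
Y = {"Heads": 20, "Tails": 30}    # an equally valid alternative encoding

print(X["Heads"], X["Tails"])     # 1 0
print(Y["Heads"], Y["Tails"])     # 20 30
```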

Sample Space

The set of all possible outcomes is called the Sample Space.
The Sample Space for rolling a die is {1, 2, 3, 4, 5, 6}.

Bernoulli Trials

A trial is called a Bernoulli Trial only if it satisfies the following conditions.

Conditions for a trial to be a Bernoulli Trial

  1. Finite
    A trial has a finite number of events. In other words, the event repeats a finite number of times.
    In the above RE1 and RE2, the coin is tossed 10 times and the die is rolled 5 times, respectively.
  2. Independent
    Each trial is independent of the previous trials.
    In the above RE1 and RE2, each coin toss or die roll is independent of the outcomes of previous trials.
  3. Two Outcomes
    Every Bernoulli Trial has exactly two outcomes.
    In the above RE1 and RE2, we defined success as Heads for the coin toss and as an even number for the die roll.
  4. Same Probability
    The probability of success remains the same in all trials.
    In the above RE1 and RE2, the probability of success for each coin toss and die roll remains the same, 0.5.

Hence, we can say the trials of our Random Experiments RE1 and RE2 are Bernoulli Trials.

More Scenarios
Six balls are drawn from an urn containing 7 red and 9 black balls, in two ways: with replacement and without replacement. Are these draws Bernoulli Trials or not?

Bernoulli Test      Replaced   Not Replaced
Finite              Yes        Yes
Independent         Yes        No
Two Outcomes        Yes        Yes
Same Probability    Yes        No

Bernoulli Trial Test for taking out a ball from the urn

With replacement, all four conditions hold, so the draws are Bernoulli Trials; without replacement, the draws are neither independent nor equally likely, so they are not.
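To see why replacement matters, here is a minimal Python sketch (the variable names are our own, and success is taken to be drawing a red ball) that compares the probability of red on successive draws with and without replacement:

```python
from fractions import Fraction

RED, BLACK = 7, 9  # urn contents from the scenario above

# With replacement: the urn is restored after every draw,
# so P(red) is the same on each of the 6 draws.
p_with_replacement = Fraction(RED, RED + BLACK)
print("With replacement, P(red) on every draw:", p_with_replacement)    # 7/16

# Without replacement: suppose the first ball drawn was red and kept out.
# The urn now holds 6 red and 9 black, so the probability has changed.
p_first = Fraction(RED, RED + BLACK)                                     # 7/16
p_second_given_red = Fraction(RED - 1, RED + BLACK - 1)                  # 6/15
print("Without replacement, P(red) on draw 1:", p_first)
print("Without replacement, P(red) on draw 2 given red on draw 1:", p_second_given_red)
```

The second draw's probability depends on what was drawn before, which is exactly why the Independent and Same Probability conditions fail without replacement.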

Example

Let's consider a scenario where a coin is tossed 10 times. Find the probability of getting exactly 6 heads, at least 6 heads, and at most 6 heads.

We have three scenarios: exactly 6 heads, at least 6 heads, and at most 6 heads.
Now, is this a Bernoulli Trial?

Bernoulli Test      Coin Toss   Comments
Finite              Yes         10 events
Independent         Yes         not dependent on previous outcomes
Two Outcomes        Yes         Success as Head and Failure as Tail
Same Probability    Yes         Success as Head (p) = \({\frac{1}{2}}\)

Bernoulli Trial Test for the coin toss

In this scenario, we define Success and the Random Variable X as

\begin{align} Success & = Getting \space Heads \newline Random \space Variable \space X & = Number \space of \space Successes \space (or \space getting \space Heads) \end{align}

We can define the probability of getting exactly k heads (k successes) as

\begin{align} & P(X=k) = {n\choose k} (p)^k (1-p)^{n-k} \newline &\space for \space k \space successes, \space where \newline & X = Number\space of\space Successes\newline &n = number \space of \space events \space in \space the \space trial \space =\space 10\newline &p = probability \space of \space success \space =\space \frac{1}{2}\newline \end{align}

Here the term \({(p)^k (1-p)^{n-k}}\) is the probability of one specific sequence with k successes out of n events, whereas the term \({n\choose k}\) gives the number of such sequences.
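As a quick sanity check, the formula can be evaluated directly in Python; the function name binomial_pmf below is our own choice, and math.comb supplies the \({n\choose k}\) term:

```python
from math import comb

def binomial_pmf(k, n=10, p=0.5):
    """Probability of exactly k successes in n Bernoulli trials with success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binomial_pmf(6))   # P(X = 6) for 10 fair coin tosses ≈ 0.2051
```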

Exactly 6 heads:

\begin{align} & P(X=6) = {10\choose 6} \left(\frac{1}{2}\right)^6 \left(1-\frac{1}{2}\right)^{10-6} \newline & P(X=6) = {10\choose 6} \left(\frac{1}{2}\right)^6 \left(\frac{1}{2}\right)^4 \newline \end{align}
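Evaluating, \({10\choose 6} = 210\) and \(\left(\frac{1}{2}\right)^{10} = \frac{1}{1024}\), so

\begin{align} & P(X=6) = \frac{210}{1024} \approx 0.2051 \newline \end{align}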

At least 6 heads:

\begin{align} & P(X\ge6) = P(X=6) + P(X=7) + P(X=8) + P(X=9) + P(X=10) \end{align}
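Since every term shares the common factor \(\left(\frac{1}{2}\right)^{10}\), this sum reduces to

\begin{align} & P(X\ge6) = \frac{210 + 120 + 45 + 10 + 1}{1024} = \frac{386}{1024} \approx 0.377 \newline \end{align}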

At most 6 heads:

\begin{align} & P(X\le6) = P(X=0) + P(X=1) + P(X=2) + P(X=3) + P(X=4) + P(X=5) + P(X=6) \newline & P(X\le6) = 1 - \big(P(X=7) + P(X=8) + P(X=9) + P(X=10)\big) \newline \end{align}
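Using the complement form with the same common factor,

\begin{align} & P(X\le6) = 1 - \frac{120 + 45 + 10 + 1}{1024} = 1 - \frac{176}{1024} = \frac{848}{1024} \approx 0.828 \newline \end{align}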

Binomial Distribution

Let us see the probability distribution of the above Bernoulli Trial of tossing a coin 10 times, where the Random Variable X is defined as the number of successes (i.e., the number of heads).

X = 0:   P(X=0) = \({n\choose 0} (p)^0 (1-p)^{n-0}\)
X = 1:   P(X=1) = \({n\choose 1} (p)^1 (1-p)^{n-1}\)
..
X = k:   P(X=k) = \({n\choose k} (p)^k (1-p)^{n-k}\)
..
X = n:   P(X=n) = \({n\choose n} (p)^n (1-p)^{n-n}\)

Probability Distribution of Coin Toss Bernoulli Trial

If we look carefully, we find that these probabilities are exactly the terms of a Binomial expansion, which is why this distribution is called the Binomial Distribution.

\begin{align} Binomial \space Expansion & = ((1-p) + p)^n \newline & = {n\choose 0} (p)^0 (1-p)^{n-0} + {n\choose 1} (p)^1 (1-p)^{n-1} + .. + {n\choose n} (p)^n (1-p)^{n-n} \newline \end{align}

A Binomial Distribution with n Bernoulli Trials and probability of success p is denoted as
\begin{align}
& B_{X}(n,p), \space where \newline
&n = number \space of \space events \space in \space the \space trial \newline
&p = probability \space of \space success \newline
&X = number \space of \space successes \newline
\end{align}
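As a closing sketch, the whole distribution \(B_{X}(10, \frac{1}{2})\) from the coin-toss example can be tabulated in Python and checked to sum to 1, just as the Binomial expansion above requires (the names n, p, and distribution are our own):

```python
from math import comb

n, p = 10, 0.5  # the coin-toss example: B_X(10, 1/2)

# Tabulate P(X = k) for every k and verify the probabilities sum to 1,
# exactly as the expansion ((1 - p) + p)^n = 1 requires.
distribution = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

for k, prob in distribution.items():
    print(f"P(X = {k:2d}) = {prob:.6f}")

print("Sum of probabilities:", sum(distribution.values()))  # 1.0
```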