
**Probability of an event**

Any subset $E$ of the sample space is known as **an event.**

That is, an event is a set consisting of possible outcomes of the experiment. If the outcome of the experiment is contained in $E$, then we say that $E$ has occurred.

**Axioms of probability**

For each event $E$, we denote $P(E)$ as the probability of event $E$ occurring.

The probability that at least one of the elementary events in the sample space $S$ will occur is $1$: $P(S)=1$.

Every probability value lies between $0$ and $1$ inclusive: $0\le P(E)\le 1$.

For any sequence of mutually exclusive events $E_{1},E_{2},\ldots,E_{n}$, we have:

$P\left(\bigcup_{i=1}^{n}E_{i}\right)=\sum_{i=1}^{n}P(E_{i})$

**Combinatorics**

**A permutation** is an arrangement of $k$ objects from a pool of $n$ objects, in a given order. The number of such arrangements is:

$P(n,k)=\dfrac{n!}{(n-k)!}$

**A combination** is an arrangement of $k$ objects from a pool of $n$ objects, where the order does not matter. The number of such arrangements is:

$C(n,k)=\dfrac{n!}{k!(n-k)!}$

We note that for $0\le k\le n$, we have $P(n,k)\ge C(n,k)$.
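Since Python 3.8, the standard library exposes both counting formulas directly, so the definitions above can be checked numerically (the values $n=5$, $k=2$ are just an illustration):

```python
from math import perm, comb

# P(n, k): ordered arrangements of k objects drawn from n -> n!/(n-k)!
# C(n, k): unordered selections of k objects drawn from n -> n!/(k!(n-k)!)
n, k = 5, 2
print(perm(n, k))  # 5!/3! = 20
print(comb(n, k))  # 5!/(2!*3!) = 10

# Sanity check of the note above: P(n, k) >= C(n, k) for 0 <= k <= n
assert all(perm(n, j) >= comb(n, j) for j in range(n + 1))
```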

**Conditional probabilities**

**Conditional probability** is the probability of one event occurring given that another event has already occurred. For events $A$ and $B$ with $P(B)>0$:

$P(A|B)=\dfrac{P(A\cap B)}{P(B)}$

**Independence.** Two events $A$ and $B$ are independent if and only if we have:

$P(A\cap B)=P(A)P(B)$
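The defining identity $P(A\cap B)=P(A)P(B)$ can be verified on a small finite sample space. A hypothetical example with one roll of a fair six-sided die:

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die
omega = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}      # "roll is even"
B = {1, 2, 3, 4}   # "roll is at most 4"

def prob(event):
    # Equally likely outcomes: |event| / |omega|
    return Fraction(len(event), len(omega))

# A and B are independent: P(A ∩ B) == P(A) * P(B), i.e. 1/3 == 1/2 * 2/3
assert prob(A & B) == prob(A) * prob(B)
```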

**Law of total probability.** Given an event $A$ and a partition $B_{1},\ldots,B_{n}$ of the sample space, with known conditional probabilities $P(A|B_{i})$ and known probabilities $P(B_{i})$, what is the total probability that $A$ will happen? The answer is:

$P(A)=\sum_{i=1}^{n}P(A|B_{i})P(B_{i})$
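As a worked instance of the law of total probability $P(A)=\sum_i P(A|B_i)P(B_i)$, here is a hypothetical setup (the factory numbers are invented for illustration), computed with exact fractions:

```python
from fractions import Fraction

# Hypothetical setup: three factories form a partition B_1, B_2, B_3 of the
# sample space; A = "a randomly chosen item is defective".
p_B = [Fraction(1, 2), Fraction(3, 10), Fraction(1, 5)]               # P(B_i), sums to 1
p_A_given_B = [Fraction(1, 100), Fraction(2, 100), Fraction(5, 100)]  # P(A | B_i)

# Law of total probability: P(A) = sum_i P(A | B_i) * P(B_i)
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)  # 1/100 * 1/2 + 2/100 * 3/10 + 5/100 * 1/5 = 21/1000
```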

**Bayes’ rule.** For events $A$ and $B$ such that $P(B)>0$, we have:

$P(A|B)=\dfrac{P(B|A)P(A)}{P(B)}$
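Bayes' rule $P(A|B)=P(B|A)P(A)/P(B)$ can be evaluated exactly for a classic diagnostic-test scenario (the prior and test rates below are hypothetical numbers chosen for illustration):

```python
from fractions import Fraction

# A = "has the condition", B = "test is positive" (hypothetical numbers)
p_A = Fraction(1, 100)              # P(A): prior probability of the condition
p_B_given_A = Fraction(9, 10)       # P(B | A): true positive rate
p_B_given_not_A = Fraction(5, 100)  # P(B | not A): false positive rate

# P(B) via the law of total probability over the partition {A, not A}
p_B = p_B_given_A * p_A + p_B_given_not_A * (1 - p_A)

# Bayes' rule: P(A | B) = P(B | A) * P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
print(p_A_given_B)  # 2/13: despite a positive test, the posterior stays small
```

Note how the small prior dominates: even with a 90% true-positive rate, the posterior probability is only $2/13\approx 0.15$.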

**Expectation and Moments of the Distribution**

The formulas will be explicitly detailed for the discrete **(D)** and continuous **(C)** cases.

**Expected value.** The expected value of a random variable, also known as the mean value or the first moment, is often noted $E[X]$ and is the value that we would obtain by averaging the results of the experiment infinitely many times.

**(D)** $E[X]=\sum_{i}x_{i}P(X=x_{i})$  **(C)** $E[X]=\int_{-\infty}^{+\infty}xf(x)\,dx$

**Variance.** The variance of a random variable, often noted $Var[X]$, is a measure of the spread of its distribution function. It is determined as follows:

$Var[X]=E[(X-E[X])^{2}]=E[X^{2}]-E[X]^{2}$
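For the discrete case, both moments can be computed exactly. A minimal sketch for a fair six-sided die (a standard worked example, not from the text above):

```python
from fractions import Fraction

# Discrete case: a fair six-sided die, P(X = x) = 1/6 for x in 1..6
values = range(1, 7)
p = Fraction(1, 6)

# E[X] = sum_x x * P(X = x)
mean = sum(x * p for x in values)
# Var[X] = E[X^2] - E[X]^2
var = sum(x * x * p for x in values) - mean ** 2

print(mean)  # 7/2
print(var)   # 91/6 - 49/4 = 35/12
```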

**Some Inequalities**

**Markov’s inequality.** Let $X$ be a non-negative random variable and $a>0$. Then:

$P(X\ge a)\le\dfrac{E[X]}{a}$

**Chebyshev’s inequality.** Let $X$ be a random variable with expected value $\mu$ and finite variance $\sigma^{2}$. Then for any $k>0$:

$P(|X-\mu|\ge k\sigma)\le\dfrac{1}{k^{2}}$
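Both bounds can be checked numerically on a small distribution. A sketch using the fair die again (Markov's inequality $P(X\ge a)\le E[X]/a$ and Chebyshev's inequality in the equivalent form $P(|X-\mu|\ge t)\le\sigma^{2}/t^{2}$; the thresholds $a=5$ and $t=2$ are arbitrary illustration choices):

```python
from fractions import Fraction

# Fair six-sided die: mu = 7/2, sigma^2 = 35/12
values = list(range(1, 7))
p = Fraction(1, 6)
mu = sum(x * p for x in values)                 # 7/2
var = sum(x * x * p for x in values) - mu ** 2  # 35/12

# Markov: P(X >= a) <= E[X]/a for non-negative X, here a = 5
a = 5
p_ge_a = sum(p for x in values if x >= a)       # P(X >= 5) = 2/6 = 1/3
assert p_ge_a <= mu / a                         # 1/3 <= 7/10

# Chebyshev with threshold t = k*sigma: P(|X - mu| >= t) <= var / t^2
t = 2
p_far = sum(p for x in values if abs(x - mu) >= t)  # only x = 1 and x = 6 qualify
assert p_far <= var / t ** 2                        # 1/3 <= 35/48
```

Both bounds hold but are loose here, which is typical: they use only the mean (Markov) or the mean and variance (Chebyshev), not the full distribution.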