Why do you see it here? Because I need it.

Probability of an event

Any subset $E$ of the sample space $S$ is known as an event.

That is, an event $E$ is a set consisting of possible outcomes of the experiment. If the outcome of the experiment is contained in $E$, then we say that $E$ has occurred.
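To make this concrete, here is a minimal Python sketch (a toy example of my own, not from any particular source) for one roll of a fair six-sided die:

```python
# Minimal sketch: sample space and an event for one roll of a fair six-sided die.
sample_space = {1, 2, 3, 4, 5, 6}   # all possible outcomes
event_even = {2, 4, 6}              # the event "the roll is even", a subset of the sample space

outcome = 4                               # suppose the experiment produced this outcome
print(event_even.issubset(sample_space))  # True: an event is a subset of the sample space
print(outcome in event_even)              # True: the event has occurred
```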


Axioms of probability

For each event $E$, we denote by $P(E)$ the probability of event $E$ occurring.

The probability that at least one of the elementary events in the sample space $S$ will occur is 1:

$$P(S) = 1$$

Every probability value is between 0 and 1 included:

$$0 \leq P(E) \leq 1$$

For any sequence of mutually exclusive events $E_1, \ldots, E_n$ we have:

$$P\left(\bigcup_{i=1}^{n} E_i\right) = \sum_{i=1}^{n} P(E_i)$$
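As a quick sanity check, the following Python sketch verifies the three axioms numerically for a small finite distribution (a fair die; the example and the event choices are mine):

```python
# Sketch: numerically checking the axioms for a finite distribution (a fair die).
p = {outcome: 1/6 for outcome in range(1, 7)}

# P(S) = 1
assert abs(sum(p.values()) - 1.0) < 1e-12

# 0 <= P(E) <= 1 for any event E, e.g. E = {1, 2}
event = {1, 2}
prob_event = sum(p[o] for o in event)
assert 0.0 <= prob_event <= 1.0

# Additivity for mutually exclusive events, e.g. {1, 2} and {5}
e1, e2 = {1, 2}, {5}
assert e1.isdisjoint(e2)
prob_union = sum(p[o] for o in e1 | e2)
assert abs(prob_union - (sum(p[o] for o in e1) + sum(p[o] for o in e2))) < 1e-12
print("All three axioms hold for this distribution.")
```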


Combinatorics

A permutation is an arrangement of $r$ objects from a pool of $n$ objects, in a given order. The number of such arrangements is:

$$P(n, r) = \frac{n!}{(n-r)!}$$

A combination is an arrangement of $r$ objects from a pool of $n$ objects, where the order does not matter. The number of such arrangements is:

$$C(n, r) = \binom{n}{r} = \frac{n!}{r!\,(n-r)!}$$

We note that for $0 \leq r \leq n$, we have $P(n, r) \geq C(n, r)$, since $P(n, r) = r!\, C(n, r)$.
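In Python, these counts are available directly in the standard library as math.perm and math.comb (Python 3.8 or later); a short sketch:

```python
import math

n, r = 5, 3
print(math.perm(n, r))   # 60 -> n! / (n - r)!        (ordered arrangements)
print(math.comb(n, r))   # 10 -> n! / (r! (n - r)!)   (unordered selections)
print(math.perm(n, r) == math.factorial(r) * math.comb(n, r))  # True: P(n, r) = r! C(n, r)
```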


Conditional probabilities

Conditional probability is the probability of one event occurring given that one or more other events have occurred. For two events $A$ and $B$ with $P(B) > 0$, the conditional probability of $A$ given $B$ is:

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$
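A small Python sketch of this definition, computed exactly by counting outcomes for two fair dice (the events chosen here are my own illustration):

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely pairs
B = [o for o in outcomes if o[0] % 2 == 0]        # event B: the first die is even
A_and_B = [o for o in B if sum(o) == 8]           # A ∩ B: ...and the total is 8

p_B = Fraction(len(B), len(outcomes))
p_A_and_B = Fraction(len(A_and_B), len(outcomes))
print(p_A_and_B / p_B)   # P(A | B) = P(A ∩ B) / P(B) = (3/36) / (18/36) = 1/6
```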

Independence. Two events $A$ and $B$ are independent if and only if we have:

$$P(A \cap B) = P(A)\,P(B)$$
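Continuing with the two-dice example (again my own illustration), the sketch below checks the product condition with exact rational arithmetic; "the first die is even" and "the total is 7" turn out to be independent, while "the total is 8" does not satisfy the condition:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))   # all 36 equally likely pairs

def prob(event):
    """Exact probability of an event given as a predicate over outcomes."""
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] % 2 == 0   # first die is even
B = lambda o: sum(o) == 7     # total is 7
C = lambda o: sum(o) == 8     # total is 8

print(prob(lambda o: A(o) and B(o)) == prob(A) * prob(B))  # True: A and B are independent
print(prob(lambda o: A(o) and C(o)) == prob(A) * prob(C))  # False: A and C are not
```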

Law of total probability. Given an event $A$, with known conditional probabilities $P(A \mid B_i)$ given any of the mutually exclusive events $B_1, \ldots, B_n$ that partition the sample space, each with a known probability $P(B_i)$ itself, what is the total probability that $A$ will happen? The answer is:

$$P(A) = \sum_{i=1}^{n} P(A \mid B_i)\, P(B_i)$$
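A sketch with made-up numbers: suppose two machines M1 and M2 produce 60% and 40% of all items, with defect rates of 2% and 5% respectively. The machines partition the sample space, so the total probability of a defect follows directly:

```python
# Toy numbers (my own): machines M1 and M2 make 60% and 40% of items,
# with defect rates of 2% and 5% respectively.
p_machine = {"M1": 0.60, "M2": 0.40}        # P(B_i): the B_i partition the sample space
p_defect_given = {"M1": 0.02, "M2": 0.05}   # P(A | B_i)

# Law of total probability: P(A) = sum over i of P(A | B_i) * P(B_i)
p_defect = sum(p_defect_given[m] * p_machine[m] for m in p_machine)
print(round(p_defect, 3))   # 0.032
```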

Bayes’ rule. For events $A$ and $B$ such that $P(B) > 0$, we have:

$$P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}$$
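Continuing the same made-up machine example, Bayes’ rule inverts the conditioning: given that an item is defective, how likely is it to have come from M2?

```python
# Same toy numbers as above.
p_machine = {"M1": 0.60, "M2": 0.40}
p_defect_given = {"M1": 0.02, "M2": 0.05}

# Denominator P(defect) via the law of total probability, then Bayes' rule.
p_defect = sum(p_defect_given[m] * p_machine[m] for m in p_machine)
p_m2_given_defect = p_defect_given["M2"] * p_machine["M2"] / p_defect
print(round(p_m2_given_defect, 3))   # 0.625
```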


Expectation and Moments of the Distribution

The formulas will be explicitly detailed for the discrete (D) and continuous (C) cases.

Expected value. The expected value of a random variable $X$, also known as the mean value or the first moment, is often noted $E[X]$ or $\mu$ and is the value that we would obtain by averaging the results of the experiment infinitely many times. It is computed as follows:

(D) $E[X] = \displaystyle\sum_{i} x_i\, P(X = x_i)$  (C) $E[X] = \displaystyle\int_{-\infty}^{+\infty} x\, f(x)\, dx$, where $f$ is the probability density function of $X$.
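For instance (a sketch for the discrete case, with the fair die as my running example), the exact sum and a Monte Carlo approximation of "averaging infinitely many times" agree:

```python
import random
from fractions import Fraction

# (D) Exact expected value of a fair six-sided die: E[X] = sum_i x_i P(X = x_i)
exact = sum(x * Fraction(1, 6) for x in range(1, 7))
print(exact)   # 7/2, i.e. 3.5

# The "average over infinitely many repetitions", approximated by simulation
random.seed(0)
samples = [random.randint(1, 6) for _ in range(100_000)]
print(sum(samples) / len(samples))   # close to 3.5
```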

Variance. The variance of a random variable $X$, often noted $\text{Var}(X)$ or $\sigma^2$, is a measure of the spread of its distribution function. It is determined as follows:

$$\text{Var}(X) = E\left[(X - E[X])^2\right] = E[X^2] - E[X]^2$$
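The same fair-die sketch, computing the variance from both sides of the identity above with exact rational arithmetic:

```python
from fractions import Fraction

# Variance of a fair six-sided die via both expressions in the identity above.
p = Fraction(1, 6)
e_x = sum(x * p for x in range(1, 7))                  # E[X] = 7/2
e_x2 = sum(x**2 * p for x in range(1, 7))              # E[X^2] = 91/6
var_def = sum((x - e_x)**2 * p for x in range(1, 7))   # E[(X - E[X])^2]
var_alt = e_x2 - e_x**2                                # E[X^2] - E[X]^2
print(var_def, var_alt)   # both 35/12 (about 2.92)
```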


Some Inequalities

Markov’s inequality. Let $X$ be a random variable taking non-negative values and $a > 0$, then:

$$P(X \geq a) \leq \frac{E[X]}{a}$$

Chebyshev’s inequality. Let $X$ be a random variable with expected value $\mu$ and variance $\sigma^2$. For $k > 0$, we have:

$$P(|X - \mu| \geq k\sigma) \leq \frac{1}{k^2}$$
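As an empirical sanity check (a simulation sketch, not a proof), the code below draws samples from an exponential distribution with mean 1 and confirms that both Markov’s and Chebyshev’s bounds hold; the particular choices $a = 3$ and $k = 2$ are mine:

```python
import random

# Monte Carlo check of both inequalities on an exponential distribution (mean 1, std 1).
random.seed(0)
samples = [random.expovariate(1.0) for _ in range(100_000)]
n = len(samples)
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
std = var ** 0.5

# Markov: P(X >= a) <= E[X] / a, for non-negative X and a = 3
a = 3.0
p_tail = sum(x >= a for x in samples) / n
print(p_tail, "<=", mean / a)        # roughly 0.05 <= roughly 0.33

# Chebyshev: P(|X - mu| >= k*sigma) <= 1 / k^2, for k = 2
k = 2.0
p_dev = sum(abs(x - mean) >= k * std for x in samples) / n
print(p_dev, "<=", 1 / k**2)         # roughly 0.05 <= 0.25
```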