This article introduces basic mathematical concepts in probability. Future articles will discuss different aspects, including several paradoxical situations involving probabilities. For those who can’t wait, see Solve the Monty Hall Problem Using Logic and Mathematics.
Probability, Statistics or Likelihood?
In mathematics, “probability” is the study of how likely it is for some specific outcome to occur as the result of an event. This is expressed either as a percentage from zero to 100%, or as a number between 0.0 and 1.0 inclusive.
“Statistics” is the analysis of the events ruled by probabilities. Often the statistician is concerned with the distribution of probabilities. In the far future, we may delve into statistics.
A related word is “likelihood”, which refers to the odds of the outcomes of past events. By contrast, “probability” refers to future events.
Some examples might help:
- If you plan to flip a coin to make a decision, the probability of the “heads” outcome is 0.5, or 50%. Afterward, when you explain why you watched a “Headhunter” movie rather than playing “Pin the Tail on the Donkey”, you would say the “likelihood” of either activity was 50%. From the statistical point of view, a coin toss is well modeled as a Bernoulli distribution with probability parameter 0.5.
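The coin-toss example above can be sketched in a few lines of Python. This is a minimal simulation, with the function name my own invention: it draws repeated Bernoulli trials with p = 0.5 and shows that the observed frequency of heads settles near the probability.

```python
import random

def simulate_coin_tosses(n_tosses, seed=None):
    """Simulate n_tosses fair coin flips (Bernoulli trials with p = 0.5)
    and return the fraction that came up heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# With many tosses, the observed frequency approaches the probability 0.5.
fraction = simulate_coin_tosses(100_000, seed=42)
```

For a small number of tosses the fraction can stray well away from 0.5; only as the count grows does it reliably hug the theoretical value.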
It seems likely that the first people to study probability were motivated by gambling. This particular article will not prepare anyone to win at a casino. However, the foundation to avoid losing is to realize that the professionals already know the odds.
Discrete and Continuous Probability
The act of tossing a coin for a “head” or “tail” is an example of a discrete probability distribution. Other discrete outcomes may be generated by rolling dice or selecting playing cards or domino tiles.
Although a coin toss has only two outcomes, other discrete distributions might have a countably infinite number of outcomes. (A “countably infinite” set can be placed in a one-to-one correspondence with the positive integers.) A simple example of a discrete infinite distribution would model the probability of finding “runs” of three heads in a row while repeatedly tossing a coin.
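A rough Python sketch (function name hypothetical) makes the countably infinite support concrete: counting how many tosses it takes to see three heads in a row can, in principle, take any number of tosses from three upward.

```python
import random

def tosses_until_three_heads(rng):
    """Count how many tosses it takes to see three heads in a row.
    The support is countably infinite: any count >= 3 is possible,
    just increasingly unlikely."""
    run = 0
    tosses = 0
    while run < 3:
        tosses += 1
        if rng.random() < 0.5:  # heads extends the run
            run += 1
        else:                   # tails resets the run
            run = 0
    return tosses

rng = random.Random(0)
samples = [tosses_until_three_heads(rng) for _ in range(10_000)]
average = sum(samples) / len(samples)
# The expected wait for a run of k heads with a fair coin is 2^(k+1) - 2,
# so for k = 3 the sample average should land near 14.
```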
Continuous probability distributions also exist. Examples include predicting how long a light bulb might shine before it wears out, or the amount of rainfall collected at a location over a given period of time.
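The light-bulb example can be sketched with one common (and deliberately simplified) continuous model, the exponential distribution; the function name and 1000-hour mean here are my own assumptions, not claims about real bulbs.

```python
import random

def simulate_bulb_lifetimes(n_bulbs, mean_hours, seed=None):
    """Draw n_bulbs lifetimes from an exponential distribution, a
    common simplified continuous model for time-to-failure.
    random.expovariate takes the rate, i.e. 1 / mean."""
    rng = random.Random(seed)
    return [rng.expovariate(1.0 / mean_hours) for _ in range(n_bulbs)]

lifetimes = simulate_bulb_lifetimes(50_000, mean_hours=1000.0, seed=1)
# Unlike a coin toss, each outcome is a real number (hours of life),
# and the sample mean should land near the 1000-hour model mean.
```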
The first articles will only deal with discrete probabilities.
Defining a Sample Space
We need some more definitions before making progress. These are simplified, rather than properly formal.
- Event: One “event” gives only one outcome: an example is a single roll of a die. Often we are interested in a set of events, such as rolling dice five times to determine which combinations of numbers are seen most frequently.
- Sample Space: The “sample space” of such a set of events is itself a set, containing all the possible outcomes. One says that an event is in a sample space.
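The definitions above map neatly onto Python sets; this sketch uses a six-sided die (my choice of example) to show a sample space as the set of all possible outcomes.

```python
# Sample space for a single roll of a six-sided die:
# the set of every possible outcome.
sample_space = {1, 2, 3, 4, 5, 6}

# "An event is in a sample space" is literally a membership test.
rolled_a_four = 4 in sample_space

# For a set of events (two rolls), the sample space grows to
# every ordered pair of outcomes: 6 * 6 = 36 possibilities.
two_roll_space = {(a, b) for a in sample_space for b in sample_space}
```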
The Probability Axioms
Mathematicians like to begin with axioms: statements that seem true and upon which a self-consistent mathematical topic can be constructed. Here, we begin with the Probability Axioms.
Define a discrete sample space, S, with N elements. ‘N’ could be a finite number or countably infinite, such as the cardinality of the set of integers.
We then say that the jth event E[j] is in S, and that S is the union of the E[j] for all j from one to N. Each event is considered a set, and the whole sample space is also a set.
There is also a probability, P(E[j]), for each event, and a probability, P(S), for the whole sample space. A computer programmer would consider ‘P()’ to be a function that takes a set as input and returns the numeric value of the probability.
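The programmer’s view of P() can be sketched directly, assuming (my example, not the article’s) a fair six-sided die where each elementary event E[j] is a single outcome with probability 1/6. Exact fractions avoid floating-point rounding.

```python
from fractions import Fraction

# Sample space S for one roll of a fair die; each elementary event
# E[j] is one outcome, each with probability 1/6.
S = {1, 2, 3, 4, 5, 6}
elementary = {outcome: Fraction(1, 6) for outcome in S}

def P(event):
    """Probability function: takes an event (a subset of S) and
    returns its probability by summing elementary probabilities."""
    return sum(elementary[outcome] for outcome in event)

# P of a single event is 1/6, and P of the whole sample space is 1,
# since S is the union of all the E[j].
```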
Decoding Science. One article at a time.