What is a Probability Mass Function?
A probability mass function (PMF) is a mathematical function that calculates the probability a discrete random variable will be a specific value. PMFs also describe the probability distribution for the full range of values for a discrete variable. A discrete random variable can take on a finite or countably infinite number of possible values, such as the number of heads in a series of coin flips or the number of customers who visit a store on a given day.
Probability mass functions find the likelihood of a particular outcome. For example, we can use a PMF to calculate the probability of getting exactly three heads in a series of coin flips. This process involves plugging the value into the correct probability mass function and calculating the likelihood.
Using a PMF to calculate the likelihoods for all possible values of the discrete variable produces its probability distribution.
Read on to learn more about using probability mass functions, work through an example, and learn about the various types.
Learn more about Random Variables: Discrete & Continuous.
PMF vs PDF
PMFs and probability density functions (PDFs) both find likelihoods for random variables and can produce probability distributions. The table below summarizes their differences.
| PMF | PDF |
|---|---|
| Use for discrete random variables. | Use for continuous random variables. |
| Finds the probability that the variable can take on one of its discrete values. | Finds the probability that the variable will lie within a range of values. |
Probability Mass Function Notation and Details
For a discrete random variable, each possible value must have a non-zero probability. Additionally, the probabilities for all possible values must sum to one. Because the total probability equals 1, one of the possible values must occur on every trial.
For example, the chance of rolling a particular number on a die is 1/6. The total probability for all six outcomes equals one. For a single roll of a die, you inevitably obtain one of the possible values. Or, when you flip a coin five times, you’ll always get 0 to 5 heads because each outcome has a non-zero chance and they sum to 1.
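The die example makes both properties easy to verify directly. Here's a minimal Python sketch (my own illustration, using exact fractions so no rounding obscures the sum):

```python
from fractions import Fraction

# PMF of a fair six-sided die: each face has probability 1/6.
die_pmf = {face: Fraction(1, 6) for face in range(1, 7)}

# Every possible value has a non-zero probability...
assert all(p > 0 for p in die_pmf.values())

# ...and the probabilities across all possible values sum to exactly 1.
total = sum(die_pmf.values())
print(total)  # 1
```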
If the probability mass function has a finite number of values, you can list all the outcomes and their likelihoods in a table, producing its probability distribution, which I’ll show in the next section.
The standard notation for a probability mass function is P(X = x) = f(x), where:
- X is the discrete random variable.
- x is one of the possible discrete values.
- f(x) is a mathematical function that calculates the likelihood for the value x.
So, putting it all together, P(X = x) = f(x) means: The chance of variable X assuming the specific value of x equals f(x).
For example, when considering three heads in a series of coin tosses, the PMF notation is: P(Heads = 3) = f(3).
But what function do we use to perform the calculation?
That depends on the discrete variable.
I’ll work through an example in the next section and cover several types of PMF after that.
Probability Mass Function Example
The coin flip scenario requires the binomial distribution because it calculates the probability of exactly x events occurring in n trials. For example, it can find the likelihood of 3 heads occurring in five coin flips.
f(x) = nCx * p^x * (1 – p)^(n – x)
- n = number of trials.
- x = number of successes.
- p = probability of success.
- nCx = number of combinations without repetition.
For this example, I’ll skip the calculations. However, read my post about the Binomial Distribution for a more in-depth look and a worked example.
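Although I'm skipping the hand calculations, here's a short Python sketch of the binomial PMF using the standard library's `math.comb` for nCx (an illustration of the formula above, not a substitute for a statistics library):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x): probability of exactly x successes in n trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Probability of exactly 3 heads in 5 flips of a fair coin.
p_three = binomial_pmf(3, 5, 0.5)
print(p_three)  # 0.3125

# The likelihoods for all possible values (0 to 5 heads) sum to 1.
print(sum(binomial_pmf(x, 5, 0.5) for x in range(6)))  # 1.0
```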
The following table depicts the likelihood of obtaining each number of heads from 0 to 5 when you flip a coin five times.
| Number of Heads | Probability |
|---|---|
| 0 | 0.03125 |
| 1 | 0.15625 |
| 2 | 0.31250 |
| 3 | 0.31250 |
| 4 | 0.15625 |
| 5 | 0.03125 |
Notice how we obtain non-zero likelihoods for all outcomes, and they sum to 1 exactly.
You can also depict this information graphically to see the distribution.
Types of PMFs
Choosing the correct probability mass function depends on the nature of the discrete variable and what you need to model. Fortunately, PMFs exist for the following distributions, among others:
- Bernoulli for a single trial.
- Binomial for the number of successes within a set of trials, such as heads in a series of coin tosses.
- Poisson for count data, such as the count of store patrons per day.
- Uniform for events with equal probabilities, such as rolling a die.
- Negative binomial for the number of failures before a specified number of successes occur, such as the number of attempts before making three successful sales.
- Geometric for the number of trials before a success occurs, such as the number of tosses before getting heads on a coin.
- Hypergeometric for sampling without replacement, such as the number of red balls drawn from a jar of mixed colored balls, where the proportion of red balls is known.
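To give a feel for how a few of these PMFs look as formulas, here are hand-rolled Python sketches of the Poisson, geometric, and hypergeometric functions (illustrative only; in practice you'd reach for a statistics library):

```python
from math import comb, exp, factorial

def poisson_pmf(k, lam):
    """Count data: P(X = k) when events occur at an average rate lam."""
    return exp(-lam) * lam**k / factorial(k)

def geometric_pmf(k, p):
    """Number of trials until the first success: P(X = k), k >= 1."""
    return (1 - p)**(k - 1) * p

def hypergeometric_pmf(k, N, K, n):
    """Sampling without replacement: k red balls in a draw of n
    from a jar of N balls, K of which are red."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Chance of exactly 2 store patrons in a day when the average is 4.
print(round(poisson_pmf(2, 4), 4))  # 0.1465
# Chance the first heads appears on the third coin toss.
print(geometric_pmf(3, 0.5))        # 0.125
```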
Click the links above to learn in-depth about the various probability mass functions!
Note that probability mass functions find the likelihood for X = x. Use a cumulative distribution function to find the probability of X ≤ x.
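To make that distinction concrete, here's a small Python sketch: for a discrete variable, the cumulative distribution function is simply the running sum of the PMF (the helper names are my own, reusing the binomial formula from earlier):

```python
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x): probability of exactly x successes in n trials."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

def binomial_cdf(x, n, p):
    """P(X <= x): sum the PMF over every value up to and including x."""
    return sum(binomial_pmf(k, n, p) for k in range(x + 1))

# Exactly 3 heads vs. at most 3 heads in 5 flips of a fair coin.
print(binomial_pmf(3, 5, 0.5))  # 0.3125
print(binomial_cdf(3, 5, 0.5))  # 0.8125
```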
Finally, learn how to determine whether a discrete distribution is appropriate for your data by reading my post Goodness-of-Fit Tests for Discrete Distributions.