Probability Mass Function: Definition, Uses & Example

By Jim Frost

What is a Probability Mass Function?

A probability mass function (PMF) is a mathematical function that calculates the probability that a discrete random variable takes a specific value. PMFs also describe the probability distribution for the full range of values of a discrete variable. A discrete random variable can take on a finite or countably infinite number of possible values, such as the number of heads in a series of coin flips or the number of customers who visit a store on a given day.

Probability mass functions find the likelihood of a particular outcome. For example, we can use a PMF to calculate the probability of getting exactly three heads in a series of coin flips. This process involves plugging the value into the correct probability mass function and calculating the likelihood.

Using a PMF to calculate the likelihoods for all possible values of the discrete variable produces its probability distribution.
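To make that concrete, here's a minimal Python sketch (the five-flip, fair-coin setup is an illustrative assumption) that uses scipy.stats.binom to evaluate the PMF at one value and then at every possible value:

    from scipy.stats import binom

    n, p = 5, 0.5  # assumed for illustration: 5 flips of a fair coin

    # Probability of exactly 3 heads: a single PMF evaluation.
    print(binom.pmf(3, n, p))  # 0.3125

    # Evaluating the PMF at every possible value produces the
    # full probability distribution.
    for k in range(n + 1):
        print(k, binom.pmf(k, n, p))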

Read on to learn more about using probability mass functions, work through an example, and learn about the various types.

Learn more about Random Variables: Discrete & Continuous.

PMF vs PDF

PMFs and probability density functions (PDFs) both find likelihoods for random variables and can produce probability distributions. The table below summarizes their differences.

  • PMF: Use for discrete random variables. Finds the probability that the variable takes on one of its discrete values.
  • PDF: Use for continuous random variables. Finds the probability that the variable will lie within a range of values.

Related posts: Discrete vs. Continuous, Probability Density Functions (PDFs), and Probability Distributions.

Probability Mass Function Notation and Details

For a discrete random variable, each possible value must have a non-zero likelihood. Additionally, the probabilities for all possible values must sum to one. Because the total probability is 1, one of the possible values must occur on each trial.

For example, the chance of rolling any particular number on a die is 1/6, and the total probability for all six outcomes equals one. Consequently, a single roll of a die inevitably produces one of the possible values. Likewise, when you flip a coin five times, you'll always get between 0 and 5 heads because each of those outcomes has a non-zero chance and their probabilities sum to 1.
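Both properties are easy to verify directly. The short Python sketch below encodes the fair-die PMF as a plain dictionary (just one convenient representation) and checks that every probability is non-zero and that they all sum to one:

    import math

    # PMF of a fair six-sided die: each face has probability 1/6.
    die_pmf = {face: 1/6 for face in range(1, 7)}

    assert all(prob > 0 for prob in die_pmf.values())  # every value is possible
    assert math.isclose(sum(die_pmf.values()), 1.0)    # total probability is 1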

If the probability mass function has a finite number of values, you can list all the outcomes and their likelihoods in a table, producing its probability distribution, which I’ll show in the next section.

The standard notation for a probability mass function is P(X = x) = f(x), where:

  • X is the discrete random variable.
  • x is one of the possible discrete values.
  • f(x) is a mathematical function that calculates the likelihood for the value of x.

So, putting it all together, P(X = x) = f(x) means: the chance of variable X assuming the specific value x equals f(x).

For example, when considering three heads in a series of coin tosses, the PMF notation is: P(Heads = 3) = f(3).

But what function do we use to perform the calculation?

That depends on the discrete variable.

I’ll work through an example in the next section and cover several types of PMF after that.

Probability Mass Function Example

The coin flip scenario requires the binomial distribution because it calculates the probability of exactly x events occurring in n trials. For example, it can find the likelihood of 3 heads occurring in 5 coin flips.

f(x) = nCx * p^x * (1 – p)^(n – x)

Where:

  • n = number of trials.
  • x = number of successes.
  • p = probability of success.
  • nCx = number of combinations without repetition.

For this example, I’ll skip the calculations. However, read my post about the Binomial Distribution for a more in-depth look and a worked example.
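That said, if you'd like to see the arithmetic, here is a minimal Python sketch of the formula above, using math.comb for nCx and the n = 5, p = 0.5 settings from this example:

    from math import comb

    def binomial_pmf(x, n, p):
        """Binomial PMF: probability of exactly x successes in n trials."""
        return comb(n, x) * p**x * (1 - p)**(n - x)

    # Probability of exactly 3 heads in 5 flips of a fair coin.
    print(binomial_pmf(3, 5, 0.5))  # 0.3125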

The following table depicts the likelihood of obtaining each number of heads from 0 to 5 when you flip a coin five times.

Number of Heads    Probability
0                  0.03125
1                  0.15625
2                  0.31250
3                  0.31250
4                  0.15625
5                  0.03125
Total              1.00000

Notice how we obtain non-zero likelihoods for all outcomes, and they sum to 1 exactly.

You can also depict this information graphically to see the distribution.
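As a rough sketch of how you might draw that graph yourself, the Python snippet below plots the table above as a bar chart with matplotlib (the labels and styling are illustrative choices):

    import matplotlib.pyplot as plt

    heads = [0, 1, 2, 3, 4, 5]
    probs = [0.03125, 0.15625, 0.31250, 0.31250, 0.15625, 0.03125]

    # A bar chart is the usual way to display a discrete distribution.
    plt.bar(heads, probs)
    plt.xlabel("Number of Heads")
    plt.ylabel("Probability")
    plt.title("PMF of Heads in 5 Fair Coin Flips")
    plt.show()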

[Graph displaying the probability mass function for the coin flip example.]

Types of PMFs

Choosing the correct probability mass function depends on the nature of the discrete variable and what you need to model. Fortunately, various PMFs exist for the following distributions, among others:

  • Bernoulli for a single trial.
  • Binomial for the number of successes within a set of trials, such as heads in a series of coin tosses.
  • Poisson for count data, such as the count of store patrons per day.
  • Uniform for events with equal probabilities, such as rolling a die.
  • Negative binomial for the number of failures before a specified number of successes occur, such as the number of attempts before making three successful sales.
  • Geometric for the number of trials before a success occurs, such as the number of tosses before getting heads on a coin.
  • Hypergeometric for sampling without replacement, such as the number of red balls drawn from a jar of mixed colored balls, where the proportion of red balls is known.

Click the links above to learn in-depth about the various probability mass functions!
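As a quick illustration, the sketch below evaluates each of these PMFs with scipy.stats, using arbitrary illustrative parameters. Be aware that SciPy's parameterizations sometimes differ slightly from the informal descriptions above (for instance, scipy's geom counts the trial on which the first success occurs):

    from scipy import stats

    print(stats.bernoulli.pmf(1, p=0.5))           # single trial: P(success)
    print(stats.binom.pmf(3, n=5, p=0.5))          # 3 heads in 5 flips
    print(stats.poisson.pmf(2, mu=3))              # 2 patrons when the mean count is 3
    print(stats.randint.pmf(4, low=1, high=7))     # rolling a 4 on a fair die (1/6)
    print(stats.nbinom.pmf(2, n=3, p=0.4))         # 2 failures before the 3rd success
    print(stats.geom.pmf(2, p=0.5))                # first heads on the 2nd toss
    print(stats.hypergeom.pmf(2, M=20, n=7, N=5))  # 2 red in 5 draws from 20 balls, 7 red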

Note that probability mass functions find the likelihood for X = x. Use a cumulative distribution function to find the probability of X ≤ x.
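In code terms, that's the difference between a distribution's pmf and cdf methods. A minimal sketch with the coin-flip numbers from this example:

    from scipy.stats import binom

    print(binom.pmf(3, 5, 0.5))  # P(X = 3)  -> 0.3125
    print(binom.cdf(3, 5, 0.5))  # P(X <= 3) -> 0.8125 (sum of the PMF at 0..3)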

Finally, learn how to determine whether a discrete distribution is appropriate for your data by reading my post Goodness-of-Fit Tests for Discrete Distributions.
