
Statistics By Jim

Making statistics intuitive


Bernoulli Distribution: Uses, Formula & Example

By Jim Frost

What is the Bernoulli Distribution?

The Bernoulli distribution is a discrete probability distribution that models a binary outcome for one trial. Use it for a random variable that can take one of two outcomes: success (k = 1) or failure (k = 0), much like a coin toss. Statisticians refer to these trials as Bernoulli trials.

Bernoulli trials deal with events having clear-cut outcomes that you can phrase as yes-or-no questions. Consider the following Bernoulli trial examples:

  • Will the next coin toss show heads?
  • Will the die roll produce a 6?
  • Will a specific song play next on shuffle?
  • Will I win the next lottery?

The Bernoulli distribution is a special case of the Binomial distribution because it assesses only one trial (n = 1). In contrast, the Binomial distribution extends it by tallying successes over multiple trials (n > 1).
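You can see this relationship in a few lines of code: summing n independent Bernoulli trials produces one Binomial draw. Here's a minimal sketch using only Python's standard random module (the function names are illustrative, not from any library):

```python
import random

random.seed(42)  # for reproducibility

def bernoulli_trial(p):
    """One Bernoulli trial: return 1 (success) with probability p, else 0 (failure)."""
    return 1 if random.random() < p else 0

def binomial_draw(n, p):
    """One Binomial(n, p) draw: the number of successes in n Bernoulli trials."""
    return sum(bernoulli_trial(p) for _ in range(n))

single = bernoulli_trial(0.5)   # a single trial: 0 or 1
total = binomial_draw(10, 0.5)  # successes across 10 trials: somewhere from 0 to 10
```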

The Bernoulli distribution is one of the simpler discrete distributions. It is a starting point for more complex distributions that model a series of trials, such as the binomial, geometric, and negative binomial distributions—critical players in statistics.

Learn more about Probability Distributions.

Bernoulli Distribution Formula

The Bernoulli distribution uses the following notation:

  • p = the probability of success.
  • q = the probability of failure (1 – p).

The probability of success and failure must sum to 1 because each trial must always end with a success or failure: p + q = 1. Therefore, using simple algebra, the probability of failure (q) equals 1 – p.

A Probability Mass Function (PMF) describes the distribution of outcomes for a discrete probability function like the Bernoulli distribution. Typically, the outcomes are denoted as k = 1 for a success and k = 0 for a failure.

The PMF below describes the probability of each outcome in the Bernoulli distribution:

P(X = k) = p^k (1 – p)^(1–k), for k = 0, 1

That’s a fancy way of saying that the likelihood of success is p and the chance of failure is 1 – p.
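In code, the PMF is nearly a one-liner. Here's a sketch in plain Python (the function name is mine, just for illustration):

```python
def bernoulli_pmf(k, p):
    """P(X = k) for a Bernoulli(p) variable, where k must be 0 or 1."""
    if k not in (0, 1):
        raise ValueError("k must be 0 or 1")
    # Equivalent closed form: p**k * (1 - p)**(1 - k)
    return p if k == 1 else 1 - p
```

For example, bernoulli_pmf(1, 0.7) returns 0.7, and bernoulli_pmf(0, 0.7) returns 0.3.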

The formulas for the mean and variance of the Bernoulli distribution are also simple.

  • Mean = p
  • Variance = p(1 – p) = pq

The variance of the Bernoulli distribution always falls between 0 and 0.25, inclusive. It equals 0 when p = 0 or p = 1 and reaches its maximum of 0.25 when p = 0.5.
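You can check this pattern by tabulating p(1 – p) for a few values of p. A quick sketch:

```python
def bernoulli_variance(p):
    """Variance of a Bernoulli(p) variable: p(1 - p)."""
    return p * (1 - p)

# Variance is 0 at the extremes (p = 0 or 1) and peaks at p = 0.5.
variances = {p: bernoulli_variance(p) for p in (0.0, 0.25, 0.5, 0.75, 1.0)}
```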

Example

Imagine testing a lightbulb from a new manufacturer. Does this lightbulb work? Either it does (success) or does not (failure). If the probability that the lightbulb works is p = 0.7, you know the probability of failure is q = 0.3. The graph below illustrates this function.

Graph displaying the Bernoulli distribution.

Next, we’ll calculate the mean and variance using the previous Bernoulli distribution formulas.

  • Mean = p = 0.7
  • Variance = 0.7(1 – 0.7) = 0.7 × 0.3 = 0.21
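If you'd like to check these numbers empirically, here's a quick simulation sketch (plain Python, seeded for reproducibility) that draws many lightbulb tests and compares the sample mean and variance to the formulas:

```python
import random

random.seed(1)  # reproducible draws

p = 0.7  # assumed probability that a lightbulb works
draws = [1 if random.random() < p else 0 for _ in range(100_000)]

sample_mean = sum(draws) / len(draws)
sample_var = sum((x - sample_mean) ** 2 for x in draws) / len(draws)
# Theory says mean = 0.7 and variance = 0.21; the estimates land close by.
```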

Binomial vs Bernoulli Distribution: A Quick Comparison

Finally, let’s dive into a side-by-side comparison of the Binomial and Bernoulli distributions to understand their unique characteristics.

                    Bernoulli                    Binomial
Use                 Single trial                 Multiple trials
Notation            p                            n (trials), p (success probability)
Mean Formula        p                            np
Variance Formula    p(1 – p)                     np(1 – p)
Example             A lightbulb working or not   Number of successes in 10 lightbulb tests
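The formula columns of this comparison translate directly into code. A sketch (the function names are mine, for illustration only):

```python
def bernoulli_stats(p):
    """Mean and variance of a single Bernoulli trial."""
    return {"mean": p, "variance": p * (1 - p)}

def binomial_stats(n, p):
    """Mean and variance of the number of successes in n independent trials."""
    return {"mean": n * p, "variance": n * p * (1 - p)}

one_bulb = bernoulli_stats(0.7)      # a single lightbulb test
ten_bulbs = binomial_stats(10, 0.7)  # 10 lightbulb tests
```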

In summary, the Bernoulli distribution gives us a simple yet powerful tool to understand binary outcomes. Whether you’re flipping coins or testing lightbulbs, these distributions can shed light (pun intended) on the probabilities!

Now that you’re familiar with this foundational distribution, build on it by learning about the more complex distributions for a series of Bernoulli trials:

  • Binomial: the number of successes in n trials.
  • Geometric: the number of failures before the first success.
  • Negative binomial: the number of failures before the Xth success.
  • Hypergeometric: like the binomial, but sampling without replacement from a small population.
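To make that connection concrete, here's a sketch of a geometric draw built from nothing but repeated Bernoulli trials (plain Python; the names are illustrative):

```python
import random

random.seed(0)  # for reproducibility

def geometric_draw(p):
    """Count the failures before the first success in repeated Bernoulli(p) trials."""
    failures = 0
    while random.random() >= p:  # each comparison is one Bernoulli trial
        failures += 1
    return failures

g = geometric_draw(0.5)  # a non-negative integer
```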


    Copyright © 2023 · Jim Frost · Privacy Policy