
Statistics By Jim

Making statistics intuitive


ANOVA Overview

What is ANOVA?

Analysis of variance (ANOVA) assesses the differences between group means. It is a statistical hypothesis test that determines whether the means of at least two populations are different. At a minimum, you need a continuous dependent variable and a categorical independent variable that divides your data into comparison groups to perform ANOVA.

Researchers commonly use ANOVA to analyze designed experiments. In these experiments, researchers use randomization and control the experimental factors in the treatment and control groups. For example, a product manufacturer sets the time and temperature settings in its process and records the product’s strength. ANOVA analyzes the differences in mean outcomes stemming from these experimental settings to estimate their effects and statistical significance. These designed experiments are typically orthogonal, which provides important benefits. Learn more about orthogonality.

Additionally, ANOVA is useful for observational studies. For example, a researcher can observe outcomes for several education methods and use ANOVA to analyze the group differences. However, as with any observational study, you must be careful about the conclusions you draw.

The term "analysis of variance" originates from how the analysis uses variances to determine whether the means are different. ANOVA works by comparing the variance of group means to the variance within groups. This process determines if the groups are part of one larger population or separate populations with different means. Consequently, even though it analyzes variances, it actually tests means! To learn more about this process, read my post, The F-test in ANOVA.
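
To see this mechanically, here's a minimal sketch in Python (hypothetical, randomly generated data) that computes the F-statistic as the ratio of the between-group mean square to the within-group mean square, then cross-checks the result against SciPy's built-in one-way ANOVA:

```python
import numpy as np
from scipy import stats

# Hypothetical data: three groups of 20 observations each
rng = np.random.default_rng(42)
groups = [rng.normal(loc=m, scale=2.0, size=20) for m in (10, 10, 13)]

k = len(groups)                                   # number of groups
n_total = sum(len(g) for g in groups)             # total observations
grand_mean = np.concatenate(groups).mean()

# Between-group mean square: variability of the group means
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ms_between = ss_between / (k - 1)

# Within-group mean square: pooled variability inside the groups
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
ms_within = ss_within / (n_total - k)

# F is the ratio; a large ratio suggests the group means differ
F = ms_between / ms_within
p = stats.f.sf(F, k - 1, n_total - k)

# Cross-check against SciPy's one-way ANOVA
F_check, p_check = stats.f_oneway(*groups)
```

Even though every quantity above is a variance, the hypothesis being tested is about the means, which is exactly the point of the section.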

The fundamentals of ANOVA tests are relatively straightforward because they involve comparing means between groups. However, there is an array of elaborations. In this post, I start with the simpler forms and explain the essential jargon. Then I’ll broadly cover the more complex methods.

Simplest Form and Basic Terms of ANOVA Tests

The simplest type of ANOVA test is one-way ANOVA. This method is a generalization of t-tests that can assess the difference between more than two group means.

[Figure: Boxplot of test scores by teaching method.]
ANOVA tells you whether the differences between group means are statistically significant.

Statisticians consider ANOVA to be a special case of least squares regression, which is a specialization of the general linear model. All these models minimize the sum of the squared errors.

Related post: The Mean in Statistics

Factors and Factor Levels

To perform the most basic ANOVA, you need a continuous dependent variable and a categorical independent variable. In ANOVA lingo, analysts refer to “categorical independent variables” as factors. The categorical values of a factor are “levels.” These factor levels create the groups in the data. Factor level means are the means of the dependent variable associated with each factor level.

For example, your factor might have the following three levels in an experiment: Control, Treatment 1, and Treatment 2. The ANOVA test will determine whether the mean outcomes for these three conditions (i.e., factor levels) are different.
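
As a quick sketch of this example (using Python's SciPy rather than a dedicated statistics package, and entirely hypothetical outcome data), a one-way ANOVA on these three factor levels looks like this:

```python
from scipy import stats

# Hypothetical outcomes for the three factor levels
control    = [72, 75, 71, 74, 73]
treatment1 = [78, 80, 77, 81, 79]
treatment2 = [69, 70, 68, 72, 71]

# One call tests whether any of the three factor level means differ
result = stats.f_oneway(control, treatment1, treatment2)
print(f"F = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```

A p-value below your significance level indicates that at least one of the factor level means is different.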

Related posts: Independent and Dependent Variables and Control Groups in Experiments

Factor Level Combinations

ANOVA allows you to include more than one factor. For example, you might want to determine whether gender and college major correspond to differences in income.

Gender is one factor with two levels: Male and Female. College major is another factor with three levels in our fictional study: Statistics, Psychology, and Political Science.

The combination of these two factors (2 genders X 3 majors) creates the following six groups:

  • Male / Statistics
  • Female / Statistics
  • Male / Psychology
  • Female / Psychology
  • Male / Political Science
  • Female / Political Science

 

These groups are the factor level combinations. ANOVA determines whether the mean incomes for these groups are different.

Factorial ANOVA

ANOVA allows you to assess multiple factors simultaneously. Factorial ANOVAs are cases where your data include observations for all the factor level combinations that your model specifies. For example, using the gender and college major model, you are performing factorial ANOVA if your dataset includes income observations for all six groups.

By assessing multiple factors together, factorial ANOVA allows your model to detect interaction effects, which makes it much more efficient than analyzing one factor at a time. Additionally, evaluating a single factor at a time conceals interaction effects. Single-factor analyses tend to produce inconsistent results when interaction effects exist because they cannot model those effects.

Analysts frequently use factorial ANOVA in experiments because they efficiently test the main and interaction effects for all experimental factors.

Related post: Understanding Interaction Effects

Interpreting the ANOVA Test

ANOVA assesses differences between group means.

Suppose you compare two new teaching methods to the standard practice and want to know if the average test scores for the methods are different. Your factor is Teaching Method, and it contains the following three levels: Standard, Method A, and Method B.

The factor level means are the mean test score associated with each group.

ANOVAs evaluate the differences between the means of the dependent variable for the factor level combinations. The hypotheses for the ANOVA test are the following:

  • Null Hypothesis: The group means are all equal.
  • Alternative Hypothesis: At least one mean is different.

 

When the p-value is below your significance level, reject the null hypothesis. Your data favor the position that at least one group mean is different from the others.

While a significant ANOVA result indicates that at least one mean differs, it does not specify which one. To identify which differences between pairs of means are statistically significant, you’ll need to perform a post hoc analysis.
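
As a sketch of that post hoc step (using statsmodels' Tukey HSD implementation and hypothetical test scores), you would follow a significant ANOVA result with pairwise comparisons like this:

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical test scores for the three teaching methods
scores = np.array([72, 75, 71, 74, 73,       # Standard
                   78, 80, 77, 81, 79,       # Method A
                   69, 70, 68, 72, 71])      # Method B
methods = ["Standard"] * 5 + ["Method A"] * 5 + ["Method B"] * 5

# Tukey's HSD compares every pair of means while controlling
# the experiment-wise error rate
tukey = pairwise_tukeyhsd(endog=scores, groups=methods, alpha=0.05)
print(tukey.summary())
```

The summary lists each pair of group means with an adjusted confidence interval and a reject/fail-to-reject decision for that pair.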

Related post: How to Interpret P Values

General ANOVA Assumptions

ANOVA tests have the same assumptions as other linear models, except that they also require a factor. Specifically:

  • The dependent variable is continuous.
  • You have at least one categorical independent variable (factor).
  • The observations are independent.
  • The groups should have roughly equal variances (scatter).
  • The data in the groups should follow a normal distribution.
  • The residuals satisfy the ordinary least squares assumptions.

 

While ANOVA assumes your data follow the normal distribution, it is robust to violations of this assumption when your groups have at least 15 observations.
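
If you want to check the equal-variance and normality assumptions before running the analysis, here's a minimal sketch using SciPy's Levene and Shapiro-Wilk tests on hypothetical groups:

```python
from scipy import stats

# Hypothetical groups to check before running ANOVA
g1 = [72, 75, 71, 74, 73, 76, 70, 74]
g2 = [78, 80, 77, 81, 79, 82, 76, 80]
g3 = [69, 70, 68, 72, 71, 73, 67, 70]

# Levene's test: the null hypothesis is that group variances are equal
lev_stat, lev_p = stats.levene(g1, g2, g3)

# Shapiro-Wilk per group: the null hypothesis is that the data are normal
shapiro_ps = [stats.shapiro(g).pvalue for g in (g1, g2, g3)]

print(f"Levene p = {lev_p:.3f}, Shapiro-Wilk p-values = {shapiro_ps}")
```

High p-values here mean no evidence of a violation; low p-values flag unequal variances or non-normality, in which case an alternative such as Welch's ANOVA may be more appropriate.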

ANOVA Designs and Types of Models

The simpler types of ANOVA test are relatively straightforward. However, ANOVA is a flexible analysis, and many elaborations are possible. I’ll start with the simple forms and move to the more complex designs. Click the links to learn more about each type and see examples of them in action!

One-way ANOVA

One-way ANOVA tests one factor that divides the data into at least two independent groups.

Learn about one-way ANOVA and how to perform and interpret an example using Excel.

Two-way ANOVA

Two-way ANOVA tests include two factors that divide the data into at least four factor level combinations. In addition to identifying the factors’ main effects, these models evaluate interaction effects between the factors.

Learn about two-way ANOVA and how to perform and interpret an example using Excel.

Analysis of Covariance (ANCOVA)

ANCOVA models include factors and covariates. Covariates are continuous independent variables that have a relationship with the dependent variable. Typically, covariates are nuisance variables that researchers cannot control during an experiment. Consequently, analysts include covariates in the model to control them statistically.

Learn more about Covariates: Definition and Uses.

Repeated measures ANOVA

Repeated measures designs allow researchers to assess participants multiple times in a study. Frequently, the subjects serve as their own controls and experience several treatment conditions.

Learn about repeated measures ANOVA tests and see an example.
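
Here's a minimal sketch of a repeated measures ANOVA using statsmodels' `AnovaRM` on fabricated within-subjects data, where every subject is measured under all three conditions:

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Fabricated within-subjects data: each subject appears in all conditions
rng = np.random.default_rng(5)
rows = []
for subject in range(12):
    baseline = rng.normal(50, 5)      # subjects serve as their own controls
    for step, condition in enumerate(["Standard", "Method A", "Method B"]):
        rows.append({"subject": subject, "condition": condition,
                     "score": baseline + 3 * step + rng.normal(0, 1)})
df = pd.DataFrame(rows)

# AnovaRM accounts for the repeated measurements on each subject
result = AnovaRM(df, depvar="score", subject="subject",
                 within=["condition"]).fit()
print(result)
```

Because each subject's baseline is removed from the comparison, the within-subjects design can detect condition differences that a between-subjects design with the same sample size might miss.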

Multivariate analysis of variance (MANOVA)

MANOVA extends the capabilities of ANOVA by assessing multiple dependent variables simultaneously. The factors in MANOVA can influence the relationship between dependent variables instead of influencing a single dependent variable.

There’s even a MANCOVA, which is MANOVA plus ANCOVA! It allows you to include covariates when modeling multiple dependent variables.

Learn about MANOVA and see an example.
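
As an illustration (fabricated scores; statsmodels in Python), a MANOVA with two dependent variables is specified by listing both of them on the left side of the formula:

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Fabricated data: two dependent variables measured for each student
rng = np.random.default_rng(11)
n = 20
df = pd.DataFrame({
    "method": np.repeat(["Standard", "Method A", "Method B"], n),
    "math": rng.normal(70, 5, size=3 * n),
    "reading": rng.normal(75, 5, size=3 * n),
})

# MANOVA tests all dependent variables at once
manova = MANOVA.from_formula("math + reading ~ method", data=df)
print(manova.mv_test())
```

The output reports multivariate statistics such as Wilks' lambda, which assess whether the teaching methods shift the joint distribution of both scores together.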

Crossed and Nested Models

ANOVA can model both crossed and nested factors.

Crossed factors are the more familiar type. Two factors are crossed when each level of a factor occurs with each level of the other factor. The gender and college major factors in the earlier example are crossed because we have all combinations of the factor levels. All levels of gender occur in all majors and vice versa.

A factor is nested in another factor when all its levels occur within only one level of the other factor. Consequently, the data do not contain all possible factor level combinations. For example, suppose you are testing bug spray effectiveness, and your factors are Brand and Product.

[Figure: Nested factors in an ANOVA model.]

In the illustration, Product is nested within Brand because each product occurs within only one level of Brand. There can be no combination that represents Brand 2 and Product A.

The combinations of crossed and nested factors within an ANOVA design can become quite complex!

General Linear Model

The most general form of ANOVA allows you to include all the above and more! In your model, you can have as many factors and covariates as you need, interaction terms, crossed and nested factors, along with specifying fixed, random, or mixed effects, which I describe below.

Types of Effects in ANOVA

Fixed effects

When a researcher can set the factor levels in an experiment, it is a fixed factor. Correspondingly, the model estimates fixed effects for fixed factors. Fixed effects are probably the type with which you’re most familiar. One-way and two-way ANOVA procedures typically use fixed-effects models. ANOVA models usually assess fixed effects using ordinary least squares.

For example, in a cake recipe experiment, the researcher sets the three oven temperatures of 350, 400, and 450 degrees for the study. Oven temperature is a fixed factor.

Random effects

When a researcher samples the factor levels from a population rather than setting them, it is a random factor. The model estimates random effects for them. Because random factors sample data from a population, the model must change how it evaluates their effects by calculating variance components.

For example, in studies involving human subjects, Subject is typically a random factor because researchers sample participants from a population.

Mixed effects

Mixed-effects models contain both fixed and random effects. Frequently, mixed-effects models use restricted maximum likelihood (REML) to estimate effects.

For example, a store chain wants to assess sales. The chain chooses five states for its study. State is a fixed factor. Within these states, the chain randomly selects stores. Store is a random factor. It is also nested within the State factor.
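
Here's a sketch of this store chain example as a mixed-effects model in Python's statsmodels, with fabricated sales figures: State enters as a fixed factor and Store as a random intercept (statsmodels' `MixedLM` uses REML by default):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Fabricated weekly sales: State is fixed, Store (nested in State) is random
rng = np.random.default_rng(8)
state_means = {"CA": 100, "TX": 105, "NY": 98, "FL": 102, "WA": 101}
rows = []
for state, mean_sales in state_means.items():
    for store_num in range(4):
        store_effect = rng.normal(0, 2)       # random store-level shift
        for week in range(6):
            rows.append({"state": state,
                         "store": f"{state}-{store_num}",
                         "sales": mean_sales + store_effect
                                  + rng.normal(0, 1)})
df = pd.DataFrame(rows)

# Random intercept for each store; fixed effects for the states
model = smf.mixedlm("sales ~ C(state)", df, groups=df["store"])
fit = model.fit()
print(fit.summary())
```

Because each store's sales cluster around its own random intercept, the model correctly treats repeated weekly observations within a store as correlated rather than independent.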

Scroll down to find more of my articles about ANOVA!

ANCOVA: Uses, Assumptions & Example

By Jim Frost

What is ANCOVA?

ANCOVA, or the analysis of covariance, is a powerful statistical method that analyzes the differences between three or more group means while controlling for the effects of at least one continuous covariate.


Covariates: Definition & Uses


What is a Covariate?

Covariates are continuous independent variables (or predictors) in a regression or ANOVA model. These variables can explain some of the variability in the dependent variable.

That definition of covariates is simple enough. However, the usage of the term has changed over time. Consequently, analysts can have drastically different contexts in mind when discussing covariates.


How to do Two-Way ANOVA in Excel


Use two-way ANOVA to assess differences between the group means that are defined by two categorical factors. In this post, we’ll work through two-way ANOVA using Excel. Even if Excel isn’t your main statistical package, this post is an excellent introduction to two-way ANOVA. Excel refers to this analysis as two factor ANOVA.


How to do One-Way ANOVA in Excel


Use one-way ANOVA to test whether the means of at least three groups are different. Excel refers to this test as Single Factor ANOVA. This post is an excellent introduction to performing and interpreting a one-way ANOVA test even if Excel isn’t your primary statistical software package.


Using Post Hoc Tests with ANOVA


Post hoc tests are an integral part of ANOVA. When you use ANOVA to test the equality of at least three group means, statistically significant results indicate that not all of the group means are equal. However, ANOVA results do not identify which particular differences between pairs of means are significant. Use post hoc tests to explore differences between multiple group means while controlling the experiment-wise error rate.

In this post, I’ll show you what post hoc analyses are and the critical benefits they provide, and help you choose the correct one for your study. Additionally, I’ll show why failing to control the experiment-wise error rate will cause you to have severe doubts about your results.


How F-tests work in Analysis of Variance (ANOVA)


Analysis of variance (ANOVA) uses F-tests to statistically assess the equality of means when you have three or more groups. In this post, I’ll answer several common questions about the F-test.

  • How do F-tests work?
  • Why do we analyze variances to test means?

I’ll use concepts and graphs to answer these questions about F-tests in the context of a one-way ANOVA example. I’ll use the same approach that I use to explain how t-tests work. If you need a primer on the basics, read my hypothesis testing overview.

To learn more about ANOVA tests, including the more complex forms, read my ANOVA Overview.



Benefits of Welch’s ANOVA Compared to the Classic One-Way ANOVA


Welch’s ANOVA is an alternative to the traditional analysis of variance (ANOVA), and it offers some serious benefits. One-way analysis of variance determines whether differences between the means of at least three groups are statistically significant. For decades, introductory statistics classes have taught the classic Fisher’s one-way ANOVA that uses the F-test. It’s a standard statistical analysis, and you might think it’s pretty much set in stone by now. Surprise, there’s a significant change occurring in the world of one-way analysis of variance!
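
SciPy has no built-in Welch's ANOVA, but the test is straightforward to sketch directly from Welch's (1951) formulas. The groups below are hypothetical, chosen to have unequal variances:

```python
import numpy as np
from scipy import stats

def welch_anova(*groups):
    """Welch's one-way ANOVA: robust to unequal group variances."""
    k = len(groups)
    n = np.array([len(g) for g in groups])
    means = np.array([np.mean(g) for g in groups])
    variances = np.array([np.var(g, ddof=1) for g in groups])

    w = n / variances                     # precision weights
    w_total = w.sum()
    weighted_mean = (w * means).sum() / w_total

    # Weighted between-group term and Welch's correction factor
    a = (w * (means - weighted_mean) ** 2).sum() / (k - 1)
    lam = (((1 - w / w_total) ** 2) / (n - 1)).sum()
    b = 1 + (2 * (k - 2) / (k ** 2 - 1)) * lam

    f_stat = a / b
    df1 = k - 1
    df2 = (k ** 2 - 1) / (3 * lam)        # adjusted denominator df
    return f_stat, stats.f.sf(f_stat, df1, df2)

# Hypothetical groups with unequal variances and different means
g1 = [10.1, 10.3, 9.8, 10.0, 10.2, 9.9]
g2 = [15.2, 14.8, 15.5, 15.1, 14.9, 15.3]
g3 = [20.4, 19.6, 21.1, 18.9, 20.2, 20.0]

f_stat, p_value = welch_anova(g1, g2, g3)
print(f"Welch F = {f_stat:.2f}, p = {p_value:.5f}")
```

Unlike the classic F-test, the precision weights and adjusted degrees of freedom keep the test's error rate near the nominal level even when the group variances differ.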


Multivariate ANOVA (MANOVA) Benefits and When to Use It


Multivariate ANOVA (MANOVA) extends the capabilities of analysis of variance (ANOVA) by assessing multiple dependent variables simultaneously. ANOVA statistically tests the differences between three or more group means. For example, if you have three different teaching methods and you want to evaluate the average scores for these groups, you can use ANOVA. However, ANOVA does have a drawback. It can assess only one dependent variable at a time. This limitation can be an enormous problem in certain circumstances because it can prevent you from detecting effects that actually exist.


Repeated Measures Designs: Benefits and an ANOVA Example


Repeated measures designs, also known as within-subjects designs, can seem like oddball experiments. When you think of a typical experiment, you probably picture an experimental design that uses mutually exclusive, independent groups. These experiments have a control group and treatment groups that have clear divisions between them. Each subject is in only one of these groups.


    Copyright © 2023 · Jim Frost · Privacy Policy